design and architecting solutions. • Hands-on experience in technology consulting, enterprise and solutions architecture, and architectural frameworks; data modelling, including ERWIN modelling. • Hands-on experience in ADF, Azure Databricks, Azure Synapse, Spark, PySpark, Python/Scala, SQL. • Hands-on experience designing and building data lakes from multiple source systems/data providers. • Experience in data modelling, architecture, implementation More ❯
7+ years of experience in technology consulting, enterprise and solutions architecture, and architectural frameworks; data modelling, including ERWIN modelling. 7+ years of hands-on experience in ADF, Azure Databricks, Azure Synapse, Spark, PySpark, Python/Scala, SQL. Hands-on experience designing and building data lakes from multiple source systems/data providers. Experience in data modelling, architecture, implementation More ❯
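The requirement lists above repeatedly pair SQL with Spark/Databricks for building data lakes from multiple source systems. A common pattern in that kind of work is deduplicating landed records, keeping only the latest row per business key. The sketch below illustrates the SQL involved; Python's stdlib `sqlite3` stands in for a Databricks SQL warehouse, and the table and column names are invented for the example:

```python
import sqlite3

# Minimal stand-in for a lakehouse dedup step: keep the latest record
# per business key when landing data from multiple source systems.
# Table/column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_customers (id INTEGER, name TEXT, loaded_at TEXT);
INSERT INTO raw_customers VALUES
  (1, 'Acme Ltd',  '2024-01-01'),
  (1, 'Acme Ltd.', '2024-02-01'),   -- later correction wins
  (2, 'Bobs Bikes','2024-01-15');
""")
rows = conn.execute("""
SELECT id, name FROM (
  SELECT id, name,
         ROW_NUMBER() OVER (PARTITION BY id ORDER BY loaded_at DESC) AS rn
  FROM raw_customers
) WHERE rn = 1
ORDER BY id
""").fetchall()
print(rows)  # [(1, 'Acme Ltd.'), (2, 'Bobs Bikes')]
```

In PySpark the same pattern is typically expressed with `Window.partitionBy(...).orderBy(...)` and `row_number()`, then filtering to the first row per partition.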
environment, e.g. on Docker). Testing tools such as JUnit, Spock, or clojure.test. Scripting skills in Bash, Ruby, or Python. Some technologies and processes we use: Clojure, Go, Python; Databricks, PySpark; Kafka, Kafka Streams; relational DBs, document stores (e.g. Elasticsearch); GCP, AWS; Kubernetes, cloud native; CI/CD, CircleCI, GitOps, FluxCD More ❯
GCP/Azure). Hands-on experience with Unix-based command line and DevOps tools (Git, Docker, Kubernetes). Hands-on experience with big data technologies (e.g. Spark, Hadoop, Databricks). Experience with coaching/mentoring other engineers. Prior experience in Management Consulting is a strong plus. Willingness to travel and work with local and international clients. Fluency in spoken More ❯
with cloud-based infrastructures (AWS/GCP/Azure). Knowledge of Unix command line and DevOps tools (Git, Docker, Kubernetes). Experience with big data technologies (Spark, Hadoop, Databricks). Experience coaching/mentoring other engineers. Prior experience in management consulting is a strong plus. Willingness to travel and work with local and international clients. Fluency in English and More ❯
languages such as Python or R, with extensive experience with LLMs, ML algorithms, and models. Experience with cloud services like Azure ML Studio, Azure Functions, Azure Pipelines, MLflow, Azure Databricks, etc., is a plus. Experience working in Azure/Microsoft environments is considered a real plus. Proven understanding of data science methods for analyzing and making sense of research data More ❯
Reigate, Surrey, England, United Kingdom Hybrid/Remote Options
esure Group
Skilled in team collaboration, with good interpersonal skills. Able to prioritise, multi-task, and deliver at pace with an iterative mindset. Experience with modern data platforms such as Databricks or Snowflake is advantageous; LLM API experience is a bonus. Additional Information What’s in it for you? Competitive salary that reflects your skills, experience and potential. Discretionary bonus scheme More ❯
with Python data science stack. Knowledge of OO programming and software design, e.g. SOLID principles, and testing practices. Knowledge and working experience of Agile methodologies. Proficient with SQL. Familiarity with Databricks, Spark, geospatial data/modelling and insurance is a plus. Exposure to MLOps, model monitoring principles, CI/CD and associated tech, e.g. Docker, MLflow, k8s, FastAPI etc., is desirable More ❯
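The listing above asks for OO design to SOLID principles alongside testing practices. As a minimal sketch of what that means in practice, here is the dependency-inversion principle (the "D" in SOLID): the scorer depends on an abstraction, so tests can inject a cheap fake model instead of a real MLflow-served one. All class and method names here are invented for illustration:

```python
from abc import ABC, abstractmethod

# Dependency inversion: RiskScorer depends on the Model abstraction,
# not on any concrete model, so it is trivially testable.
class Model(ABC):
    @abstractmethod
    def predict(self, features: list[float]) -> float: ...

class MeanModel(Model):
    """Toy stand-in for a real served model."""
    def predict(self, features: list[float]) -> float:
        return sum(features) / len(features)

class RiskScorer:
    def __init__(self, model: Model):
        self._model = model          # injected, not constructed here

    def score(self, features: list[float]) -> str:
        return "high" if self._model.predict(features) > 0.5 else "low"

scorer = RiskScorer(MeanModel())
print(scorer.score([0.9, 0.8, 0.7]))  # high
print(scorer.score([0.1, 0.2]))       # low
```

A unit test can then pass any `Model` subclass (or stub) into `RiskScorer` without touching model-serving infrastructure.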
disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar More ❯
design, access patterns, query performance optimization, etc. Experience with data pipeline technologies like AWS Glue, Airflow, Kafka, or other cloud-based equivalents. Experience with ETL and data warehousing platforms such as Databricks, Snowflake, or equivalent. Container-based deployment experience using Docker and Kubernetes. Strong verbal and written communication skills. NICE TO HAVE: Experience working with or data modeling for graph databases like More ❯
etc.) 5+ years leveraging statistical or algorithmic modeling methods to answer questions and solve measurement and prediction problems. Experience working with modern data lakehouse software and structures such as Databricks and/or Snowflake. Ability to break down multiple complex asks and deliver solutions with agility. Experience designing and deploying data products as part of a modern analytics cloud platform. Experience More ❯
At Databricks, our core principles are at the heart of everything we do; creating a culture of proactiveness and a customer-centric mindset guides us to create a unified platform that makes data science and analytics accessible to everyone. We aim to inspire our customers to make informed decisions that push their business forward. We provide a user-friendly and … relationships with clients throughout your assigned territory to provide technical and business value in collaboration with an Account Executive and a Senior Solutions Architect. Generate excitement among clients about Databricks through hands-on evaluation and Spark programming, integrating with the wider cloud ecosystem and third-party applications. Contribute to building the Databricks technical community through engagement at workshops, seminars, and … exposure to advanced proofs-of-concept and an understanding of a major public cloud platform. Experience diving deeper into solution architecture and Data Engineering. Fluent Dutch and English. About Databricks: Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data More ❯
passionate about clean design, collaboration, and the pursuit of data excellence. Role and Responsibility In this role, you'll develop and maintain robust data models and transformation pipelines using Databricks, Azure, and Power BI to turn complex datasets into reliable, insight-ready assets. You'll apply strong skills in SQL, Python, and PySpark to build efficient ELT workflows and ensure More ❯
SQL Development capacity. Highly Desirable - experience of the Insolvency or Financial Services sectors. Technical Competencies Required: Advanced T-SQL; MS SQL Server; ETL and migration development; SSIS; Azure Data Factory; Databricks; Microsoft Fabric; advanced Microsoft Excel. Highly Desirable: Azure DevOps; Git source control; Jira; Confluence; MySQL. Desirable: SSRS & Power BI; SSAS; SQL utilities such as Redgate. Other development experience includes C# More ❯
computer programming, data modeling, and performance optimization techniques to model and access all data types (structured and unstructured). Experience with large-scale data sets, semantic layer solutions (Databricks, Snowflake, or similar), AWS, and Azure. Master's degree (or foreign equivalent) in computer science, electronic engineering, or a related field, plus 5 years of experience in the job offered or a similar occupation. More ❯
OneLake). Advanced proficiency in Power BI, including DAX, Power Query (M), and data modelling. Deep understanding of data warehousing, ETL, and data lakehouse concepts. Strong working knowledge of Databricks, including Delta Lake and notebooks. Strong interpersonal skills with the ability to influence and communicate complex data topics clearly. Excellent analytical, organisational, and problem-solving abilities. Experience leading or mentoring More ❯
City of London, London, United Kingdom Hybrid/Remote Options
LHH
and public sector clients (or internal clients within large organisations) through RFI/RFP responses, bid documentation, and client presentations. Hands-on experience with data science platforms such as Databricks, Dataiku, AzureML, or SageMaker, and machine learning frameworks such as TensorFlow, Keras, PyTorch, and scikit-learn. Expertise in cloud platforms (AWS, Azure, Google Cloud) and experience deploying solutions using More ❯
design and long-term operation. Help refine and evolve engineering principles, development standards, and best practice. Skills & Experience: Hands-on experience with Azure data services such as Data Factory, Databricks, Data Lake, Azure SQL, Synapse, Data Catalog, and Purview. Strong Python and SQL skills for transformation and pipeline development. Solid understanding of Azure data storage/warehouse technologies including SQL More ❯
with Snowflake Secure Data Sharing and Snowflake Marketplace. Familiarity with Snowpark for Python/Java-based transformations. Understanding of role-based access control, data masking, and time travel features. Databricks: Hands-on experience with Apache Spark and Databricks Runtime. Proficiency in Delta Lake for ACID-compliant data lakes. Experience with Structured Streaming and Auto Loader. Familiarity with MLflow, Feature Store … and Model Registry. Use of Databricks notebooks for collaborative development in Python, SQL, or Scala. Successful applicants should also possess: Bachelor's degree in Data Science, Analytics, Information Technology, Computer Science, Statistics, Mathematics, Quantitative Economics, Engineering, or equivalent professional education. Minimum of 3 years' experience in a Data Engineering role with a blue-chip consulting firm or in the Data Office More ❯
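The Databricks requirements above centre on Delta Lake for ACID-compliant data lakes, where the workhorse operation is an upsert via `MERGE INTO`. As a rough sketch of those semantics (no Databricks runtime is assumed; SQLite's `ON CONFLICT` clause stands in for Delta's `MERGE`, and the table names are invented):

```python
import sqlite3

# Sketch of the upsert semantics behind Delta Lake's MERGE INTO,
# using SQLite's ON CONFLICT clause as a stand-in: matched keys are
# updated, unmatched keys are inserted, atomically per statement.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_product (sku TEXT PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [("A1", 9.99), ("B2", 4.50)])

updates = [("A1", 10.49), ("C3", 2.00)]  # one changed row, one new row
conn.executemany("""
INSERT INTO dim_product (sku, price) VALUES (?, ?)
ON CONFLICT(sku) DO UPDATE SET price = excluded.price
""", updates)

print(sorted(conn.execute("SELECT sku, price FROM dim_product")))
# [('A1', 10.49), ('B2', 4.5), ('C3', 2.0)]
```

On Databricks the equivalent is `MERGE INTO dim_product USING updates ON ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`, with Delta's transaction log providing the ACID guarantees across files.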