with self-service analytics and AI capabilities. We work in close collaboration with stakeholders across Europe, bringing together global innovation and local context. What You Will Do As a Databricks Solution Architect, you will play a pivotal role in Axpo’s enterprise data transformation by designing and governing scalable and secure solutions on the Databricks Lakehouse platform. You will: Define … architecture standards and patterns for Databricks-based solutions across ingestion, processing, and analytics. Lead the design of performant, secure, and cost-effective Lakehouse architectures aligned with enterprise needs. Collaborate with business stakeholders, engineers, and data scientists to design end-to-end solutions that enable innovation and data-driven decision making. Guide engineering teams on implementing best practices around data modeling … as-code. Promote and govern usage of Unity Catalog for access control, lineage, and metadata management. Champion platform observability, data quality, and operational monitoring across analytics pipelines. Evaluate new Databricks features (e.g., Delta Sharing, governance enhancements) and lead their integration into platform capabilities. Establish solution review processes and mentor engineers and analysts on architectural thinking and Databricks capabilities. Support security …
Join to apply for the Data Engineer (Databricks) role at Pantheon. Pantheon has been at the forefront of private markets investing for more than 40 years, earning a reputation for providing innovative solutions covering the full lifecycle of investments, from primary fund commitments … the organization. Essential Technical Skills Experience in designing and developing data warehouse solutions. Building and configuring multistage Azure deployment pipelines Advanced SQL, Python, PySpark Azure Data Lake, Data Factory, Databricks and Functions Azure Data Lake Storage Gen2 Traditional ETL (Informatica, SSIS etc.) Azure Powershell/CLI Azure Key Vault ARM/Terraform scripts to deploy infrastructure services. Essential Experience Experience …
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Somerset Bridge Group
sector, shaping data pipelines and platforms that power smarter decisions, better pricing, and sharper customer insights. The Data Engineer will design, build, and optimise scalable data pipelines within Azure Databricks, ensuring high-quality, reliable data is available to support pricing, underwriting, claims, and operational decision-making. This role is critical in modernising SBG’s cloud-based data infrastructure, ensuring compliance … regulations, and enabling AI-driven analytics and automation. By leveraging Azure-native services, such as Azure Data Factory (ADF) for orchestration, Delta Lake for ACID-compliant data storage, and Databricks Structured Streaming for real-time data processing, the Data Engineer will help unlock insights, enhance pricing accuracy, and drive innovation. The role also includes optimising Databricks query performance, implementing robust … security controls (RBAC, Unity Catalog), and ensuring enterprise-wide data reliability. Working closely with Data Architects, Pricing Teams, Data Analysts, and IT, this role will ensure our Azure Databricks data ecosystem is scalable, efficient, and aligned with business objectives. Additionally, the Data Engineer will contribute to cost optimisation, governance, and automation within Azure’s modern data platform. Key Responsibilities Data …
London, England, United Kingdom Hybrid / WFH Options
Aimpoint Digital
individuals who want to drive value, work in a fast-paced environment, and solve real business problems. You are a coder who writes efficient and optimized code leveraging key Databricks features. You are a problem-solver who can deliver simple, elegant solutions as well as cutting-edge solutions that, regardless of complexity, your clients can understand, implement, and maintain. You … Strong written and verbal communication skills required Ability to manage an individual workstream independently 3+ years of experience developing and deploying ML models in any platform (Azure, AWS, GCP, Databricks etc.) Ability to apply data science methodologies and principles to real life projects Expertise in software engineering concepts and best practices Self-starter with excellent communication skills, able to work … independently, and lead projects, initiatives, and/or people Willingness to travel. Want to stand out? Consulting Experience Databricks Machine Learning Associate or Machine Learning Professional Certification. Familiarity with traditional machine learning tools such as Python, SKLearn, XGBoost, SparkML, etc. Experience with deep learning frameworks like TensorFlow or PyTorch. Knowledge of ML model deployment options (e.g., Azure Functions, FastAPI, Kubernetes …
engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data … data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources using tools such as Databricks, Python and PySpark. Data Modeling: Design and optimize data models and schemas for efficient storage, retrieval, and analysis of structured and unstructured data. ETL Processes: Develop and automate ETL workflows … or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and …
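For illustration, the extract-transform-load workflow this kind of role builds can be sketched in plain Python. The feed, field names, and rules below are hypothetical examples, not details from the listing:

```python
# Minimal ETL sketch: extract raw records, transform (clean + enrich), load.
# All names and the sample data are illustrative assumptions.

def extract(raw_rows):
    """Extract: yield records from an upstream source (here, an in-memory list)."""
    for row in raw_rows:
        yield dict(row)

def transform(records):
    """Transform: drop incomplete rows, normalise types, derive a total field."""
    for r in records:
        if r.get("qty") is None or r.get("unit_price") is None:
            continue  # simple data-quality rule: skip incomplete rows
        r["qty"] = int(r["qty"])
        r["unit_price"] = float(r["unit_price"])
        r["total"] = r["qty"] * r["unit_price"]
        yield r

def load(records, sink):
    """Load: append transformed records to a target store (here, a list)."""
    for r in records:
        sink.append(r)
    return sink

raw = [
    {"qty": "3", "unit_price": "2.50"},
    {"qty": None, "unit_price": "9.99"},  # incomplete -> filtered out
    {"qty": "1", "unit_price": "10.00"},
]
warehouse = load(transform(extract(raw)), [])
print(len(warehouse), warehouse[0]["total"])  # 2 7.5
```

In a Databricks/PySpark pipeline the same extract/transform/load stages would typically be expressed as DataFrame reads, transformations, and Delta writes, but the shape of the logic is the same.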
London, England, United Kingdom Hybrid / WFH Options
Mirai Talent
for building scalable, reliable data pipelines, managing data infrastructure, and supporting data products across various cloud environments, primarily Azure. Key Responsibilities: Develop end-to-end data pipelines using Python, Databricks, PySpark, and SQL. Integrate data from various sources including APIs, Excel, CSV, JSON, and databases. Manage data lakes, warehouses, and lakehouses within Azure cloud environments. Apply data modelling techniques such …
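The multi-source integration described above (CSV, JSON, and so on, normalised to one schema) can be sketched with the Python standard library. The file contents and field names here are hypothetical:

```python
# Sketch: normalising records from CSV and JSON sources onto one target schema.
# Sources are inlined via io.StringIO; ids and field names are invented examples.
import csv
import io
import json

def from_csv(text):
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def from_json(text):
    return json.loads(text)

def normalise(record):
    # Unify differing source schemas: some sources use id/name,
    # others customer_id/full_name.
    return {
        "id": str(record.get("id") or record.get("customer_id")),
        "name": (record.get("name") or record.get("full_name", "")).strip(),
    }

csv_src = "id,name\n1, Ada \n2,Grace\n"
json_src = '[{"customer_id": 3, "full_name": "Edsger"}]'

rows = [normalise(r) for r in from_csv(csv_src) + from_json(json_src)]
print(rows)  # three records, one shared schema
```

In practice the same normalisation step would run inside a Databricks/PySpark job against files landed in the data lake rather than in-memory strings.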
SAS Viya). An ability to write complex SQL queries. Project experience using one or more of the following technologies: Tableau, Python, Power BI, Cloud (Azure, AWS, GCP, Snowflake, Databricks). Project lifecycle experience, having played a leading role in the delivery of end-to-end projects, as well as a familiarity with different development methodologies …
Azure Data Engineer, with the ability to consult, advise, engineer, and optimize data products and assets. Azure Data Products: Expertise in platforms like Microsoft Fabric, Azure Synapse Analytics, Azure Databricks, and supporting technologies such as Power BI, Microsoft Purview, Azure SQL Database, Azure Cosmos DB, Azure Data Lake, etc. Azure Infrastructure: Understanding core Azure concepts and tools (Networks, Identity/…
advise, engineer and optimise our customers' data products and assets. Azure Data Products: Expertise in one or more of the following platforms: Microsoft Fabric, Azure Synapse Analytics and Azure Databricks, and experience in some of the supporting data technologies like Power BI, Microsoft Purview, Azure SQL Database, Azure Cosmos DB, Azure Data Lake, etc. Azure: Understand core Azure infrastructure concepts …
London, England, United Kingdom Hybrid / WFH Options
EXL
viable, and aligned with client expectations. Enterprise Solution Design: Architect and lead the delivery of large-scale data platforms (including lakes, lakehouses, and warehouses) using GCP, Cloud Storage, BigQuery, Databricks, Snowflake. Cloud Data Strategy: Own cloud migration and modernisation strategy, leveraging GCP, and tools such as Terraform, Azure DevOps, GitHub, and CI/CD pipelines. Data Modelling: Apply deep hands-on …
Leeds, England, United Kingdom Hybrid / WFH Options
Scott Logic
Join to apply for the Lead Data Engineer role at Scott Logic.
We work with some of the UK's biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best …
This is an exciting contract opportunity for an SC Cleared Azure Data Engineer with a strong focus on Databricks to join an experienced team in a new customer engagement working at the forefront of data analytics and AI. This role offers the chance to take a key role in the design and delivery of advanced Databricks solutions within the Azure … ecosystem. Responsibilities: Design, build, and optimise end-to-end data pipelines using Azure Databricks, including Delta Live Tables. Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions. Drive best practices for data engineering. Help clients realise the potential of data science, machine learning, and scaled data processing within the Azure/Databricks ecosystem. Mentor junior engineers and support … Take ownership of the delivery of core solution components. Support with planning, requirements refinement, and work estimation. Skills & Experiences: Proven experience designing and implementing data solutions in Azure using Databricks as a core platform. Hands-on expertise in Delta Lake, Delta Live Tables and Databricks Workflows. Strong coding skills in Python and SQL, with experience in developing modular, reusable code …
London, England, United Kingdom Hybrid / WFH Options
ShareForce
Join to apply for the Data Engineer - Azure/DataBricks role at ShareForce.
Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer to join our team and contribute to the development and maintenance of our cutting-edge Azure Databricks platform for economic data. This platform is critical for our Monetary Analysis, Forecasting, and Modelling activities. The Senior Data Engineer will be responsible for building and optimising data pipelines, implementing … Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources (e.g., APIs, databases, financial data providers) into the Azure Databricks platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark … or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and …
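The "data quality checks and validation rules within data pipelines" mentioned above can be sketched as a quality gate that quarantines failing records instead of loading them. The rule names, fields, and thresholds below are illustrative assumptions, not the employer's actual rules:

```python
# Sketch of in-pipeline data-quality checks: each rule is a named predicate;
# records failing any rule are quarantined rather than loaded downstream.
# Field names (series_id, rate) and thresholds are hypothetical.

RULES = [
    ("not_null_rate", lambda r: r.get("rate") is not None),
    ("rate_in_range", lambda r: r.get("rate") is None or 0.0 <= r["rate"] <= 1.0),
    ("has_series_id", lambda r: bool(r.get("series_id"))),
]

def validate(record):
    """Return the names of the rules this record violates (empty = clean)."""
    return [name for name, check in RULES if not check(record)]

def run_quality_gate(records):
    """Split a batch into clean records and quarantined (record, failures) pairs."""
    clean, quarantined = [], []
    for rec in records:
        failures = validate(rec)
        (quarantined if failures else clean).append((rec, failures))
    return clean, quarantined

batch = [
    {"series_id": "CPI", "rate": 0.031},
    {"series_id": "GDP", "rate": 1.7},   # out of range -> quarantined
    {"series_id": "", "rate": None},     # missing id and rate -> quarantined
]
clean, quarantined = run_quality_gate(batch)
print(len(clean), len(quarantined))  # 1 2
```

On Databricks itself, the equivalent checks are often declared as Delta Live Tables expectations or enforced via Delta table constraints rather than hand-rolled predicates.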
during an exciting period of digital transformation. We're building a modern, enterprise-scale data platform, and we're looking for a Senior Data Engineer to help drive this forward using Azure, Databricks, and lakehouse architecture. If you're passionate about delivering robust, scalable data solutions that enable business insight and innovation, we want to hear from you. What You'll Be Doing … As a key member of our data engineering team, you will: Design and implement modern data architectures (lakehouse, medallion) using Azure and Databricks Develop and maintain data pipelines, quality validation tools, and lineage tracking systems Optimise and manage MS SQL and Azure SQL databases Use Azure Data Factory, Python, and cloud-native tools to build efficient ETL processes Deliver clean … CD pipelines and robust documentation What We're Looking For 4+ years' experience in data engineering within enterprise environments Deep understanding of SQL, data modelling, and cloud-native tools (Azure, Databricks) Strong Python skills and experience with automation, ETL, and AI-driven data solutions Hands-on experience with lakehouse architecture, Data Lakes, and medallion design Familiarity with Power BI, Azure Data Factory …
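The medallion architecture referenced in several of these listings layers data as bronze (raw), silver (cleaned), and gold (business-ready). A toy sketch in plain Python, with dicts standing in for Delta tables and entirely hypothetical record contents:

```python
# Toy sketch of the medallion pattern (bronze -> silver -> gold).
# Layer contents and field names (policy, premium) are invented examples.

bronze = [  # raw, as-landed records: strings, duplicates, bad rows
    {"policy": "P1", "premium": "120.0"},
    {"policy": "P1", "premium": "120.0"},  # duplicate landing
    {"policy": "P2", "premium": "bad"},    # unparseable value
    {"policy": "P3", "premium": "80.0"},
]

def to_silver(rows):
    """Silver: deduplicate and coerce types, dropping rows that fail parsing."""
    seen, out = set(), []
    for r in rows:
        key = (r["policy"], r["premium"])
        if key in seen:
            continue
        seen.add(key)
        try:
            out.append({"policy": r["policy"], "premium": float(r["premium"])})
        except ValueError:
            pass  # a real pipeline would quarantine these for review
    return out

def to_gold(rows):
    """Gold: business-level aggregate ready for BI or pricing consumers."""
    return {"policies": len(rows), "total_premium": sum(r["premium"] for r in rows)}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'policies': 2, 'total_premium': 200.0}
```

In a Databricks lakehouse each layer would be a Delta table, with the bronze-to-silver and silver-to-gold steps run as scheduled jobs or streaming transformations; the sketch only shows the layering idea.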
City of London, England, United Kingdom Hybrid / WFH Options
Fruition Group
an experienced Azure Data Engineer to play a key role in delivering the next phase of a business-critical data programme. You'll apply your technical expertise across Azure Databricks, Data Factory, Azure SQL, and SQL Server, contributing directly to the integration of financial data into a new Azure Data Platform. Responsibilities: Design, build and optimise scalable data solutions using … Azure cloud technologies. Lead the development of data pipelines in Azure Databricks, Data Factory, and T-SQL. Provide senior-level technical direction to onshore and offshore development teams. Translate data requirements into effective data engineering solutions. Collaborate closely with business stakeholders to ensure alignment with project goals. Maintain detailed documentation and ensure robust testing practices. Plan and manage progress in … Agile sprints alongside project managers and scrum masters. Requirements: Extensive experience in Azure data engineering, with strong expertise in Databricks, Azure SQL, and Data Factory. Deep technical knowledge of SQL Server including stored procedures and complex data transformation logic. Proven experience in designing and delivering data warehousing and dimensional modelling solutions. Excellent collaboration skills with a track record of working …
design. Experience architecting and building data applications using Azure, specifically a Data Warehouse and/or Data Lake. Technologies: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks and Power BI. Experience with creating low-level designs for data platform implementations. ETL pipeline development for the integration with data sources and data transformations including the creation of supplementary … with APIs and integrating them into data pipelines. Strong programming skills in Python. Experience of data wrangling such as cleansing, quality enforcement and curation e.g. using Azure Synapse notebooks, Databricks, etc. Experience of data modelling to describe the data landscape, entities and relationships. Experience with data migration from legacy systems to the cloud. Experience with Infrastructure as Code (IaC), particularly …