solutions on the Azure Cloud ecosystem. They design, build, and operate batch and real-time data pipelines using Azure services such as Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Event Hubs. This role also involves designing, building, and operating the data layer on Azure Synapse Analytics, SQL DW, and Cosmos DB. The data engineer is proficient in Azure … Spark and SQL, Azure Functions with Python, Azure Purview, and Cosmos DB. They are also proficient in Azure Event Hubs and Stream Analytics, Managed Streaming for Apache Kafka, Azure Databricks with Spark, and other open-source technologies such as Apache Airflow and dbt, Spark/Python, or Spark/Scala. Preferred Education: Bachelor's Degree. Required Technical and Professional Expertise: Commercial … as a Data Engineer or similar role, with a strong emphasis on Azure technologies. Proficiency in Azure data services (Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, Azure Databricks). Experience with data modeling, data warehousing, and big data processing (Hadoop, Spark, Kafka). Strong understanding of SQL and NoSQL databases, data modeling, and ETL/ELT processes. Proficiency More ❯
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Somerset Bridge
sector, shaping data pipelines and platforms that power smarter decisions, better pricing, and sharper customer insights. The Data Engineer will design, build, and optimise scalable data pipelines within Azure Databricks, ensuring high-quality, reliable data is available to support pricing, underwriting, claims, and operational decision-making. This role is critical in modernising SBG’s cloud-based data infrastructure, ensuring compliance … regulations, and enabling AI-driven analytics and automation. By leveraging Azure-native services, such as Azure Data Factory (ADF) for orchestration, Delta Lake for ACID-compliant data storage, and Databricks Structured Streaming for real-time data processing, the Data Engineer will help unlock insights, enhance pricing accuracy, and drive innovation. The role also includes optimising Databricks query performance, implementing robust … security controls (RBAC, Unity Catalog), and ensuring enterprise-wide data reliability. Working closely with Data Architects, Pricing Teams, Data Analysts, and IT, this role will ensure our Azure Databricks data ecosystem is scalable, efficient, and aligned with business objectives. Additionally, the Data Engineer will contribute to cost optimisation, governance, and automation within Azure’s modern data platform. Key Responsibilities Data More ❯
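The ACID-compliant storage this listing mentions centres on Delta Lake's upsert (MERGE) behaviour: matched rows are updated, unmatched rows are inserted, atomically. As a rough illustration of those semantics only — not the Databricks API — here is the update-or-insert logic in plain Python; the table and column names are invented:

```python
# Toy illustration of Delta Lake MERGE (upsert) semantics using plain
# dictionaries. "policy_id"/"premium" are hypothetical column names.

def merge_upsert(target, updates, key="policy_id"):
    """Apply updates to target keyed on `key`: matched rows are
    overwritten, unmatched rows are inserted (MERGE behaviour)."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # update-if-matched, insert-if-not
    return sorted(merged.values(), key=lambda r: r[key])

target = [
    {"policy_id": 1, "premium": 300},
    {"policy_id": 2, "premium": 450},
]
updates = [
    {"policy_id": 2, "premium": 475},  # matched -> update
    {"policy_id": 3, "premium": 520},  # unmatched -> insert
]
result = merge_upsert(target, updates)
```

In Delta Lake the same outcome is expressed declaratively (`MERGE INTO … WHEN MATCHED … WHEN NOT MATCHED …`) and executed transactionally, which is what makes the storage ACID-compliant.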
systems, or a related field, or equivalent experience. Proven experience (>6 years) as a Data Engineer or similar role. Experience with data pipeline, ETL, and workflow management tools (e.g., Databricks, Data Factory). Proficiency in SQL, Python, R, or Scala. Knowledge of Python libraries such as PySpark and Pandas. Experience with SQL and database management systems like MySQL, PostgreSQL, SQL More ❯
or equivalent experience. Experience and skills: Proven experience (>6 years) as a Data Engineer or in a similar role. Experience with data pipeline, ETL and workflow management tools (e.g., Databricks, Data Factory). Proficiency in programming languages such as SQL, Python, R, or Scala. Substantial knowledge and experience with Python libraries such as PySpark and Pandas. Strong experience with SQL More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
Scott Logic
We work with some of the UK’s biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best More ❯
design. Experience architecting and building data applications using Azure, specifically a Data Warehouse and/or Data Lake. Technologies : Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks and Power BI. Experience with creating low-level designs for data platform implementations. ETL pipeline development for the integration with data sources and data transformations including the creation of supplementary … with APIs and integrating them into data pipelines. Strong programming skills in Python. Experience of data wrangling such as cleansing, quality enforcement and curation e.g. using Azure Synapse notebooks, Databricks, etc. Experience of data modelling to describe the data landscape, entities and relationships. Experience with data migration from legacy systems to the cloud. Experience with Infrastructure as Code (IaC) particularly More ❯
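The data-wrangling duties named above (cleansing, quality enforcement, curation) usually reduce to small, testable transform steps. A minimal sketch of such a step in plain Python — field names and rules are illustrative, not from any specific platform:

```python
# Minimal cleansing/quality-enforcement step: trim whitespace, normalise
# emails, and reject rows missing required fields. "id"/"email" are
# hypothetical field names chosen for the example.

def cleanse(records, required=("id", "email")):
    """Return (clean_rows, rejected_rows) after trimming strings,
    lower-casing emails, and dropping rows missing required fields."""
    clean, rejected = [], []
    for rec in records:
        rec = {k: v.strip() if isinstance(v, str) else v
               for k, v in rec.items()}
        if any(not rec.get(f) for f in required):
            rejected.append(rec)   # quarantine rather than silently drop
            continue
        rec["email"] = rec["email"].lower()
        clean.append(rec)
    return clean, rejected

rows = [
    {"id": 1, "email": "  Alice@Example.COM "},
    {"id": 2, "email": ""},  # fails the quality check
]
clean, rejected = cleanse(rows)
```

In an Azure Synapse notebook or Databricks the same rules would typically be expressed over DataFrames, with rejected rows written to a quarantine table for review.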
Manchester, England, United Kingdom Hybrid / WFH Options
First Central Services
including data marts and visualizations in tools like Power BI. Experience & Knowledge: Extensive experience in end-to-end cloud data solutions, preferably in Azure. Experience with big data technologies like Databricks and/or Synapse Analytics using PySpark. Solution design experience across the data lifecycle. Proficiency with Azure services such as Data Factory, Azure Functions, ADLS Gen2, Key Vault, Synapse SQL More ❯
learning and Data science applications. Ability to use a wide variety of open-source technologies. Knowledge and experience using at least one Data Platform Technology such as Quantexa, Palantir and Databricks. Knowledge of test automation frameworks and the ability to automate testing within the pipeline. To discuss this or wider Technology roles with our recruitment team, all you need to do is More ❯
Blackpool, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
resolve complex data-related issues. Strong analytical and problem-solving skills. Strong teamwork, interpersonal and collaboration skills with colleagues and clients. Desirable: Experience with cloud ETL tools such as Databricks/Snowflake, Spark and Kafka. Experience using source control tools such as GitHub or Azure DevOps. Experience with Azure DevOps for CI/CD pipeline development and data operations (DataOps More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
KPMG United Kingdom
Job description Lead Data Engineer - Manager - KPMG Curve 106256 Base Location: Leeds based (Hybrid - 3 days per week in office) As a result of the work that we do, we require applicants to hold or be capable of obtaining UK More ❯
Python (PySpark). Ingest, transform, and curate data from multiple sources into Azure Data Lake and Delta Lake formats. Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into robust technical solutions. Implement and maintain ETL/ELT pipelines using Azure Data Factory or Synapse Pipelines. … Monitor and troubleshoot production jobs and processes. Preferred Skills & Experience: Strong proficiency in SQL for data transformation and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale More ❯
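The dimensional-modelling knowledge this listing asks for centres on one mechanical idea: deduplicate a dimension on its natural key and assign surrogate keys that fact tables join on. A plain-Python sketch of that idea — the names are hypothetical, and in practice this runs as a Spark job in Databricks, not pure Python:

```python
# Sketch of surrogate-key assignment for a dimension table.
# "customer" is a hypothetical natural key chosen for the example.

def build_dimension(raw_rows, natural_key):
    """Deduplicate on the natural key and assign incrementing
    surrogate keys ("sk"); return the dimension and a lookup map."""
    dim, lookup = [], {}
    for row in raw_rows:
        nk = row[natural_key]
        if nk not in lookup:           # first sighting -> new dimension row
            lookup[nk] = len(dim) + 1
            dim.append({"sk": lookup[nk], **row})
    return dim, lookup

raw = [
    {"customer": "acme", "region": "north"},
    {"customer": "zen",  "region": "south"},
    {"customer": "acme", "region": "north"},  # duplicate source row
]
dim_customer, key_map = build_dimension(raw, "customer")
```

Fact rows would then store `key_map[natural_key]` instead of the raw business key, which is what keeps large fact tables narrow and joins fast.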
schema design. Experience architecting and building data applications using Azure, specifically Data Warehouse and/or Data Lake. Technologies: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks, and Power BI. Experience creating low-level designs for data platform implementations. ETL pipeline development for data source integration and transformations, including documentation. Proficiency working with APIs and integrating them … into data pipelines. Strong programming skills in Python. Experience with data wrangling such as cleansing, quality enforcement, and curation (e.g., Azure Synapse notebooks, Databricks). Data modeling experience to describe data landscape, entities, and relationships. Experience migrating data from legacy systems to the cloud. Experience with Infrastructure as Code (IaC), particularly Terraform. Proficiency in developing Power BI dashboards. Strong focus More ❯
Stockport, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities Daily responsibilities include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with More ❯
Chester, England, United Kingdom Hybrid / WFH Options
Forge Holiday Group Ltd
ETL/ELT processes. Exposure to Python or other object-oriented programming languages. Experience with modern data stack tools and cloud-based data warehouses like MS Fabric, Snowflake, Databricks, Teradata or AWS. Experience in designing and constructing effective reports and dashboards that transform data into actionable insights with Tableau or Power BI. Proven ability to manage work within set More ❯
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
ASDA
Impactful Projects: Design, manage, and deliver end-to-end data science and analytics projects aligned with business priorities. Model Development: Build, test, and deploy predictive and optimisation models using Databricks, Azure, and Python, incorporating best practices in MLOps and governance. Insight Generation: Translate complex datasets into accessible and actionable insights using Power BI and other visualisation tools. Stakeholder Collaboration: Partner … corporate initiatives. What You'll Need Essential Skills & Experience: Proven experience in data science, advanced analytics, or data engineering, with a track record of delivering measurable outcomes. Proficiency in Databricks, Azure, Python, SQL, and data visualisation using open source libraries. Experience with modern MLOps practices for deploying and maintaining models in production. Excellent communication skills - able to simplify technical concepts … visible, high-level impact. Join a forward-thinking data team using modern tools in a cloud-native environment. Flexible hybrid working with a supportive, inclusive culture. Tools & tech: Azure, Databricks, Power BI, Python - continuously evolving. Attractive benefits package: Competitive salary, 7% Stakeholder Pension Plan, 15% Asda Colleague Discount, Free parking at Asda House, Leeds. Clear opportunities for career growth and More ❯
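The model-development work described above starts, at its simplest, with fitting a trend to historical data and using it to forecast. A deliberately tiny stand-in using ordinary least squares — real work here would use Databricks and Python ML tooling, and the data below is invented:

```python
# Fit a one-variable linear trend by ordinary least squares and
# forecast the next period. The weekly sales figures are made up.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

weeks = [1, 2, 3, 4, 5]
sales = [10.0, 12.0, 14.0, 16.0, 18.0]  # perfectly linear toy series

slope, intercept = fit_line(weeks, sales)
forecast_week6 = slope * 6 + intercept
```

The MLOps practices the listing mentions are everything around this step: versioning the training data and model, validating it before deployment, and monitoring its predictions in production.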
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - FinTech Company - Newcastle (Tech Stack: Data Engineer, Databricks, Python, Azure, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) I'm working with a leading Software House in the FinTech industry, based in Newcastle, who are looking to hire a talented Data Engineer. This is a fantastic opportunity to join a forward-thinking company where you'll play … working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience is a plus but not essential. Excellent communication skills, with the ability to explain complex data concepts in a clear and More ❯
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
pipelines and infrastructure, who can implement processes on our modern tech stack with robust, pragmatic solutions. Responsibilities: develop and maintain ETL/ELT data pipelines using AWS data services, Databricks and dbt. Manage and optimize data storage solutions such as Amazon S3, Redshift, RDS, and DynamoDB. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or AWS CloudFormation. … tools like Terraform or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud. Familiarity with dbt, Delta Lake, Databricks. Experience working in Agile environments with tools like Jira and Git. About Us: We are Citation. We are far from your average service provider. Our colleagues bring their brilliant selves More ❯
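The workflow-orchestration experience named here (Airflow, Dagster) boils down to one core idea: run each task only after its upstream dependencies complete. The standard library's `graphlib` can sketch that scheduling logic; the task names below are invented, and this is the idea, not either tool's API:

```python
# Dependency resolution as an orchestrator performs it: topologically
# sort the task graph so every task runs after its upstreams.
# Task names are hypothetical examples.
from graphlib import TopologicalSorter

# graph maps each task -> the set of tasks it depends on
pipeline = {
    "extract_source":   set(),
    "load_s3":          {"extract_source"},
    "transform_dbt":    {"load_s3"},         # dbt models run after the load
    "publish_redshift": {"transform_dbt"},
}

order = list(TopologicalSorter(pipeline).static_order())
```

Airflow and Dagster add the rest on top of this ordering: scheduling, retries, backfills, and observability per task run.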
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions. … are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see: • Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server. • Proficiency in SQL and Python languages. • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF). • Familiarity with building More ❯
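The "metadata-driven pipelines" bullet above names a specific pattern: one generic loader parameterised by a config table, rather than one hand-written pipeline per source. A plain-Python sketch of the pattern — the source names, paths, and targets are hypothetical:

```python
# Metadata-driven pipeline pattern: the list of sources is data, and
# a single generic routine turns it into concrete load activities.
# In ADF this metadata would typically live in a control table read
# by a Lookup + ForEach. All names/paths below are invented.

SOURCES = [
    {"name": "policies", "path": "raw/policies.csv", "target": "stg.policies"},
    {"name": "claims",   "path": "raw/claims.csv",   "target": "stg.claims"},
]

def plan_loads(sources):
    """Turn source metadata into concrete copy activities."""
    return [f"COPY {s['path']} INTO {s['target']}" for s in sources]

activities = plan_loads(SOURCES)
```

Adding a new source then means adding one metadata row, not writing and deploying a new pipeline — which is what makes the approach cheap to extend and easy to audit.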
drive technical innovation across client projects. Document processes and contribute to internal knowledge repositories and best-practice libraries. Key Skills & Experience: Strong hands-on experience with Azure tooling, including Databricks, Data Factory, Data Lake, and Synapse (or similar data warehouse tools); Azure Analysis Services or comparable BI tooling. Solid programming capability in SQL, Python, Spark, and ideally DAX. Familiarity with More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Engineer Associate certification. Responsibilities Daily tasks include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation to ensure data availability and quality. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯