for Azure services. Skills, Experience & Competencies Required Skills: Extensive experience in data engineering with Microsoft Azure. Proficiency in Azure services such as Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database. Strong experience with ETL/ELT pipelines, data modelling, and data integration. Proficiency in SQL and programming languages like Python, Scala, or PowerShell.
practices. Collaborating with leadership and stakeholders to align data priorities. Qualifications and Experience: Expertise in Commercial/Procurement Analytics and SAP (S/4 Hana). Experience with Spark, Databricks, or similar tools. Strong proficiency in data modelling, SQL, NoSQL, and data warehousing. Hands-on experience with data pipelines, ETL, and big data technologies. Proficiency in cloud platforms like AWS …
using both on-premises and cloud-based technologies (Azure, AWS, or GCP). You'll design scalable data architectures, including data lakes, lakehouses, and warehouses, leveraging tools such as Databricks, Snowflake, and Azure Synapse. The ideal candidate will have a deep technical background in data engineering and a passion for leading the development of best-in-class data solutions. You …
London, England, United Kingdom Hybrid / WFH Options
McCabe & Barton
telco. The ideal candidate will have expertise in some of the following: Python, SQL, Scala, and Java for data engineering. Strong experience with big data tools (Apache Spark, Hadoop, Databricks, Dask) and cloud platforms (AWS, Azure, GCP). Proficient in data modelling (relational, NoSQL, dimensional) and DevOps automation (Docker, Kubernetes, Terraform, CI/CD). Skilled in designing scalable, fault …
London, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Fashion Company - London (Tech Stack: Data Engineer, Databricks, Python, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) We're recruiting on behalf of a leading fashion brand based in London that's recognised for combining creativity with cutting-edge technology. They're on the lookout for a talented Data Engineer to join their growing data team.
/ELT, Data Lakes, Warehousing, MDM, and BI. Engineering delivery practices: Knowledge of Agile, DevOps, Git, APIs, Containers, Microservices, and Data Pipeline orchestration. Broader platform experience (desirable): Familiarity with Databricks, Snowflake, Azure Data Factory, Azure Synapse, Microsoft SQL/SSIS. Certifications that are a plus: Microsoft Certified: Fabric Analytics Engineer Associate (DP-600) Microsoft Certified: Fabric Data Engineer Associate (DP …
Watford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities Daily tasks include, but are not limited to: Designing, building, and optimizing data pipelines and ETL workflows using Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation. Writing clean, efficient Python code for data engineering. Collaborating with data scientists and stakeholders on machine learning and …
SQL skills, including data transformation and profiling. Experience building scalable production ETL/ELT pipelines. 1-2 years of hands-on experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming. Experience with Airflow and Kubernetes. Programming skills in Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI …
analytics user experience, unlocking growth and operational excellence. What are we looking for? Hands-on experience designing scalable, greenfield data platforms in cloud environments using the Azure D&A stack, Databricks, and Azure OpenAI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tools (e.g., Informatica), and scalable data platforms. Knowledge of Azure Data …
Bournemouth, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient Python code for data engineering tasks. Collaborating with data scientists …
Advanced SQL, data transformation, and data profiling skills. Experience building production ETL/ELT pipelines at scale. 1-2 years of hands-on experience with Azure: Data Factory, Databricks, Synapse (DWH), Azure Functions, Logic Apps, and other data analytics services, including streaming. Experience with Airflow and Kubernetes. Programming languages: Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI …
Reading, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
drive technical innovation across client projects Document processes and contribute to internal knowledge repositories and best practice libraries Key Skills & Experience Strong hands-on experience with Azure tooling, including: Databricks, Data Factory, Data Lake, and Synapse (or similar data warehouse tools) Azure Analysis Services or comparable BI tooling Solid programming capability in: SQL, Python, Spark, and ideally DAX Familiarity with …
Hemel Hempstead, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities: Daily tasks include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with …
Cheltenham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Derby, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with …
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily role will include, but is not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation to ensure data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Cambridge, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
a daily basis, your role will include, but is not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Engineer Associate certification. Responsibilities Daily tasks include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation to ensure data availability and quality. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Hemel Hempstead, England, United Kingdom Hybrid / WFH Options
Ingentive
Associate. Responsibilities Your daily responsibilities will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Stoke-on-Trent, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Coventry, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Chelmsford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Chester, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …