You'll sit at the heart of a high-performing Data & AI team. Your mission: transform raw data into actionable insight through robust pipelines, a modern data lakehouse (Azure + Databricks), and rock-solid governance. You'll work hands-on with ETL/ELT processes, medallion architecture, and CI/CD tooling, while collaborating with data scientists, IT teams, and business … leaders across Europe. What We Need From You 5+ years in Azure Data Engineering Deep knowledge of Azure Data Factory, Databricks, Python, SQL, Delta Lake Skilled in data modeling, medallion architecture, and integration via APIs, Event Hubs, ODBC, and SFTP Azure certification? Big plus. Power BI and ML experience? Even better. Fluent in Dutch & English (French is a plus) Why More ❯
architectures in a modern cloud environment. You will play a key role in building and optimizing our Medallion architecture (Bronze, Silver, Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development: Design and build robust and reusable ETL/… ELT pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity. Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes. System Maintenance: Monitor, maintain, and optimize existing data pipelines, including cron job scheduling and … Ensure compliance with data security policies, data retention rules, and privacy regulations. Required Skills and Experience 5+ years of experience in data engineering or similar roles. Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake. Proficiency in dbt for transformation logic and version-controlled data modeling. Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with More ❯
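The Bronze/Silver/Gold layering this listing describes can be sketched in plain Python. This is an illustrative toy only, not Databricks, Delta Lake, or dbt API code; the record shape and field names (`order_id`, `amount`, `country`) are hypothetical:

```python
# Toy sketch of the Medallion pattern: Bronze lands raw data, Silver
# cleans and conforms it, Gold aggregates a business-ready view.
# In Databricks each layer would be a Delta table, not a Python list.

raw_events = [
    {"order_id": "1", "amount": "10.50", "country": "BE"},
    {"order_id": "2", "amount": "bad-value", "country": "NL"},  # malformed
    {"order_id": "1", "amount": "10.50", "country": "BE"},      # duplicate
]

def bronze(records):
    """Land raw data as-is, adding only ingestion metadata."""
    return [{**r, "_ingested": True} for r in records]

def silver(bronze_rows):
    """Clean and conform: cast types, drop bad rows, deduplicate."""
    seen, out = set(), []
    for r in bronze_rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine these rows
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "amount": amount,
                    "country": r["country"]})
    return out

def gold(silver_rows):
    """Aggregate into a business-ready mart: revenue per country."""
    totals = {}
    for r in silver_rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

result = gold(silver(bronze(raw_events)))  # {'BE': 10.5}
```

The point of the layering is that each stage is independently rerunnable and auditable: Bronze preserves the raw input, so Silver's cleaning rules can change without re-ingesting the source.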
and translating these into technical deliverables. Motivated to expand technical skills. Experience with Microsoft BI Tools such as Power BI, SSRS & SQL Server. Experience with Azure Data Factory and Databricks (Python). Experience working with DevOps, IaC & CI/CD pipelines (e.g. Terraform and Databricks Asset Bundles). More ❯
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
ASDA
Impactful Projects : Design, manage, and deliver end-to-end data science and analytics projects aligned with business priorities. Model Development : Build, test, and deploy predictive and optimisation models using Databricks, Azure, and Python , incorporating best practices in MLOps and governance. Insight Generation : Translate complex datasets into accessible and actionable insights using Power BI and other visualisation tools. Stakeholder Collaboration : Partner … corporate initiatives. What You'll Need Essential Skills & Experience: Proven experience in data science, advanced analytics, or data engineering, with a track record of delivering measurable outcomes. Proficiency in Databricks, Azure, Python, SQL, and data visualisation using open source libraries. Experience with modern MLOps practices for deploying and maintaining models in production. Excellent communication skills - able to simplify technical concepts … visible, high-level impact Join a forward-thinking data team using modern tools in a cloud-native environment Flexible hybrid working with a supportive, inclusive culture Tools & tech: Azure, Databricks, Power BI, Python - continuously evolving Attractive benefits package: Competitive salary 7% Stakeholder Pension Plan 15% Asda Colleague Discount Free parking at Asda House, Leeds Clear opportunities for career growth and More ❯
Greater Manchester, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
forward-thinking organization using data to drive innovation and business performance. They’re expanding their team and are looking for a talented Data Engineer with experience in Azure and Databricks to join the team. Salary and Benefits £55,000 – £65,000 salary depending on experience 10% performance-related bonus Hybrid working model – 2 days in the Greater Manchester office Supportive … do I need to apply for the role: Solid hands-on experience with Azure data tools, including Data Factory, Data Lake, Synapse Analytics, and Azure SQL. Strong proficiency with Databricks, including Spark, Delta Lake, and notebooks. Skilled in Python and SQL for data transformation and processing. Experience with Git and modern CI/CD workflows. Strong analytical mindset and effective More ❯
more than 90 million passengers this year, we employ over 10,000 people. It's big-scale stuff and we’re still growing. Job Purpose With a big investment into Databricks and a large amount of interesting data this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use … engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming frameworks. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with cloud infrastructure like AWS or Azure. Experience with Linux and containerisation (e.g. Docker, shell scripting). Understanding of data modelling and data cataloguing principles. Understanding of … end monitoring, quality checks, lineage tracking and automated alerts to ensure reliable and trustworthy data across the platform. Experience of building a data transformation framework with dbt. Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. What you’ll get in return Competitive base salary Up to 20% bonus 25 days holiday BAYE, SAYE & Performance share More ❯
with the ability to engage technical and non-technical stakeholders alike. The Head of BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake SQL REST … API Advanced Analytics: Databricks (AI & Machine Learning) Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta) Job Scheduling & Monitoring (AWS, Splunk) Agile Data Engineering with centralised code repositories BI Data Portal: Power BI However this coexists with a legacy tech stack, which the head of BI will support the transition to the new architecture already in place. Qualifications/Education: Degree More ❯
ability to engage technical and non-technical stakeholders alike. The Head of Data Engineering & Insight will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake SQL REST … API Advanced Analytics: Databricks (AI & Machine Learning) Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta) Job Scheduling & Monitoring (AWS, Splunk) Agile Data Engineering with centralised code repositories BI Data Portal: Power BI However this coexists with a legacy tech stack, which the Head of Data Engineering & Insight will support the transition to the new architecture already in place. Qualifications/Education More ❯
translating these into technical deliverables. Motivated to expand technical skills. Desirable Experience with Microsoft BI Tools such as Power BI, SSRS & SQL Server. Experience with Azure Data Factory and Databricks (Python). Experience working with DevOps, IaC & CI/CD pipelines (e.g. Terraform and Databricks Asset Bundles). More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Advanced proficiency in Python, SQL, PySpark, MLOps, computer vision, and NLP, Databricks, AWS, Data Connections Cluster Training and tools such as GitLab, with proven success in architecting complex AI/ML pipelines. Demonstrated ability to write clean, efficient code in Python and … technical teams, providing guidance on project direction, technical challenges, and professional development. One or more of the following certifications are desired: AWS Certified Solutions Architect or Machine Learning Specialty. Databricks Certified Machine Learning Professional. Agile/Scrum Master Certification. Specialized certifications in AI/ML tools or methodologies. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks … preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control More ❯
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata-driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions. … are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see: • Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server. • Proficiency in SQL and Python languages. • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF). • Familiarity with building More ❯
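The metadata-driven pipelines this listing asks for follow a common pattern: a control table describes each source, and one generic loop dispatches the right load strategy per row. A minimal Python sketch of the idea, with hypothetical table names and stub loaders standing in for real ADF copy/merge activities:

```python
# Sketch of a metadata-driven ingestion loop. In Azure Data Factory the
# metadata would live in a control table driving parameterised Copy
# activities; here a plain list of dicts plays that role.

PIPELINE_METADATA = [
    {"source": "sales.csv",     "target": "bronze.sales",     "load": "full"},
    {"source": "customers.csv", "target": "bronze.customers", "load": "incremental"},
]

def run_pipeline(metadata, loaders):
    """Dispatch each table to the loader named in its metadata row."""
    results = []
    for row in metadata:
        loader = loaders[row["load"]]  # KeyError = fail fast on unknown load type
        results.append(loader(row["source"], row["target"]))
    return results

# Stub loaders; real ones would invoke copy/merge activities.
def full_load(src, tgt):
    return f"TRUNCATE {tgt}; COPY {src} -> {tgt}"

def incremental_load(src, tgt):
    return f"MERGE {src} -> {tgt} ON watermark"

plan = run_pipeline(PIPELINE_METADATA,
                    {"full": full_load, "incremental": incremental_load})
```

Adding a new source then means adding one metadata row, not writing a new pipeline, which is the automation payoff the listing is describing.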
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
to detail with a willingness to document processes and changes. Effective communication skills and a collaborative mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/ More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
abilities, and the capacity to work productively in cross-functional teams while maintaining a continuous improvement mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/ More ❯
analytical skills and attention to detail. Preferred/Bonus Skills: Experience with Microsoft Fabric (OneLake, DirectLake, Data Warehouse, Data Activator) is a significant plus. Knowledge of Azure Synapse Analytics , Databricks , or Power BI . Experience with automated data validation or scripting (e.g., Python, PowerShell). Familiarity with CI/CD processes in data environments. Relevant certifications (e.g., Microsoft Certified: Azure More ❯
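The automated data validation scripting mentioned above can be as simple as a rule table applied row by row. A minimal sketch in Python; the column names and rules here are hypothetical, not tied to any specific product:

```python
# Minimal automated data-validation check: apply a dict of named rules
# (column -> predicate) to each row and collect the failures.

def validate(rows, rules):
    """Return a list of (row_index, column, message) for every failed check."""
    failures = []
    for i, row in enumerate(rows):
        for column, check in rules.items():
            if not check(row.get(column)):
                failures.append((i, column, f"{column} failed validation"))
    return failures

rules = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

rows = [
    {"id": 1, "email": "a@example.com"},   # passes both rules
    {"id": -5, "email": "not-an-email"},   # fails both rules
]

failures = validate(rows, rules)  # two failures, both on row index 1
```

In a CI/CD data environment, a script like this would run as a pipeline step and fail the build (or raise an alert) when `failures` is non-empty.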
/ELT, Data Lakes, Warehousing, MDM, and BI. Engineering delivery practices : Knowledge of Agile, DevOps, Git, APIs, Containers, Microservices, and Data Pipeline orchestration. Broader platform experience (desirable): Familiarity with Databricks, Snowflake, Azure Data Factory, Azure Synapse, Microsoft SQL/SSIS. Certifications that are a plus: Microsoft Certified: Fabric Analytics Engineer Associate (DP-600) Microsoft Certified: Fabric Data Engineer Associate (DP More ❯
analytics user experience, unlocking growth and operational excellence. What are we looking for? Hands-on experience designing scalable, greenfield data platforms in cloud environments using the Azure D&A stack, Databricks, and Azure OpenAI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tools (e.g., Informatica), and scalable data platforms. Knowledge of Azure Data More ❯
Advanced SQL, data transformation and data profiling skills. Experience of building production ETL/ELT pipelines at scale. 1-2 years of hands-on experience with Azure: Data Factory, Databricks, Synapse (DWH), Azure Functions, Logic Apps and other data analytics services, including streaming. Experience with Airflow and Kubernetes. Programming languages: Python (PySpark), scripting languages like Bash. Knowledge of Git, CI More ❯
with designing efficient physical data models/schemas and developing ETL/ELT scripts Some experience developing data solutions in cloud environments such as Azure, AWS or GCP - Azure Databricks experience a bonus 30-minute video interview with the People & Operations Team 45-minute technical video interview with one of our Senior Data Engineers Final interview with our Partner, Head More ❯
basis your varied role will include, but will not be limited to: Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks or Microsoft Fabric. Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and quality across the organization. Write clean, efficient, and reusable Python code tailored to More ❯
and analytics, unlocking quality growth and operational excellence. What are we looking for? Hands-on experience in designing greenfield, scalable data platforms in the cloud using the Azure D&A stack, Databricks and Azure OpenAI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in Azure More ❯
Kirkby on Bain, England, United Kingdom Hybrid / WFH Options
ANGLIAN WATER-2
delivery pipelines if the solution is to adopt modern DevOps processes. What does it take to be an Enterprise Data Engineer? Previous strong experience in data engineering, ideally using Databricks, Azure Data Factory, Spark, Python, SQL, Power BI Strong data engineering experience of at least 3-5 years Dimensional data modelling Experience in delivering end-to-end BI solutions from requirements, design to More ❯