London (City of London), South East England, United Kingdom
Mercuria
… in Computer Science, Software Engineering, or an equivalent technical discipline. 8+ years of hands-on experience building large-scale distributed data pipelines and architectures. Expert-level knowledge of Apache Spark, PySpark, and Databricks, including experience with Delta Lake, Unity Catalog, MLflow, and Databricks Workflows. Deep proficiency in Python and SQL, with proven experience building modular, testable, reusable pipeline components. Strong …
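For illustration, a minimal PySpark sketch of the kind of Databricks pipeline component this listing describes: raw files ingested, lightly cleaned, and written as a Delta Lake table. The path, column names, and table name are assumptions for illustration, not details from the listing.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_trades").getOrCreate()

# Hypothetical landing path; in practice this would come from configuration.
raw = spark.read.option("header", True).csv("/mnt/landing/trades/")

# Basic cleaning: typed dates, then deduplication on an assumed business key.
cleaned = (
    raw.withColumn("trade_date", F.to_date("trade_date"))
       .dropDuplicates(["trade_id"])
)

# Delta Lake adds ACID transactions and time travel on top of the data lake.
cleaned.write.format("delta").mode("overwrite").saveAsTable("bronze.trades")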
London (City of London), South East England, United Kingdom
KPMG UK
… to logically analyse complex requirements, processes, and systems to deliver solutions. Solid understanding of data modelling, ETL/ELT processes, and data warehousing. Proficiency in SQL and Python (especially PySpark), as well as other relevant programming languages. Passion for using data to drive key business decisions. Skills we’d love to see/Amazing Extras: Experience with Microsoft Fabric. …
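As a sketch of the modular, testable pipeline style such listings ask for, one common pattern is to keep each transformation in a pure function over DataFrames, so it can be unit-tested against a tiny in-memory DataFrame with no storage dependency. The column names below are illustrative assumptions.

from pyspark.sql import DataFrame
from pyspark.sql import functions as F

def add_net_revenue(orders: DataFrame) -> DataFrame:
    # Pure transformation: no I/O, so a unit test can construct a small
    # input DataFrame, call this function, and assert on the result.
    return orders.withColumn(
        "net_revenue", F.col("gross_amount") * (1 - F.col("discount_rate"))
    )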
Banbury, Oxfordshire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
Senior Databricks Engineer
Location: Banbury or Manchester (Hybrid - 2/3 days in office)
Salary: £60,000 - £75,000 per annum
Contract: Permanent, Full-Time
We're working with a well-established financial services organisation that's growing its data …
… data services – Databricks, ADF, ADLS, Power BI. Proficiency in SQL and data profiling for test design and validation. Hands-on experience with test automation frameworks such as Python/PySpark, Great Expectations, Pytest, or dbt tests. Practical understanding of CI/CD integration (Azure DevOps, GitHub Actions, or similar). Strong problem-solving skills and the ability to work …
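For context, a minimal sketch of the Pytest-style automated data checks this listing names. The table names and validation rules are hypothetical, chosen only to show the shape such tests typically take.

from pyspark.sql import SparkSession

def test_no_null_customer_ids():
    spark = SparkSession.builder.getOrCreate()
    customers = spark.table("silver.customers")  # hypothetical table name
    # Completeness check: the key column should never be null.
    assert customers.filter("customer_id IS NULL").count() == 0

def test_order_ids_are_unique():
    spark = SparkSession.builder.getOrCreate()
    orders = spark.table("silver.orders")  # hypothetical table name
    # Uniqueness check: row count unchanged after deduplicating on the key.
    assert orders.count() == orders.dropDuplicates(["order_id"]).count()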
Watford, Hertfordshire, East Anglia, United Kingdom
Akkodis
… lake and Azure Monitor providing added flexibility for diverse migration and integration projects. Prior experience with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (star schema or snowflake schema). Skilled in security frameworks such as GDPR, HIPAA, ISO …
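As an illustrative sketch of the star-schema modelling mentioned here: a dimension table derived from extracted source rows, with a surrogate key added for joins from fact tables. Every name below is an assumption for illustration.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical staging table produced by the extraction step.
src = spark.table("staging.crm_customers")

dim_customer = (
    src.select("customer_id", "customer_name", "country")
       .dropDuplicates(["customer_id"])
       # Surrogate key referenced by fact tables in the star schema.
       .withColumn("customer_key", F.monotonically_increasing_id())
)

dim_customer.write.mode("overwrite").saveAsTable("warehouse.dim_customer")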
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
… including Azure DevOps or GitHub. Considerable experience designing and building operationally efficient pipelines, utilising core cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer, and PySpark. Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use. Strong understanding and/or use of …
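A hedged sketch of the medallion flow this listing describes, assuming Delta tables and illustrative names: bronze holds raw records, silver applies typing and quality rules, and gold carries the curated model built for analytical use.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.table("bronze.sales_raw")  # hypothetical raw layer

# Silver: typed, quality-filtered records.
silver = (
    bronze.filter(F.col("amount").isNotNull())
          .withColumn("sale_date", F.to_date("sale_date"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.sales")

# Gold: curated aggregate at the analytical grain.
gold = (
    silver.groupBy("sale_date", "product_id")
          .agg(F.sum("amount").alias("total_sales"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.fact_daily_sales")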
Sutton Coldfield, Birmingham, West Midlands (County), United Kingdom
SF Recruitment
… end (Data Factory, Synapse, Fabric, or Databricks). Strong SQL development and data modelling capability. Experience integrating ERP or legacy systems into cloud data platforms. Proficiency in Python or PySpark for transformation and automation. Understanding of data governance, access control, and security within Azure. Hands-on experience preparing data for Power BI or other analytics tools. Excellent communication skills …
… to adapt quickly to changing environments and priorities, maintaining effectiveness in dynamic situations. Proficiency using SQL Server in a highly transactional environment. Experience in either C# or Python/PySpark for data engineering or development tasks. Strong understanding of DevOps principles and experience with relevant tools, e.g. Azure DevOps, Git, and Terraform, for CI/CD, automation, and infrastructure management. …