London (City of London), South East England, United Kingdom
Mercuria
… in Computer Science, Software Engineering, or equivalent technical discipline. 8+ years of hands-on experience building large-scale distributed data pipelines and architectures. Expert-level knowledge in Apache Spark, PySpark, and Databricks, including experience with Delta Lake, Unity Catalog, MLflow, and Databricks Workflows. Deep proficiency in Python and SQL, with proven experience building modular, testable, reusable pipeline components. Strong …
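The "modular, testable, reusable pipeline components" this listing asks for usually means small pure transformation functions that can be unit-tested and composed. A minimal sketch in plain Python (in PySpark each step would take and return a DataFrame; dict records are used here so it runs without a Spark cluster, and all names are illustrative):

```python
# Sketch only: reusable pipeline steps composed into a pipeline.
# In PySpark these would be DataFrame -> DataFrame functions chained
# with DataFrame.transform; plain dicts keep the example self-contained.
from typing import Callable, Iterable, List

Record = dict
Step = Callable[[Iterable[Record]], List[Record]]

def drop_nulls(field: str) -> Step:
    """Reusable step: keep records where `field` is present and non-null."""
    def step(records: Iterable[Record]) -> List[Record]:
        return [r for r in records if r.get(field) is not None]
    return step

def rename(old: str, new: str) -> Step:
    """Reusable step: rename one column in every record."""
    def step(records: Iterable[Record]) -> List[Record]:
        return [{(new if k == old else k): v for k, v in r.items()}
                for r in records]
    return step

def run_pipeline(records: Iterable[Record], steps: List[Step]) -> List[Record]:
    """Apply each step in order -- the composition point that makes
    individual steps testable in isolation."""
    out = list(records)
    for s in steps:
        out = s(out)
    return out
```

Because each step is a plain function, a unit test can exercise `drop_nulls("trade_id")` on a two-row fixture without any cluster setup.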
… of software engineering experience, with recent experience in cloud data platforms Working knowledge of Databricks (e.g., Delta Lake, Unity Catalog, DLT, Auto Loader) Solid grasp of programming (Python, SQL, PySpark, etc.), data modeling, and database management (SQL/NoSQL) Applies technical knowledge to process all data formats: structured, semi-structured (e.g., JSON, Parquet), and unstructured (e.g., logs, text, clickstream) …
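The structured / semi-structured / unstructured distinction above can be sketched as a small format dispatcher. This is a stdlib-only illustration (at scale, Spark readers such as `spark.read.json` or `spark.read.parquet` do this job); the function and format names are invented for the example:

```python
# Hedged sketch: one parser per data class.
# - csv:  structured, fixed columns known up front
# - json: semi-structured, newline-delimited, schema-on-read
# - log:  unstructured, keep the raw line for later parsing
import csv
import io
import json

def parse_records(payload: str, fmt: str) -> list:
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        return [json.loads(line)
                for line in payload.splitlines() if line.strip()]
    if fmt == "log":
        return [{"raw": line} for line in payload.splitlines()]
    raise ValueError(f"unsupported format: {fmt}")
```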
London (City of London), South East England, United Kingdom
KPMG UK
… to logically analyse complex requirements, processes, and systems to deliver solutions. Solid understanding of data modelling, ETL/ELT processes, and data warehousing. Proficiency in SQL and Python (especially PySpark), as well as other relevant programming languages. Passion for using data to drive key business decisions. Skills we’d love to see / Amazing Extras: Experience with Microsoft Fabric. …
Banbury, Oxfordshire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
Senior Databricks Engineer
Location: Banbury or Manchester (Hybrid, 2-3 days in office)
Salary: £60,000 - £75,000 per annum
Contract: Permanent, Full-Time
We're working with a well-established financial services organisation that's growing its data …
Watford, Hertfordshire, East Anglia, United Kingdom
Akkodis
… lake and Azure Monitor providing added flexibility for diverse migration and integration projects. Prior experience with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with Data warehousing and Data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO …
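The star schema mentioned above joins a central fact table to dimension tables on surrogate keys. A minimal sketch with plain-Python dicts (in a warehouse this would be SQL or PySpark joins; all table and column names here are made up):

```python
# Illustrative star-schema join: fact rows carry surrogate keys that
# resolve against small dimension tables.
dim_product = {1: {"name": "widget"}, 2: {"name": "gadget"}}
dim_date = {20240101: {"quarter": "Q1"}}

fact_sales = [
    {"product_key": 1, "date_key": 20240101, "amount": 9.99},
    {"product_key": 2, "date_key": 20240101, "amount": 4.50},
]

def enrich(fact_rows):
    """Resolve each fact row's keys against the dimensions (the star join)."""
    return [
        {**row,
         "product": dim_product[row["product_key"]]["name"],
         "quarter": dim_date[row["date_key"]]["quarter"]}
        for row in fact_rows
    ]
```

A snowflake schema differs only in that the dimensions themselves are further normalised into sub-dimension tables.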
… to adapt quickly to changing environments and priorities, maintaining effectiveness in dynamic situations Proficiency using SQL Server in a highly transactional environment. Experience in either C# or Python/PySpark for data engineering or development tasks. Strong understanding of DevOps principles and experience with relevant tools e.g., Azure DevOps, Git, Terraform for CI/CD, automation, and infrastructure management. …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
… including Azure DevOps or GitHub Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components, such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark etc. Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use Strong understanding and/or use of …
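The medallion architecture named above layers data as bronze (raw), silver (cleaned and typed), and gold (aggregated, analytics-ready). A hedged sketch in plain Python (in Databricks each layer would be a Delta table and each step a PySpark job; the field names are invented for illustration):

```python
# Sketch of the medallion pattern: bronze -> silver -> gold.
from collections import defaultdict

def to_silver(bronze):
    """Clean the raw layer: drop malformed rows, normalise region,
    cast amount to float."""
    out = []
    for r in bronze:
        try:
            out.append({"region": r["region"].strip().lower(),
                        "amount": float(r["amount"])})
        except (KeyError, ValueError, AttributeError):
            continue  # a real pipeline would quarantine these rows
    return out

def to_gold(silver):
    """Aggregate the clean layer: total amount per region --
    the curated model the gold layer serves to analysts."""
    totals = defaultdict(float)
    for r in silver:
        totals[r["region"]] += r["amount"]
    return dict(totals)
```

The point of the layering is that gold can always be rebuilt from silver, and silver from bronze, so cleaning rules can change without re-ingesting source data.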
London, England, United Kingdom Hybrid / WFH Options
Client Server
… are an experienced Data Engineer within financial services environments You have expertise with GCP including BigQuery, Pub/Sub, Cloud Composer and IAM You have strong Python, SQL and PySpark skills You have experience with real-time data streaming using Kafka or Spark You have a good knowledge of Data Lakes, Data Warehousing, Data Modelling You're familiar with …