with multiple data analytics tools (e.g. Power BI)
- Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling
- Proficiency in programming languages (Python/PySpark, SQL)
- Experience in data pipeline orchestration (e.g. Airflow, Data Factory)
- Familiarity with DevOps and CI/CD practices (Git, Azure DevOps, etc.)
- Ability to communicate technical concepts to both …
City of London, London, United Kingdom Hybrid / WFH Options
ECS
Azure services
Requirements:
- 10+ years in cloud data engineering, with a strong focus on building scalable data pipelines
- Expertise in Azure Databricks, including building and managing ETL pipelines using PySpark or Scala
- Solid understanding of Apache Spark, Delta Lake, and distributed data processing concepts
- Hands-on experience with Azure Data Lake Storage, Azure Data Factory, and Azure Synapse Analytics …
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
are an experienced Data Engineer within financial services environments
- You have expertise with GCP including BigQuery, Pub/Sub, Cloud Composer and IAM
- You have strong Python, SQL and PySpark skills
- You have experience with real-time data streaming using Kafka or Spark
- You have a good knowledge of Data Lakes, Data Warehousing, Data Modelling
- You're familiar with …
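The real-time streaming skills above (Kafka or Spark) centre on event-time windowed aggregation with late-event handling. As a rough illustration of the idea only — not the PySpark API itself — here is a minimal pure-Python sketch of tumbling-window counts with a watermark; all names and parameters are invented for the example.

```python
from collections import defaultdict

def window_counts(events, window_size=10, watermark_delay=5):
    """Count events per tumbling event-time window, dropping events that
    arrive later than the watermark allows (a toy model of the idea behind
    Spark's withWatermark + groupBy(window(...))).
    Each event is a (event_time, key) tuple, processed in arrival order."""
    counts = defaultdict(int)   # (window_start, key) -> count
    max_event_time = 0          # the watermark trails the max event time seen
    dropped = []
    for event_time, key in events:
        max_event_time = max(max_event_time, event_time)
        watermark = max_event_time - watermark_delay
        if event_time < watermark:  # too late: outside the allowed lateness
            dropped.append((event_time, key))
            continue
        window_start = (event_time // window_size) * window_size
        counts[(window_start, key)] += 1
    return dict(counts), dropped

# Arrival order differs from event time; the event at t=1 arrives after
# the watermark has advanced past it, so it is dropped.
events = [(2, "a"), (12, "a"), (14, "b"), (1, "a"), (11, "a")]
result, dropped = window_counts(events)
```

In a real Spark job the engine tracks the watermark across micro-batches and evicts finalized window state; the sketch only shows why a watermark bounds how long late data is accepted.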
Key Skills:
- Strong SQL skills and experience with relational databases.
- Hands-on experience with Azure (ADF, Synapse, Data Lake) or AWS/GCP equivalents.
- Familiarity with scripting languages (Python, PySpark).
- Knowledge of data modelling and warehouse design (Kimball, Data Vault).
- Exposure to Power BI to support optimised data models for reporting.
- Agile team experience, CI/CD …
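Kimball-style dimensional modelling, asked for above, means a fact table of numeric measures joined to descriptive dimension tables via surrogate keys. As a loose illustration of that query shape — the table and column names are invented for the example — here is a small pure-Python sketch:

```python
# Dimension tables: surrogate key -> descriptive attributes.
dim_product = {
    1: {"product_name": "Widget", "category": "Hardware"},
    2: {"product_name": "Gadget", "category": "Electronics"},
}

# Fact table: foreign keys into the dimensions plus numeric measures.
fact_sales = [
    {"date_key": 20240101, "product_key": 1, "amount": 120.0},
    {"date_key": 20240101, "product_key": 2, "amount": 80.0},
    {"date_key": 20240102, "product_key": 1, "amount": 50.0},
]

def sales_by_category(facts, products):
    """Roll the fact table up to category level: the classic star-schema
    query shape (JOIN dimension, GROUP BY attribute, SUM measure)."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals
```

The same rollup in a warehouse would be a `JOIN` on `product_key` plus `GROUP BY category`; keeping measures in the fact table and attributes in dimensions is what makes such queries cheap and uniform.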
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
collaboration, learning, and innovation
What we're looking for:
- Hands-on experience with the Azure Data Engineering stack (ADF, Databricks, Synapse, Data Lake)
- Strong skills in SQL and Python (PySpark experience is a bonus)
- Experience building and optimising ETL/ELT pipelines
- A background in Financial Services is a plus, but not essential
- A curious mindset and the ability …
East London, London, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
while establishing and maintaining best practices for data governance, security, and reliability.
What we're looking for:
- 5+ years' experience in data engineering and cloud infrastructure
- Expertise in Python (PySpark), SQL, and dbt (or similar)
- Strong DevOps skills: Terraform, Docker, CI/CD, monitoring/logging tools
- AWS experience: S3, ECS/Fargate, RDS, Lambda
- Data warehousing experience (PostgreSQL …
leadership and upskilling responsibilities.
Key Responsibilities:
- Build and maintain Databricks Delta Live Tables (DLT) pipelines across Bronze → Silver → Gold layers, ensuring quality, scalability, and reliability.
- Develop and optimise Spark (PySpark) jobs for large-scale distributed processing.
- Design and implement streaming data pipelines with Kafka/MSK, applying best practices for late event handling and throughput.
- Use Terraform and CI …
Employment Type: Contract
Rate: Up to £0.00 per day + Flexible depending on experience
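The Bronze → Silver → Gold (medallion) layering in the listing above is, at heart, a sequence of increasingly refined transformations: raw ingestion, cleaned and validated records, then business-level aggregates. A minimal pure-Python sketch of that shape follows; the field names and validation rules are invented, and a real DLT pipeline would express each step as a table definition over Spark DataFrames rather than plain functions.

```python
def to_silver(bronze_rows):
    """Clean and validate raw (bronze) records: drop rows with missing
    keys and normalise types, keeping only well-formed events."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # expectation failed: drop (or quarantine) the record
        silver.append({
            "order_id": str(row["order_id"]),
            "region": (row.get("region") or "unknown").lower(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned (silver) records into a business-level summary."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "region": "EMEA", "amount": "10.5"},
    {"order_id": None, "region": "EMEA", "amount": "3.0"},  # dropped in silver
    {"order_id": 2, "region": None, "amount": 4.5},
]
gold = to_gold(to_silver(bronze))
```

The point of the layering is that each stage is independently testable and replayable: bronze preserves the raw feed, silver enforces quality expectations, and gold serves reporting.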
London (City of London), South East England, United Kingdom
MathCo
CI/CD adoption across teams.
- Act as a trusted advisor, simplifying technical concepts and communicating clearly with business stakeholders.
- Develop and maintain data pipelines using Azure ADF, Databricks, PySpark, and Delta Lake.
- Build and optimise workflows in Python and SQL to support supply chain, sales, and marketing analytics.
- Contribute to CI/CD pipelines using GitHub Actions (or …
London (City of London), South East England, United Kingdom
Miryco Consultants Ltd
and machine learning use cases.
- Support the migration of legacy reporting tools into Databricks and modern BI solutions.
Key Skills & Experience
Essential:
- Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake).
- Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik).
- Strong SQL and data modelling skills.
- Experience working with large, complex financial …