East London, London, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
while establishing and maintaining best practices for data governance, security, and reliability.
What we're looking for:
- 5+ years' experience in data engineering and cloud infrastructure
- Expertise in Python (PySpark), SQL, and dbt (or similar)
- Strong DevOps skills: Terraform, Docker, CI/CD, monitoring/logging tools
- AWS experience: S3, ECS/Fargate, RDS, Lambda
- Data warehousing experience (PostgreSQL …
City of London, London, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
Altrincham, Greater Manchester, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
Bolton, Greater Manchester, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
Leigh, Greater Manchester, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
Central London / West End, London, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
Bury, Greater Manchester, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
Ashton-Under-Lyne, Greater Manchester, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
leadership and upskilling responsibilities.
Key Responsibilities:
- Build and maintain Databricks Delta Live Tables (DLT) pipelines across Bronze → Silver → Gold layers, ensuring quality, scalability, and reliability.
- Develop and optimise Spark (PySpark) jobs for large-scale distributed processing.
- Design and implement streaming data pipelines with Kafka/MSK, applying best practices for late event handling and throughput.
- Use Terraform and CI …
Employment Type: Contract
Rate: Flexible, depending on experience
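The "late event handling" the role mentions can be sketched independently of Kafka or Spark. A minimal illustration, in plain Python, of an event-time watermark with an allowed-lateness window (names and thresholds here are hypothetical; a real PySpark job would use `withWatermark` on a streaming DataFrame):

```python
from dataclasses import dataclass, field


@dataclass
class Event:
    key: str
    event_time: int  # event timestamp, epoch seconds
    value: float


@dataclass
class WatermarkBuffer:
    """Toy event-time watermark: an event is accepted unless it arrives
    more than `allowed_lateness` seconds behind the newest event seen."""
    allowed_lateness: int
    max_event_time: int = 0
    accepted: list = field(default_factory=list)
    dropped: list = field(default_factory=list)

    def ingest(self, ev: Event) -> bool:
        # Advance the high-water mark, then compute the current watermark.
        self.max_event_time = max(self.max_event_time, ev.event_time)
        watermark = self.max_event_time - self.allowed_lateness
        if ev.event_time < watermark:
            self.dropped.append(ev)  # too late: excluded from aggregation
            return False
        self.accepted.append(ev)
        return True
```

For example, with `allowed_lateness=10`, an event timestamped 95 arriving after one timestamped 100 is still accepted, while one timestamped 80 is dropped.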
London (City of London), South East England, United Kingdom
develop
CI/CD adoption across teams.
- Act as a trusted advisor, simplifying technical concepts and communicating clearly with business stakeholders.
- Develop and maintain data pipelines using Azure ADF, Databricks, PySpark, and Delta Lake.
- Build and optimise workflows in Python and SQL to support supply chain, sales, and marketing analytics.
- Contribute to CI/CD pipelines using GitHub Actions (or …
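The Delta Lake pipeline work described above typically centres on keyed upserts (merges). A simplified sketch in plain Python of the merge semantics (this is a toy stand-in; real pipelines would use `DeltaTable.merge` in PySpark, and the `id` key is a hypothetical example):

```python
def upsert(target: dict, updates: list, key: str = "id") -> dict:
    """Merge update rows into a target table keyed by `key`:
    matching keys are overwritten (UPDATE), new keys are added (INSERT).
    The original target mapping is left untouched."""
    merged = dict(target)  # copy so the source table is not mutated
    for row in updates:
        merged[row[key]] = row
    return merged
```

Usage: starting from `{"1": {"id": "1", "qty": 5}}`, applying updates for ids "1" and "2" overwrites the first row and appends the second, mirroring `WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT`.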
London (City of London), South East England, United Kingdom
MathCo
and machine learning use cases. Support the migration of legacy reporting tools into Databricks and modern BI solutions.
Key Skills & Experience (Essential):
- Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake).
- Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik).
- Strong SQL and data modelling skills.
- Experience working with large, complex financial …
London (City of London), South East England, United Kingdom
Miryco Consultants Ltd
data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices.
Key Responsibilities:
- Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark).
- Ensure performance, scalability, and compliance (GxP and other standards).
- Collaborate on requirements, design, and backlog refinement.
- Promote engineering best practices including CI/CD, code reviews, and …
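The quality and compliance checks such pipelines enforce can be sketched as a simple row-level validation gate. A minimal illustration in plain Python (the field names and rules are hypothetical; in Databricks these would usually be expressed as DLT expectations or PySpark filters):

```python
def validate_rows(rows, required, non_negative=()):
    """Split rows into (valid, rejected).
    A row is valid when every field in `required` is present and non-None,
    and every field in `non_negative` (if present) is >= 0."""
    valid, rejected = [], []
    for row in rows:
        ok = all(row.get(f) is not None for f in required)
        ok = ok and all(row.get(f, 0) >= 0 for f in non_negative)
        (valid if ok else rejected).append(row)
    return valid, rejected
```

Rejected rows would typically be routed to a quarantine table for inspection rather than silently discarded, which keeps the pipeline auditable.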