Kent, England, United Kingdom Hybrid / WFH Options
Searchability®
Hands-on with Azure data services (Data Lake, Synapse, SQL DB, Functions, Logic Apps)
Understanding of data security, access control and governance in regulated environments
Proficiency in Python or PySpark for data engineering tasks
TO BE CONSIDERED... Please either apply online or email me directly at chelsea.hackett@searchability.com. By applying to this role you give express consent …
Watford, Hertfordshire, East Anglia, United Kingdom
Akkodis
lake and Azure Monitor providing added flexibility for diverse migration and integration projects.
Prior experience with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation.
Prior experience with data warehousing and data modelling (Star Schema or Snowflake Schema).
Skilled in security frameworks such as GDPR, HIPAA, ISO …
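The Star Schema mentioned above organises analytics data as a central fact table joined to dimension tables. A minimal sketch of the idea in plain Python (the table contents and column names are hypothetical, standing in for warehouse SQL):

```python
# Minimal star-schema sketch: one fact table joined to two dimensions.
# All table contents and column names here are hypothetical illustrations.
dim_product = {1: {"name": "Widget", "category": "Hardware"},
               2: {"name": "Gadget", "category": "Electronics"}}
dim_date = {20240101: {"year": 2024, "month": 1}}

# Fact rows hold measures plus foreign keys into the dimensions.
fact_sales = [
    {"product_id": 1, "date_id": 20240101, "amount": 100.0},
    {"product_id": 2, "date_id": 20240101, "amount": 250.0},
]

def sales_by_category(facts, products):
    """Aggregate the fact table by a dimension attribute (a star-schema join)."""
    totals = {}
    for row in facts:
        category = products[row["product_id"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))
# {'Hardware': 100.0, 'Electronics': 250.0}
```

A snowflake schema differs only in that dimensions are further normalised (e.g. `category` split into its own table keyed from `dim_product`).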
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
including Azure DevOps or GitHub
Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark
Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use
Strong understanding and/or use of …
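The medallion architecture referenced above stages data through bronze (raw), silver (cleaned/validated) and gold (curated, analytics-ready) layers. A minimal plain-Python sketch of the pattern, with hypothetical record shapes rather than real PySpark tables:

```python
# Medallion-style layering sketch in plain Python (not PySpark).
# bronze = raw records, silver = cleaned/deduplicated, gold = curated aggregate.
# Record shapes and values are hypothetical.
bronze = [
    {"order_id": "1", "region": " EMEA ", "value": "100"},
    {"order_id": "2", "region": "emea", "value": "50"},
    {"order_id": "2", "region": "emea", "value": "50"},   # duplicate row
    {"order_id": "3", "region": "APAC", "value": "bad"},  # invalid measure
]

def to_silver(rows):
    """Clean and deduplicate: normalise region, cast value, drop bad rows."""
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        try:
            value = float(r["value"])
        except ValueError:
            continue  # quarantine/drop rows that fail validation
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"],
                    "region": r["region"].strip().upper(),
                    "value": value})
    return out

def to_gold(rows):
    """Curated aggregate for analytical use: total value per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["value"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
```

In a real pipeline each layer would be a persisted table (e.g. Delta), with the gold layer modelled dimensionally as the listing describes.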
London, South East England, United Kingdom Hybrid / WFH Options
Client Server
are an experienced Data Engineer within financial services environments
You have expertise with GCP including BigQuery, Pub/Sub, Cloud Composer and IAM
You have strong Python, SQL and PySpark skills
You have experience with real-time data streaming using Kafka or Spark
You have a good knowledge of Data Lakes, Data Warehousing, Data Modelling
You're familiar with …
Warrington, Cheshire, North West England, United Kingdom
Scrumconnect Consulting
Experience
Active SC Clearance (mandatory at application stage).
Proven expertise in:
Azure Data Factory & Azure Synapse
Azure DevOps & Microsoft Azure ecosystem
Power BI (including semantic models)
Python (incl. PySpark) and advanced SQL
dbt with SQL DBs (data transformation & modelling)
Dimensional data modelling
Terraform for infrastructure-as-code deployments
Strong experience with both structured and unstructured data.
Delivery track …
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
collaboration, learning, and innovation
What we're looking for:
Hands-on experience with the Azure Data Engineering stack (ADF, Databricks, Synapse, Data Lake)
Strong skills in SQL and Python (PySpark experience is a bonus)
Experience building and optimising ETL/ELT pipelines
A background in Financial Services is a plus, but not essential
A curious mindset and the ability …
part of our Data Engineering Team. You will not only maintain and optimise our data infrastructure but also spearhead its evolution. Built predominantly on AWS, and utilising technologies like PySpark and Iceberg, our infrastructure is designed for scalability, robustness, and efficiency. You will be part of developing sophisticated data integrations with various platforms, developing real-time data solutions, and improving …
Bury, Greater Manchester, United Kingdom Hybrid / WFH Options
Futureheads Recruitment | B Corp™
while establishing and maintaining best practices for data governance, security, and reliability.
What we’re looking for:
5+ years’ experience in data engineering and cloud infrastructure
Expertise in Python (PySpark), SQL, and dbt (or similar)
Strong DevOps skills: Terraform, Docker, CI/CD, monitoring/logging tools
AWS experience: S3, ECS/Fargate, RDS, Lambda
Data warehousing experience (PostgreSQL …
London (City of London), South East England, United Kingdom
develop
leadership and upskilling responsibilities.
Key Responsibilities:
Build and maintain Databricks Delta Live Tables (DLT) pipelines across Bronze → Silver → Gold layers, ensuring quality, scalability, and reliability.
Develop and optimise Spark (PySpark) jobs for large-scale distributed processing.
Design and implement streaming data pipelines with Kafka/MSK, applying best practices for late event handling and throughput.
Use Terraform and CI …
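The late event handling mentioned above is usually watermark-based: events arriving too far behind the maximum event time seen so far are dropped or diverted rather than reopening old aggregation windows. A simplified pure-Python sketch of the pattern (the 10-second watermark and event fields are hypothetical; a real Spark Structured Streaming or DLT job would declare this with `withWatermark`):

```python
# Watermark-based late-event handling sketch (pure Python, not Spark).
# The 10-second watermark and the event fields are hypothetical.
WATERMARK_SECONDS = 10

def process_events(events):
    """Split a stream into accepted and too-late events.

    The watermark trails the maximum event time seen so far; any event
    older than (max_event_time - WATERMARK_SECONDS) is treated as too late.
    """
    max_seen = float("-inf")
    accepted, dropped = [], []
    for event in events:  # events iterate in arrival (wall-clock) order
        max_seen = max(max_seen, event["event_time"])
        if event["event_time"] >= max_seen - WATERMARK_SECONDS:
            accepted.append(event)
        else:
            dropped.append(event)
    return accepted, dropped

events = [
    {"id": "a", "event_time": 100},
    {"id": "b", "event_time": 112},
    {"id": "c", "event_time": 105},  # 7s behind max: within watermark, kept
    {"id": "d", "event_time": 90},   # 22s behind max: too late, dropped
]
accepted, dropped = process_events(events)
```

Widening the watermark keeps more late events at the cost of holding state (and delaying results) longer, which is the throughput trade-off the listing alludes to.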
CI/CD adoption across teams.
Act as a trusted advisor, simplifying technical concepts and communicating clearly with business stakeholders.
Develop and maintain data pipelines using Azure ADF, Databricks, PySpark, and Delta Lake.
Build and optimise workflows in Python and SQL to support supply chain, sales, and marketing analytics.
Contribute to CI/CD pipelines using GitHub Actions (or …