Integrate asynchronous workflows using message brokers such as AWS SQS.
*Own and evolve containerised deployment pipelines using Docker and CI/CD principles.
*Develop and manage data pipelines with Apache Airflow, with data transformation using Python and Pandas.
Essential Skills & Experience
Backend & Microservices
*Proficiency in Java with a good understanding of microservices architecture.
*Experience with Spring Boot, Spring …/CD practices.
Data Engineering & Scripting
*Scripting and data manipulation skills using Python. (Nice to have)
*Proficient with Pandas for handling and transforming complex datasets.
*Hands-on experience with Apache Airflow for data orchestration and pipeline scheduling.
Preferred Qualifications
*Experience in cloud-based environments (especially AWS).
*Familiarity with legacy migration strategies and monolith-to-microservices transitions.
*Background …
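The Airflow-plus-Pandas responsibility above typically reduces to small, testable transformation steps like the sketch below. The DataFrame columns, the fill-per-group cleaning rule, and the aggregation are purely illustrative assumptions, not taken from any specific pipeline:

```python
import pandas as pd

# Hypothetical sensor readings; column names are illustrative only.
raw = pd.DataFrame({
    "sensor_id": ["a", "a", "b", "b"],
    "reading": [1.0, None, 3.0, 5.0],
})

# Typical cleaning step: fill gaps with each sensor's own mean,
# then aggregate to one row per sensor.
clean = raw.assign(
    reading=raw.groupby("sensor_id")["reading"].transform(lambda s: s.fillna(s.mean()))
)
summary = clean.groupby("sensor_id", as_index=False)["reading"].mean()
```

`groupby(...).transform` keeps the original row count, which is what lets the filled column be assigned straight back onto the frame before the final aggregation.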
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
monitor machine learning models for anomaly detection and failure prediction. Analyze sensor data and operational logs to support predictive maintenance strategies. Develop and maintain data pipelines using tools like Apache Airflow for efficient workflows. Use MLflow for experiment tracking, model versioning, and deployment management. Contribute to data cleaning, feature engineering, and model evaluation processes. Collaborate with engineers and … science libraries (Pandas, Scikit-learn, etc.). Solid understanding of machine learning concepts and algorithms. Interest in working with real-world industrial or sensor data. Exposure to Apache Airflow and/or MLflow (through coursework or experience) is a plus. A proactive, analytical mindset with a willingness to learn and collaborate. Why Join Us Work on …
GCP, AWS, or Azure). Optimise Python performance through effective use of loops, vectorization, threading, and understanding of the Global Interpreter Lock (GIL). Desirable Experience Experience with Kubeflow, Airflow, or dbt. Hands-on experience with Google Cloud Platform (GCP). Knowledge of containerisation (Docker, Kubernetes) and CI/CD practices. Familiarity with data warehouse technologies (BigQuery, Redshift …
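The threading/GIL point above can be illustrated in a few lines: a thread pool pays off when workers spend their time blocked on I/O (where CPython releases the GIL), while pure-Python CPU loops gain nothing from extra threads. The `simulate_io` helper is a hypothetical stand-in for a network or disk call:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_io(n):
    # Stand-in for an I/O-bound call (HTTP request, file read, DB query).
    # During real blocking I/O the GIL is released, so a thread pool gives
    # genuine concurrency. For CPU-bound loops the GIL serialises threads,
    # and vectorisation (NumPy) or multiprocessing is the better tool.
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(simulate_io, range(5)))
```

`pool.map` preserves input order regardless of which thread finishes first, which keeps results deterministic.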
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus. …
Snowflake, Databricks) Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset, Lightdash or OpenSearch Willingness to work across the stack by contributing to API development and, at times, UI components (Vue.js, Zoho, or similar). Excellent communication and collaboration …
London, South East, England, United Kingdom Hybrid / WFH Options
Adecco
you the chance to work on cutting-edge solutions that make a real impact.
Key Responsibilities
* Data Engineering: Design and implement data pipelines, lakes, and warehouses using tools like Spark, Airflow, or dbt.
* API & Microservices Development: Build secure, efficient APIs and microservices for data integration.
* Full Stack Development: Deliver responsive, high-performance web applications using React (essential), plus Angular or …
Profile: Proven experience as a Data Engineer, with strong expertise in designing and managing large-scale data systems. Hands-on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery …
move us towards our vision of scaling up through product-led growth. This role will be focused on our backend system (Symfony, PHP) and our data products (BigQuery, DBT, Airflow), but there will be opportunities to work across the platform including agentic AI (Python, Langchain), frontend (React, TypeScript), the APIs (GraphQL, REST), our integration tool of choice (Tray.ai) and …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd,
Hybrid) (3 days onsite & 2 days remote) Role Type: 6 Months Contract with possibility of extensions Mandatory Skills: Python, Postgres SQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python. Exposed to Python …
platforms. Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able …
Manchester Area, United Kingdom Hybrid / WFH Options
POWWR
or Tableau, including data modeling and performance optimization. Advanced SQL skills, data modeling, and proficiency with dbt Core for modular SQL transformations and testing. Experience orchestrating pipelines with Dagster, Airflow, or similar tools. Familiarity with Python for data manipulation, orchestration, and automation. Experience deploying and managing data workloads on Kubernetes (AKS preferred). Experience working within a DevOps or …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
REST APIs and integration techniques · Familiarity with data visualization tools and libraries (e.g., Power BI) · Background in database administration or performance tuning · Familiarity with data orchestration tools, such as Apache Airflow · Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing · Strong analytical skills, including a thorough understanding of how to interpret customer business requirements …
of Science Degree in software engineering or a related field Proficiency in English spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM …
Stockport, England, United Kingdom Hybrid / WFH Options
Gravitas Recruitment Group (Global) Ltd
and other squads to ensure smooth releases and integration. Key Skills Data Modelling Python & SQL AWS/Redshift 3–5+ years of experience in data engineering Nice to Have Airflow, Tableau, Power BI, Snowflake, Databricks Data governance/data quality tooling Degree preferred Atlassian/Jira, CI/CD, Terraform Why Join? Career Growth: Clear progression to Tech Lead. …
Greater London, England, United Kingdom Hybrid / WFH Options
Xcede
with product managers, data & AI engineers to deliver solutions to major business problems, including Generative AI applications What we’re looking for: Hands-on expertise in Python, SQL, Spark, Airflow, Terraform Proven experience with cloud-native data platforms (AWS, Databricks, Snowflake) Strong track record of mentoring or coaching other engineers Knowledge of CI/CD tooling (Git, Jenkins or …
City of London, London, United Kingdom Hybrid / WFH Options
twentyAI
AWS/GCP/Azure/Snowflake) Passion for data quality, scalability, and collaboration Nice to have: Experience with SaaS products, analytics tooling, or modern data stack tools (dbt, Airflow). Perks: EMI share options • Training budget • Private healthcare • Pension • 25 days holiday + bank holidays • Central London office & socials • Work abroad up to 1 month/year Why …
Manchester, England, United Kingdom Hybrid / WFH Options
Client Server
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and duplication …
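Transformation logic of the kind described here, standardising formats and dropping duplicates, is often a few lines of plain Python before any orchestrator is involved. The record fields and the normalise-then-dedupe rule below are invented for illustration:

```python
# Hypothetical raw records; field names and values are illustrative.
records = [
    {"email": "A@Example.com ", "amount": "10"},
    {"email": "a@example.com", "amount": "10"},
    {"email": "b@example.com", "amount": None},
]

def standardise(rec):
    # Normalise formats: trim and lowercase the key field,
    # coerce the numeric field, defaulting missing values to 0.0.
    return {
        "email": rec["email"].strip().lower(),
        "amount": float(rec["amount"]) if rec["amount"] is not None else 0.0,
    }

seen, cleaned = set(), []
for rec in map(standardise, records):
    if rec["email"] not in seen:  # deduplicate only after standardising
        seen.add(rec["email"])
        cleaned.append(rec)
```

Standardising before deduplicating matters: `"A@Example.com "` and `"a@example.com"` only collapse into one record once both are trimmed and lowercased.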
London, England, United Kingdom Hybrid / WFH Options
Harnham
practices (testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, Datahub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS …
business • Expert knowledge of modern cloud data platforms (Databricks, Snowflake, ideally AWS) • Advanced Python programming skills and fluency with the wider Python data toolkit • Strong capability with SQL, Spark, Airflow, Terraform, and workflow orchestration tools • Solid understanding of CI/CD practices using Git, Jenkins, or equivalent tooling • Hands-on experience building both batch and streaming data integration pipelines • Depth in …