Apache Airflow Jobs in the UK

1 to 25 of 262 Apache Airflow Jobs in the UK

Data Engineer (SC cleared)

Guildford, Surrey, United Kingdom
Hybrid / WFH Options
Stott and May
Start: ASAP. Duration: 12 months. Location: Mostly Remote - must have access to London or Bristol. Pay: negotiable, INSIDE IR35. Responsibilities: - Design and implement robust ETL/ELT data pipelines using Apache Airflow - Build ingestion processes from internal systems and APIs, using Kafka, Spark, AWS - Develop and maintain data lakes and warehouses (AWS S3, Redshift) - Ensure governance using automated testing … manage CI/CD pipelines for data deployments and ensure version control of DAGs - Apply best practice in security and compliance. Required Tech Skills: - Python and SQL for processing - Apache Airflow, writing Airflow DAGs and configuring Airflow jobs - AWS cloud platform and services like S3, Redshift - Familiarity with big data processing using Apache Spark - Knowledge …
Employment Type: Permanent
Salary: GBP Annual
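Several of the listings on this page ask for hands-on Airflow DAG authoring. For readers new to the term, a DAG wires task callables into an ordered pipeline. The sketch below shows the extract/transform/load chaining in plain Python so it runs without Airflow installed; the task names and toy data are invented for illustration, and real Airflow code would declare a `DAG` object and operators instead.

```python
# Plain-Python sketch of the extract >> transform >> load chaining
# that an Airflow DAG formalises. All names and data are illustrative.

def extract():
    # Stand-in for pulling rows from an API or internal system.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "4.0"}]

def transform(rows):
    # Typical cleaning step: cast string amounts to floats.
    return [{**r, "amount": float(r["amount"])} for r in rows]

def load(rows):
    # Stand-in for a COPY into Redshift or a write to S3.
    return f"loaded {len(rows)} rows"

def run_pipeline():
    # Airflow would express this ordering as extract >> transform >> load
    # and add scheduling, retries, and backfills; here we just chain calls.
    return load(transform(extract()))

print(run_pipeline())  # prints "loaded 2 rows"
```

In real Airflow each function would become a task (e.g. via the TaskFlow API), with the scheduler responsible for ordering, retries, and monitoring.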

Senior Data Engineer

London, United Kingdom
Hybrid / WFH Options
Publicis Groupe
privacy, and security, ensuring our AI systems are developed and used responsibly and ethically. Tooling the Future: Get hands-on with cutting-edge technologies like Hugging Face, PyTorch, TensorFlow, Apache Spark, Apache Airflow, and other modern data and ML frameworks. Collaborate and Lead: Partner closely with ML Engineers, Data Scientists, and Researchers to understand their data needs … their data, compute, and storage services. Programming Prowess: Strong programming skills in Python and SQL are essential. Big Data Ecosystem Expertise: Hands-on experience with big data technologies like Apache Spark, Kafka, and data orchestration tools such as Apache Airflow or Prefect. ML Data Acumen: Solid understanding of data requirements for machine learning models, including feature engineering …
Employment Type: Permanent
Salary: GBP Annual

Data Engineer (AI-Driven platform! Python/Snowflake) Remote £70k

Manchester, Lancashire, England, United Kingdom
Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS). Large-scale data environment. Up to £70,000 plus benefits. FULLY REMOTE UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing environment … platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka, be able to develop and maintain ELT, and bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow, plus good AWS exposure from a cloud perspective. I'd love you to be an advocate of Agile too - these guys are massive …
Employment Type: Full-Time
Salary: £70,000 per annum

Data Engineer (AI-Driven platform! Python/Snowflake) Remote £70k

Manchester, United Kingdom
Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS). Large-scale data environment. Up to £70,000 plus benefits. FULLY REMOTE UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data … platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka, be able to develop and maintain ELT, and bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow, plus good AWS exposure from a cloud perspective. I'd love you to be an advocate of Agile too - these guys are massive …
Employment Type: Permanent
Salary: £70000/annum

Python Data Engineer - Weather Team

London, United Kingdom
P2P
with libraries such as Pandas, NumPy, and FastAPI. Experience with weather and climate datasets and tooling (e.g., Copernicus, Xarray, Zarr, NetCDF). Experience with ETL tools and frameworks (e.g., Apache Airflow, Apache NiFi, Talend). Strong understanding of relational databases and SQL. Experience with cloud platforms (e.g., AWS, GCP, Azure) and their data services. Familiarity with data …
Employment Type: Permanent
Salary: GBP Annual

Data Architect

City of London, London, United Kingdom
HCLTech
analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data … and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding …
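The Data Architect role above mentions developing custom Python operators and DAGs. The mechanism underneath any DAG scheduler is a topological sort of the task-dependency graph, which the standard library can demonstrate directly; the task names below are invented for illustration, not taken from the posting.

```python
# How a DAG scheduler derives a valid run order from declared task
# dependencies (the resolution Airflow performs when operators are
# wired with >>). Task names are illustrative assumptions.
from graphlib import TopologicalSorter  # Python 3.9+

# Each key lists its predecessors: extract must finish before both
# transforms, and the load waits on both transforms.
deps = {
    "transform_orders": {"extract"},
    "transform_users": {"extract"},
    "load_warehouse": {"transform_orders", "transform_users"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)
```

Any ordering the sorter emits respects the declared edges, which is exactly the guarantee a workflow orchestrator gives when it schedules tasks.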

Data Architect

London Area, United Kingdom
HCLTech
analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data … and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding …

Data Engineer

City, London, United Kingdom
Hybrid / WFH Options
JAM Recruitment Ltd
and applying best practices in security and compliance, this role offers both technical depth and impact. Key Responsibilities: Design & Optimise Pipelines - Build and refine ETL/ELT workflows using Apache Airflow for orchestration. Data Ingestion - Create reliable ingestion processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services. Cloud Data Platforms - Develop … DAGs and configurations. Security & Compliance - Apply encryption, access control (IAM), and GDPR-aligned data practices. Technical Skills & Experience: Proficient in Python and SQL for data processing. Solid experience with Apache Airflow - writing and configuring DAGs. Strong AWS skills (S3, Redshift, etc.). Big data experience with Apache Spark. Knowledge of data modelling, schema design, and partitioning. Understanding …
Employment Type: Contract
Rate: GBP 650 - 745 Daily
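The role above lists data modelling, schema design, and partitioning among its required skills. On S3 the most common concrete form of partitioning is Hive-style date keys, which let query engines prune whole prefixes; a minimal sketch follows, where the `events` prefix and key layout are assumptions for illustration rather than anything from the posting.

```python
# Sketch of Hive-style date partitioning for S3 object keys, e.g.
# events/year=2024/month=06/day=01/part-0.parquet
# so engines like Spark, Athena, or Redshift Spectrum can prune
# partitions by date predicate. Names are illustrative assumptions.
from datetime import date

def partition_key(prefix: str, day: date, part: int = 0) -> str:
    # Zero-padded month/day keep keys lexicographically sortable.
    return (f"{prefix}/year={day.year}/month={day.month:02d}/"
            f"day={day.day:02d}/part-{part}.parquet")

print(partition_key("events", date(2024, 6, 1)))
# events/year=2024/month=06/day=01/part-0.parquet
```

A daily Airflow run would typically write each execution date into its own partition, which also makes backfills idempotent.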

Data Engineer

South West London, London, United Kingdom
Hybrid / WFH Options
JAM Recruitment Ltd
and applying best practices in security and compliance, this role offers both technical depth and impact. Key Responsibilities: Design & Optimise Pipelines - Build and refine ETL/ELT workflows using Apache Airflow for orchestration. Data Ingestion - Create reliable ingestion processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services. Cloud Data Platforms - Develop … DAGs and configurations. Security & Compliance - Apply encryption, access control (IAM), and GDPR-aligned data practices. Technical Skills & Experience: Proficient in Python and SQL for data processing. Solid experience with Apache Airflow - writing and configuring DAGs. Strong AWS skills (S3, Redshift, etc.). Big data experience with Apache Spark. Knowledge of data modelling, schema design, and partitioning. Understanding …
Employment Type: Contract, Work From Home
Rate: £650 - £745 per day + Umbrella, inside IR35

Machine Learning Engineer

South East London, London, United Kingdom
permutable technologies limited
and evaluation through continuous monitoring and scaling. Build & Optimise AI models in Python: fine-tune state-of-the-art architectures on our in-house GPU cluster. Orchestrate Workflows with Apache Airflow: schedule, monitor, and maintain complex data and model pipelines. Engineer Cloud Services on AWS (Lambda, ECS/EKS, S3, Redshift, etc.) and automate deployments using GitHub Actions … testing, and monitoring. Startup mindset: proactive, resourceful, ambitious, driven to innovate, eager to learn, and comfortable wearing multiple hats in a fast-moving environment. Desirable: hands-on experience with Apache Airflow, AWS services (especially Redshift, S3, ECS/EKS), and IaC tools like Pulumi. Why Permutable AI? Hybrid Flexibility: Spend 2+ days/week in our Vauxhall hub. …
Employment Type: Permanent
Salary: £55,000

Customer Senior Sales Engineer - UK, London

London, United Kingdom
Hybrid / WFH Options
Astronomer Inc
Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates building reliable data products that unlock insights, unleash AI value, and powers data-driven applications. Trusted by more than 700 of the world's leading enterprises, Astronomer … our product's evolution through client feedback. This role is ideal for someone who wants to make a visible impact while growing into an expert in workflow orchestration and Apache Airflow. This is a hybrid role requiring a minimum of 3 days per week onsite, and includes up to 40% travel for business and customer needs. What you get … production. Be a Trusted Advisor: Conduct demos and provide technical guidance to engineering teams, showing them how our platform can transform their workflows. Drive Community Impact: Contribute to the Apache Airflow community by creating technical content and best practices, positioning Astronomer as a thought leader in workflow orchestration. Influence Product Direction: Act as a liaison by gathering field …
Employment Type: Permanent
Salary: GBP Annual

Hadoop Engineer - ODP Platform

United Kingdom
Hybrid / WFH Options
Experis
data pipelines within enterprise-grade on-prem systems. Key Responsibilities: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support … platform. Ensure compliance with enterprise security and data governance standards. Required Skills & Experience: Minimum 5 years of experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience …
Employment Type: Contract
Rate: GBP Annual
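The Hadoop roles above pair Airflow with Spark Streaming for real-time processing. Stripped of the framework, the core of such a job is windowed aggregation over timestamped events; the stand-alone toy below shows the idea in plain Python, with the event data and the 60-second window being illustrative assumptions rather than anything from the posting.

```python
# Toy version of the tumbling-window aggregation a Spark Streaming job
# performs: bucket timestamped events into fixed windows and count per
# key. Events and the 60-second window size are illustrative.
from collections import Counter

def window_counts(events, window_s=60):
    # events: iterable of (epoch_seconds, key) pairs.
    # Each event maps to the start of its window: (ts // w) * w.
    return Counter(((ts // window_s) * window_s, key) for ts, key in events)

events = [(3, "click"), (45, "click"), (61, "view"), (70, "click")]
print(window_counts(events))
```

In Spark Streaming the same grouping runs incrementally over micro-batches and the state is checkpointed, but the per-window arithmetic is identical.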

Hadoop Engineer - ODP Platform

England, United Kingdom
Hybrid / WFH Options
Experis - ManpowerGroup
data pipelines within enterprise-grade on-prem systems. Key Responsibilities: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support … platform. Ensure compliance with enterprise security and data governance standards. Required Skills & Experience: Minimum 5 years of experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience …
Employment Type: Permanent
Salary: GBP Annual

Hadoop Engineer - ODP Platform

West Midlands, United Kingdom
Hybrid / WFH Options
Experis
data pipelines within enterprise-grade on-prem systems. Key Responsibilities: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support … platform. Ensure compliance with enterprise security and data governance standards. Required Skills & Experience: Minimum 5 years of experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience …
Employment Type: Contract, Work From Home

Hadoop Developer

Birmingham, Staffordshire, United Kingdom
Square One Resources
within enterprise-grade on-prem systems. Job Responsibilities/Objectives: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimize workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support … security and data governance standards. Required Skills/Experience: The ideal candidate will have the following: Strong experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience …
Employment Type: Permanent
Salary: GBP Annual

DevOps Engineer

United Kingdom
Element Materials Technology
and Data Pipeline Integration: Collaborate with data scientists and engineers to support the operationalization of machine learning models. Possess a strong understanding of data pipeline orchestration tools such as Apache Airflow and their integration with cloud infrastructure. Assist in the deployment and monitoring of self-hosted LLMs. Cross-Functional Collaboration: Work closely with stakeholders, including data scientists, software … container orchestration in a production environment. Experience with building and managing CI/CD pipelines (e.g., Jenkins, GitHub Actions, Azure DevOps). Familiarity with MLOps practices and tools including Apache Airflow is a significant plus. Technical Skills: Proficiency in programming languages such as Bash and Python. Strong understanding of version control systems (e.g., Git). Solid knowledge of …

Senior Software Engineer, Infrastructure

London, United Kingdom
Hybrid / WFH Options
Intercom
doing? Evolve the Data Platform by designing and building the next generation of the stack. Develop, run and support our batch and real-time data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, Tableau, all in AWS. Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support their needs. Develop automation and … quality issues. Recent projects the team has delivered: refactoring of our MySQL ingestion pipeline for reduced latency and 10x scalability; Redshift -> Snowflake migration; a unified local analytics development environment for Airflow and dbt; building our next-generation company metrics framework, adding anomaly detection and alerting, and enabling easier discovery and consumption. About you: You have 5+ years of full-time … might be more valuable than your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would …
Employment Type: Permanent
Salary: GBP Annual

Python Developer Software Engineer AWS Finance Trading London

London, United Kingdom
Hybrid / WFH Options
Joseph Harry Ltd
Front End ability (Vue, React or Angular good but not necessary); Agile. The following is DESIRABLE, not essential: AWS or GCP; buy-side; data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income JavaScript Node Fixed Income Credit … the team to be in the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are …
Employment Type: Permanent
Salary: GBP Annual

Data Engineer

London, United Kingdom
Sandtech
Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying … and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying). Apache Spark (for distributed data processing). Apache Spark Streaming, Kafka or similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure or GCP …
Employment Type: Permanent
Salary: GBP Annual

Senior Data Engineer

London, United Kingdom
Sandtech
data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for … deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying). Apache Spark (for distributed data processing). Apache Spark Streaming, Kafka or similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure or …
Employment Type: Permanent
Salary: GBP Annual

Software Engineer, Python

London, United Kingdom
Hybrid / WFH Options
Cedar Cares, Inc
or other testing methodologies. Preferred: Familiarity with PostgreSQL and Snowflake. Preferred: Familiarity with web frameworks such as Django, Flask or FastAPI. Preferred: Familiarity with event streaming platforms such as Apache Kafka. Preferred: Familiarity with data pipeline platforms such as Apache Airflow. Preferred: Familiarity with Java. Preferred: Experience in one or more relevant financial areas (market data, order …
Employment Type: Permanent
Salary: GBP Annual

Software Engineer - Data Ingestion and Pipelines (London)

London, United Kingdom
Intelmatix
Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Familiarity with data orchestration tools (e.g. Prefect, Apache Airflow). Familiarity with CI/CD pipelines and DevOps practices. Familiarity with Infrastructure-as-code tools (e.g. Terraform, AWS CDK). Employee Benefits: At Intelmatix, our benefits …
Employment Type: Permanent
Salary: GBP Annual

Software Engineer - AI / ML / Python (Lead Level)

London, United Kingdom
Hybrid / WFH Options
N Consulting Limited
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
Employment Type: Permanent
Salary: GBP Annual

Data Governance Engineer

London, United Kingdom
Hybrid / WFH Options
Sumsub
business glossary, and data mapping framework using metadata management and data catalog tools. Automate data classification, lineage tracking, and policy enforcement through scripts, APIs, and orchestration tools (e.g., dbt, Airflow). Map & Visualize Data Flows: Design and maintain clear documentation and visualizations of data movement across systems, focusing on sensitive and business-critical data. Drive Cross-Functional Alignment: Collaborate … governance SME; support teams with tooling, guidance, and best practices. About You: Strong technical foundation in data governance architecture and tooling. You've worked with tools such as DataHub, Apache Airflow, AWS, dbt, Snowflake, BigQuery, or similar. Hands-on experience building and maintaining centralized data inventories, business glossaries, and data mapping frameworks. Proficient in automating data classification and … lineage using scripting languages like Python, SQL, or Java, along with orchestration tools such as Airflow and dbt. 5+ years of experience in data governance, privacy, or data engineering roles, especially in settings that integrate governance tightly into data platform design. Familiarity with privacy-by-design, data minimization, and regulatory standards including GDPR, ISO 27001, SOC 2, and …
Employment Type: Permanent
Salary: GBP Annual
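The governance role above asks for automated data classification driven by scripts rather than manual review. A common starting point is rule-based tagging of column names; the sketch below is a deliberately simple toy, and the patterns and labels are assumptions for illustration, not a real classification policy.

```python
# Toy rule-based column classifier of the kind used to bootstrap
# automated data classification in a catalog. Patterns and labels
# are illustrative assumptions, not a production PII policy.
import re

PII_PATTERNS = {
    "email": re.compile(r"e[-_]?mail", re.I),
    "phone": re.compile(r"phone|msisdn", re.I),
    "name": re.compile(r"(first|last|full)[-_]?name", re.I),
}

def classify(columns):
    # Tag each column with the first matching PII label, else non_pii.
    tags = {}
    for col in columns:
        for label, pattern in PII_PATTERNS.items():
            if pattern.search(col):
                tags[col] = label
                break
        else:
            tags[col] = "non_pii"
    return tags

print(classify(["user_email", "first_name", "order_total"]))
```

In practice such name-based rules are only a first pass; catalog tools typically combine them with sampled-value checks and manual stewardship before enforcing policy.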

Senior Software Engineer, Enterprise Engineering

Manchester, Lancashire, United Kingdom
Roku, Inc
entrepreneurial spirit. Excellent verbal and written communication skills. BS or MS degree in Computer Science or equivalent. Nice to Have: Experience in distributed computing frameworks like Hive/Hadoop, Apache Spark. Experience in developing Finance or HR related applications. Experience with the following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda. Working experience with … Terraform. Experience in creating workflows for Apache Airflow. Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare …
Employment Type: Permanent
Salary: GBP Annual
Apache Airflow - UK salary percentiles
10th Percentile: £66,250
25th Percentile: £92,500
Median: £110,000
75th Percentile: £135,000
90th Percentile: £137,500