Apache Airflow Jobs in England

126 to 150 of 845 Apache Airflow Jobs in England

Senior DevOps Engineer (PA2025Q1JB012)

Basildon, Essex, United Kingdom
SS&C
…as the ability to learn quickly and apply new skills. Desirable: solid understanding of microservices development; working knowledge of SQL and NoSQL databases; familiar with, or able to quickly learn, Apache NiFi, Apache Airflow, Apache Kafka, KeyCloak, serverless computing, GraphQL, APIs, APIM; good skills working with JSON, XML and YAML files; knowledge of Python, Java, awk, sed, Ansible. …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer

London, United Kingdom
Hybrid / WFH Options
Tenzo Limited
…professional to make a significant impact at Tenzo. This role is pivotal in shaping how our product integrates and interacts with external systems, partners, and platforms. Our tech stack: Apache Airflow, Python, Django, AWS (S3, RDS with PostgreSQL, ElastiCache, MSK, EC2, ECS, Fargate, Lambda, etc.), Snowflake, Terraform, CircleCI. Your mission: design and develop data pipelines, orchestrating key activities such … in SQL and experience with relational databases such as PostgreSQL, including database administration, tuning, and optimisation (highly desirable). Experience with data pipeline and workflow management tools such as Apache Airflow (nice to have). Proficiency in Git (important). Ability and eagerness to write high-quality code, technical documentation, architecture diagrams, and production plans (important). Strong … (A minimal Airflow pipeline sketch follows this listing.)
Employment Type: Permanent
Salary: GBP Annual
Posted:
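
The Tenzo listing above asks for designing and developing data pipelines orchestrated with Apache Airflow on an AWS/Snowflake stack. A minimal sketch of that shape, assuming Airflow 2.4+ and its TaskFlow API; the DAG name, task names and data below are illustrative, not taken from the listing.

```python
# Toy daily extract -> transform -> load pipeline (hypothetical names throughout).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def partner_orders_pipeline():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling records from a partner API or an S3 object.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Simple cleaning/enrichment step done in plain Python.
        return [{**r, "amount_gbp": round(r["amount"], 2)} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for a warehouse load (e.g. via a Snowflake or Postgres hook).
        print(f"would load {len(rows)} rows")

    load(transform(extract()))


partner_orders_pipeline()
```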

Senior Data Infrastructure Engineer

London, England, United Kingdom
Hybrid / WFH Options
Intercom
…doing? Evolve the Data Platform by designing and building the next generation of the stack. Develop, run and support our batch and real-time data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, Tableau, all in AWS. Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support their needs. Develop automation and … quality issues. Recent projects the team has delivered: refactoring of our MySQL ingestion pipeline for reduced latency and 10x scalability; Redshift-to-Snowflake migration; a unified local analytics development environment for Airflow and dbt; building our next-generation company metrics framework, adding anomaly detection and alerting, and enabling easier discovery and consumption. About you: you have 5+ years of full-time … might be more valuable than your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would … (A sketch of a common Airflow-plus-dbt pattern follows this listing.)
Posted:
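
The Intercom listing above mentions a unified local development environment for Airflow and dbt. A common way to pair the two (a generic pattern, not necessarily Intercom's setup) is to run dbt from Airflow shell tasks; the project and profiles paths below are placeholders, and Airflow 2.4+ is assumed.

```python
# A DAG that builds then tests a dbt project on the worker (paths are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    # Only test the models after they have been rebuilt.
    dbt_run >> dbt_test
```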

Data Engineer

London, England, United Kingdom
Lumilinks Group Ltd
Please speak to us if you have … the following professional aspirations. Skill Enhancement: aspires to deepen technical expertise in data engineering practices, including mastering tools and technologies like Apache Spark, Kafka, cloud platforms (AWS, Azure, Google Cloud), and data warehousing solutions. Career Progression: aims to advance to a senior data engineer or data architect role, with long-term … Redshift, Google BigQuery, Snowflake, or Azure Synapse Analytics, including data modelling and ETL processes. ETL Processes: proficient in designing and implementing ETL (Extract, Transform, Load) processes using tools like Apache NiFi, Talend, or custom scripts; familiarity with ELT (Extract, Load, Transform) processes is a plus. Big Data Technologies: familiarity with big data frameworks such as Apache Hadoop and … Apache Spark, including experience with distributed computing and data processing. Cloud Platforms: proficient in using cloud platforms (e.g., AWS, Google Cloud Platform, Microsoft Azure) for data storage, processing, and deployment of data solutions. Data Pipeline Orchestration: experience with workflow orchestration tools such as Apache Airflow or Prefect to manage and schedule data pipelines. Data Modelling: strong understanding …
Posted:

Data Engineer

London Area, United Kingdom
CACTUS
…Spark (PySpark). Experience with Azure Databricks, Delta Lake, and data architecture. Familiarity with Azure cloud, version control (e.g., Git), and DevOps pipelines. Experience with tools like Apache Airflow, dbt, and Power BI. Working knowledge of NoSQL databases, API integration, and economic/financial data. Azure certifications (e.g., Azure Data Fundamentals) are highly desirable. …
Posted:

Data Engineer

City of London, London, United Kingdom
CACTUS
…Spark (PySpark). Experience with Azure Databricks, Delta Lake, and data architecture. Familiarity with Azure cloud, version control (e.g., Git), and DevOps pipelines. Experience with tools like Apache Airflow, dbt, and Power BI. Working knowledge of NoSQL databases, API integration, and economic/financial data. Azure certifications (e.g., Azure Data Fundamentals) are highly desirable. …
Posted:

Data Engineer

South East London, England, United Kingdom
CACTUS
…Spark (PySpark). Experience with Azure Databricks, Delta Lake, and data architecture. Familiarity with Azure cloud, version control (e.g., Git), and DevOps pipelines. Experience with tools like Apache Airflow, dbt, and Power BI. Working knowledge of NoSQL databases, API integration, and economic/financial data. Azure certifications (e.g., Azure Data Fundamentals) are highly desirable. …
Posted:

Lead Engineer, Data Platform

London, England, United Kingdom
Hybrid / WFH Options
Scope3
…for backend applications with Prisma ORM & PostgreSQL; REST and GraphQL APIs; React with Next.js for frontend applications; a low-latency, high-throughput Golang API; BigQuery data warehouse; Airflow for batch orchestration; Temporal for event orchestration; Apache Beam (Dataflow runner) for some batch jobs. Most transformations are performed via SQL directly in BigQuery. The Role: we … environments across distributed teams; experience with Google Cloud Platform and/or Amazon Web Services; expertise in Python and SQL; BigQuery or equivalent data warehouse experience (Redshift, Snowflake, etc.); Airflow or equivalent in-house data platform experience (Prefect, Dagster, etc.); experience with ClickHouse; demonstrated experience perpetuating an inclusive and collaborative working environment. Preference may be given to candidates with … (A sketch of SQL-in-BigQuery orchestration from Airflow follows this listing.)
Posted:
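
The Scope3 listing above notes that most transformations run as SQL directly in BigQuery, with Airflow handling batch orchestration. A minimal sketch of that pattern using the Google provider's BigQueryInsertJobOperator; the dataset, table and DAG names are hypothetical, and Airflow 2.4+ with the Google provider installed is assumed.

```python
# Nightly SQL rollup executed inside BigQuery, scheduled by Airflow.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_metrics_rollup",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_rollup AS
                    SELECT account_id, DATE(event_ts) AS day, SUM(value) AS total
                    FROM raw.events
                    GROUP BY account_id, day
                """,
                "useLegacySql": False,
            }
        },
    )
```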

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Solirius Reply
Design and build data models and schemas to support business requirements. Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc. Implement data storage solutions using different types of databases, such as relational, non-relational, or cloud-based. Working collaboratively with the client and cross-functional teams to … relational databases (e.g. MS SQL/Azure SQL, PostgreSQL). You have framework experience within either Flask, Tornado or Django, plus Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality. Preparing data for …
Posted:

Senior Data Engineer

London, England, United Kingdom
Spotify
…the next generation of personalized generative voice products at scale. What you'll do: build large-scale speech and audio data pipelines using frameworks like Google Cloud Platform and Apache Beam; work on machine learning projects powering new generative AI experiences and helping to build state-of-the-art text-to-speech models; learn and contribute to the team … Storm, Spark, Flink, etc. You have strong Python programming abilities. Experience using pre-trained ML models is a plus. You might have worked with Docker as well as Luigi, Airflow, or similar tools. You care about quality and know what it means to ship high-quality code. You have experience managing data retention policies. You care about agile software … (A minimal Apache Beam sketch follows this listing.)
Posted:
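
The Spotify listing above centres on large-scale audio data pipelines built with Apache Beam on Google Cloud. A minimal Beam sketch of that shape, run locally on the default DirectRunner with toy records; the field names and output path are made up.

```python
# Tiny Beam pipeline: create records, filter, format, write to text files.
import apache_beam as beam


def run() -> None:
    with beam.Pipeline() as p:  # DirectRunner by default; Dataflow in production
        (
            p
            | "CreateRecords" >> beam.Create([
                {"clip_id": "a", "duration_s": 3.2},
                {"clip_id": "b", "duration_s": 11.7},
            ])
            | "KeepShortClips" >> beam.Filter(lambda r: r["duration_s"] < 10)
            | "ToCsvLine" >> beam.Map(lambda r: f'{r["clip_id"]},{r["duration_s"]}')
            | "Write" >> beam.io.WriteToText("/tmp/short_clips", file_name_suffix=".csv")
        )


if __name__ == "__main__":
    run()
```

Running the same code on Google Cloud is mostly a matter of passing `--runner=DataflowRunner` plus project/region pipeline options instead of relying on the local runner.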

Senior Data Engineer (Databricks) - UK

London, England, United Kingdom
Hybrid / WFH Options
Datapao
…work for the biggest multinational companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. Additionally, at this seniority level … proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); you have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); you’re proficient in SQL and Python, using them to transform and optimize data like a pro; you know your way around CI/… no shortage of learning opportunities at DATAPAO, meaning that you'll get access to Databricks' public and internal courses to learn all the tricks of Distributed Data Processing, MLOps, Apache Spark, Databricks, and Cloud Migration from the best. Additionally, we'll pay for various data & cloud certifications, you'll get dedicated time for learning during work hours, and access … (A minimal PySpark/Delta Lake sketch follows this listing.)
Posted:
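
The DATAPAO listing above revolves around Databricks: PySpark transformations landing in Delta Lake tables. A minimal bronze-to-silver sketch of that pattern; on Databricks the SparkSession and Delta support come preconfigured, and the lake paths and column names here are invented.

```python
# Read a bronze Delta table, clean it, and overwrite the silver table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_silver").getOrCreate()

bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

silver = (
    bronze
    .filter(F.col("status").isNotNull())          # drop incomplete records
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])                 # keep one row per order
)

# Full overwrite for simplicity; an incremental pipeline would use MERGE instead.
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/orders")
```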

Data Engineer - Leading Fashion Company - London

London, England, United Kingdom
Hybrid / WFH Options
Noir
Responsibilities: design, build, and maintain robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. …
Posted:

Data Engineer

London, England, United Kingdom
Sporty Group
…relational and NoSQL databases. Experience with data modelling. General understanding of data architectures and event-driven architectures. Proficient in SQL. Familiarity with one scripting language, preferably Python. Experience with Apache Airflow & Apache Spark. Solid understanding of cloud data services: AWS services such as S3, Athena, EC2, Redshift, EMR (Elastic MapReduce), EKS, RDS (Relational Database Service) and Lambda. …
Posted:

Software Engineer - Python

London, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Familiarity with legacy systems (e.g. C#) and willingness to interact with them where necessary. Exposure to Cloudera Data Platform or similar big data environments. Experience with tools such as Apache Hive, NiFi, Airflow, Azure Blob Storage, and RabbitMQ. Background in investment management or broader financial services, or a strong willingness to learn the domain. The Role Offers: the …
Posted:

Senior Data Engineer

London, United Kingdom
Randstad (Schweiz) AG
…containerization and CI/CD tools (e.g., Docker, GitHub Actions). Knowledge of networking and cloud infrastructure (e.g., AWS, Azure). Experience with modern data processing frameworks (e.g., dbt, Apache Airflow, Spark, or similar). Requirements: a strong focus on system observability and data quality; emphasis on rapid scalability of solutions (consider market ramp-up when entering a …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Artefact
…leading data projects in a fast-paced environment. Key responsibilities: design, build, and maintain scalable and robust data pipelines using SQL, Python, Databricks, Snowflake, Azure Data Factory, AWS Glue, Apache Airflow and PySpark. Lead the integration of complex data systems and ensure consistency and accuracy of data across multiple platforms. Implement continuous integration and continuous deployment (CI/…
Posted:

Software Engineer - Python - Data

Slough, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Familiarity with legacy systems (e.g. C#) and willingness to interact with them where necessary. Exposure to Cloudera Data Platform or similar big data environments. Experience with tools such as Apache Hive, NiFi, Airflow, Azure Blob Storage, and RabbitMQ. Background in investment management or broader financial services, or a strong willingness to learn the domain. The Role Offers: the …
Posted:

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Noir
Responsibilities: design, build, and maintain robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data …
Posted:

Data engineer | NL/UK/CZ

London, England, United Kingdom
Hybrid / WFH Options
ScanmarQED
…Foundations: Data Warehousing: knowledge of tools like Snowflake, Databricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud Providers: proficiency in Microsoft Azure or AWS. Programming and Scripting: Programming Languages: strong skills in Python and SQL. Data Modeling and Query Optimization: data … (A minimal Dagster sketch follows this listing.)
Posted:
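
The ScanmarQED listing above names Airflow, dbt and Dagster as pipeline-building options. For contrast with the Airflow sketches earlier, here is a minimal Dagster version: two software-defined assets, where the downstream asset's dependency is inferred from its parameter name. The asset names and filtering logic are illustrative.

```python
# Two toy Dagster assets: an extract and a cleaned, downstream view of it.
from dagster import Definitions, asset


@asset
def raw_sales() -> list[dict]:
    # Stand-in for an extract from a source system or object store.
    return [{"sku": "A1", "qty": 3}, {"sku": "A1", "qty": -1}]


@asset
def cleaned_sales(raw_sales: list[dict]) -> list[dict]:
    # Dagster wires the dependency because the parameter is named after the upstream asset.
    return [row for row in raw_sales if row["qty"] > 0]


defs = Definitions(assets=[raw_sales, cleaned_sales])
```

Pointing `dagster dev` at this module would show both assets in the local UI, ready to materialise on demand or on a schedule.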

Senior Data Engineer (Databricks)

London, England, United Kingdom
Hybrid / WFH Options
DATAPAO
…industries) on some of our most complex projects - individually or by leading small delivery teams. Our projects are fast-paced, typically 2 to 4 months long, and primarily use Apache Spark/Databricks on AWS/Azure. You will manage customer relationships either alone or with a Project Manager, and support our pre-sales, mentoring, and hiring efforts. What … on cloud platforms (AWS, Azure, GCP); Proven experience with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); Extensive ETL/ELT and data pipeline orchestration experience (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions); Proficiency in SQL and Python for data transformation and optimization; Knowledge of CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation, Bicep); Hands-on …
Posted:

Lead Data Engineer

London, United Kingdom
Hybrid / WFH Options
QiH Group
…skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not …
Employment Type: Permanent
Salary: GBP Annual
Posted:

L3 Data Support Engineer

London, United Kingdom
Tcr International
…best practices. Automation & Monitoring: implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions … (A minimal Airflow failure-alerting sketch follows this listing.)
Employment Type: Permanent
Salary: GBP Annual
Posted:
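
The TCR listing above calls for automated monitoring of pipeline failures using Python, Bash and Airflow. One simple Airflow-side building block (a generic sketch, not TCR's actual setup) is a failure callback attached through default_args; notify_on_failure below only logs, but it could post to Slack or PagerDuty instead. The DAG id, script path and names are hypothetical, and Airflow 2.4+ is assumed.

```python
# DAG whose tasks report failures through a shared on_failure_callback.
import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator


def notify_on_failure(context: dict) -> None:
    # Airflow passes the task context; pull out enough detail to raise an alert.
    ti = context["task_instance"]
    logging.error("Task %s in DAG %s failed for run %s",
                  ti.task_id, ti.dag_id, context["logical_date"])


with DAG(
    dag_id="nightly_feed_check",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"on_failure_callback": notify_on_failure, "retries": 1},
) as dag:
    validate_feed = BashOperator(
        task_id="validate_feed",
        bash_command="python /opt/scripts/validate_feed.py",  # placeholder script
    )
```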

L3 Data Support Engineer

Manchester, Lancashire, United Kingdom
Tcr International
…best practices. Automation & Monitoring: implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer (Healthcare Data)

London, United Kingdom
Pangaea Data Limited
…on healthcare data preferred. Familiarity with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years of Python and SQL work. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understanding of data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7, FHIR, ICD, SNOMED, DICOM) preferred. …
Employment Type: Permanent
Salary: GBP Annual
Posted:

L3 Data Support Engineer

Manchester, England, United Kingdom
Tcr International
…best practices. Automation & Monitoring: implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions …
Posted:
Apache Airflow salary statistics (England):
10th Percentile: £69,750
25th Percentile: £97,500
Median: £110,000
75th Percentile: £137,500
90th Percentile: £138,750