maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack DBT Expertise Terraform Experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pension Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing …
of this team, you will be working on a plethora of services such as Glue (ETL service), Athena (interactive query service), Managed Workflows for Apache Airflow, etc. Understanding of ETL (Extract, Transform, Load) Creation of ETL Pipelines to extract and ingest data into data lake/warehouse with …
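For illustration, a minimal sketch of the kind of ETL pipeline such a role involves, written with Airflow's TaskFlow API (assuming Airflow 2.x); the tasks, data, and warehouse target are hypothetical placeholders, not anything named in the listing:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # In practice this might pull from an API or an S3 landing zone.
        return [{"id": 1, "amount": "42.50"}, {"id": 2, "amount": "7.00"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Cast string amounts to floats before loading.
        return [{**r, "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline might write Parquet to S3 and register it in Glue.
        print(f"Loading {len(rows)} rows into the warehouse")

    load(transform(extract()))


example_etl()
```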
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools. Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows. Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
Have touched cloud services (AWS, GCP, etc.) Enjoy solving problems, learning fast, and working with good people Bonus points if you’ve played with Airflow, Docker, or big data tools — but they’re more interested in mindset than buzzwords. The team’s based onsite in London — they’re collaborative …
Python, R, and Java. Experience scaling machine learning on data and compute grids. Proficiency with Kubernetes, Docker, Linux, and cloud computing. Experience with Dask, Airflow, and MLflow. MLOps, CI, Git, and Agile processes. Why you will not want to miss this career opportunity: We are a mission-driven firm …
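As a small illustration of the MLflow experience mentioned above, a sketch of basic experiment tracking; the experiment name, parameter, and metric are illustrative only:

```python
import mlflow

# Group runs under a named experiment (created if it does not exist).
mlflow.set_experiment("demo-experiment")

with mlflow.start_run():
    mlflow.log_param("n_estimators", 100)  # a hyperparameter
    mlflow.log_metric("rmse", 0.42)        # an evaluation result
```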
London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Experience supporting and working with cross-functional teams in a dynamic …
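By way of example, a minimal Pandas transformation and cleaning step of the kind this listing describes; the column names and values are hypothetical:

```python
import pandas as pd

raw = pd.DataFrame(
    {"order_id": [1, 2, 2, 3], "amount": ["10.0", "5.5", "5.5", None]}
)

clean = (
    raw.drop_duplicates(subset="order_id")                  # de-duplicate on the key
       .dropna(subset=["amount"])                           # drop rows missing an amount
       .assign(amount=lambda d: d["amount"].astype(float))  # fix the dtype
)
print(clean)
```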
West Midlands, England, United Kingdom Hybrid / WFH Options
Aubay UK
migration projects, particularly large-scale migrations to distributed database platforms. Hands-on experience with big data processing technologies, including Spark (PySpark and Spark Scala) and Apache Airflow. Expertise in distributed databases and computing environments. Familiarity with Enterprise Architecture methodologies, ideally TOGAF. Strong leadership experience, including managing technology teams and delivering …
for data processing, analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes …
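For context, a hedged sketch of a small PySpark batch step that aggregates events and writes partitioned Parquet, in the spirit of the stack named above; the S3 paths and schema are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-elt-sketch").getOrCreate()

# Hypothetical raw zone; a real job would also configure S3 credentials.
events = spark.read.json("s3://example-bucket/raw/events/")

daily = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .groupBy("event_date")
          .count()
)

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_counts/"
)
```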
or Java, comfortable reviewing code, guiding design decisions, and supporting technical excellence. Big Data Expertise: In-depth experience with tools like Kafka, Flink, dbt, Airflow, and Airbyte, and a solid understanding of building and operating modern data ecosystems. Cloud Experience: Hands-on experience with AWS, GCP, or both, comfortable …
develop and deploy Feature Engineering and Modeling applications to data platforms built on Databricks or similar platforms and platform components (e.g., Snowflake, MLflow, Airflow, etc.). Demonstrated experience in using Azure-based cloud applications, services and infrastructure or significant, transferable experience with other Cloud Providers (e.g., AWS or …
Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Nice to Have Experience: Understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data Mesh …
technical leadership role on projects and experience with transitioning projects into a support program. Experience with Google Cloud Platform (GCP) services, including Cloud Composer (Apache Airflow) for workflow orchestration. Strong experience in Python with demonstrable experience in developing and maintaining data pipelines and automating data workflows. Proficiency in … e.g., Git). Strong expertise in Python, with a particular focus on libraries and tools commonly used in data engineering, such as Pandas, NumPy, Apache Airflow. Experience with data pipelines, ELT/ETL processes, and data wrangling. Dashboard analytics (PowerBI, Looker Studio or Tableau) experience. Excellent English, written and …
is required for the role. Equally, proficiency in Python and SQL is essential, ideally with experience using data processing frameworks such as Kafka, NoSQL, Airflow, TensorFlow, or Spark. Finally, experience with cloud platforms like AWS or Azure, including data services such as Apache Airflow, Athena, or SageMaker …
and storage. Strong programming skills in Python, Java, or Scala. Proficiency in SQL, NoSQL, and time-series databases. Knowledge of orchestration tools (Apache Airflow, Kubernetes). If you are a passionate and experienced Senior Data Engineer seeking a Lead role, or a Lead Data Engineer aiming …
Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to …
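As an illustration of the Athena piece of that stack, a minimal sketch of submitting a query via boto3; the database, query, and result bucket are hypothetical:

```python
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

response = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) FROM events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics_db"},  # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
print(response["QueryExecutionId"])  # poll this ID for completion in a real job
```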
London, South East England, United Kingdom Hybrid / WFH Options
Noir
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code …
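For illustration, a minimal sketch of running a BigQuery query from Python with the google-cloud-bigquery client, matching the GCP stack above; the project and table names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row["event_date"], row["events"])
```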
data flow, storage, and processing • Good logical thinking and attention to detail ⸻ 🌟 Nice-to-Have (But Not Required): • Experience with data pipeline tools like Apache Airflow, DBT, or Kafka • Knowledge of cloud data services (AWS S3/Glue/Redshift, GCP BigQuery, Azure Data Factory) • Exposure to Spark …