to talk to you if: You've led technical delivery of data engineering projects in a consultancy or client-facing environment You're experienced with Python, SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure) You have strong knowledge of data architecture patterns - including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks) You know …
Experience with Python or other scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process if someone's circumstances or timescales require it, but our general …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V.
architectures, data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to …
record migrating large-scale systems (e.g., BigQuery, Redshift) Infrastructure as Code - Experience with tools like Terraform Data Engineering: ELT pipeline mastery - Experience with tools like Fivetran, Dataform, dbt, and Airflow for building reliable data workflows Custom integrations - Strong Python skills for building data ingestion from third-party APIs, and developing cloud functions Data governance - Experience implementing RBAC, data masking …
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Jaguar Land Rover
data extraction, transformation, analysis, and process automation Hands-on experience with Google Cloud Platform (GCP) or Amazon Web Services (AWS) Proficient in Data Engineering and Orchestration tools such as Apache Airflow, Glue or Dataform Skilled in creating impactful data visualisations using Tableau, Power BI or Python Background in engineering sectors such as automotive, aerospace, or transport BENEFITS This …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
secure use of machine learning. Key Focus Areas Own and execute enterprise data strategy Build and lead a multi-disciplinary data & AI team Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI) Deliver business-critical analytics and reporting Support responsible AI/ML initiatives Define data governance, privacy, and compliance frameworks What We're Looking For …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
South West London, London, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
Manchester, North West, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
London, South East, England, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
quality data assets Strong architectural acumen and software engineering fundamentals Experience driving adoption of data governance and improving data platform usage across internal teams stack including: Snowflake, AWS, dbt, Airflow, Python, Kinesis, Terraform, CI/CD tools BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000 Hybrid working: 2 days per week …
transformation. Deep understanding of cloud-based data architecture, particularly with GCP (BigQuery, Cloud Functions, Pub/Sub, etc.) or AWS equivalents. Hands-on experience with orchestration tools such as Airflow or dbt. 3+ years in data engineering, preferably including at least one role supporting a live or F2P game. Experience with analytics and marketing APIs (e.g. Appsflyer, Applovin, IronSource …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
scalable, fault-tolerant ETL pipelines with minimal manual intervention. Knowledge of data modelling best practices, including the medallion architecture or comparable frameworks. Experience in workflow orchestration using Flyte, dbt, Airflow, or Prefect. Strong understanding of unit, integration, and data validation testing using tools like Pytest or Great Expectations. Familiarity with cloud infrastructure (preferably Azure) for managing pipelines and storage …
leading and managing technical teams, with excellent people development skills. Strong project management skills, with experience running complex data initiatives. Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/dbt, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience …
in the face of many nuanced trade-offs and varied opinions. Experience in a range of toolsets comparable with our own: Database technologies: SQL, Redshift, Postgres, dbt, Dask, Airflow, etc. AI Feature Development: LangChain, LangSmith, pandas, numpy, scikit-learn, scipy, Hugging Face, etc. Data visualization tools such as plotly, seaborn, streamlit, etc. You are able to chart …
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Starling Bank Limited
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the …
have the chance to work with a talented and engaged team on an innovative product that connects with external systems, partners, and platforms across the industry. Our Tech Stack: Apache Airflow Python Django React/TypeScript AWS (S3, RDS with PostgreSQL, ElastiCache, MSK, EC2, ECS, Fargate, Lambda etc.) Snowflake Terraform CircleCI Bitbucket Your mission Lead and scale multiple engineering …
Strong SQL and Python skills for building and optimising data pipelines Experience working with cloud platforms (e.g., AWS, GCP, or Azure) Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery) Understanding of data modelling and warehousing principles Experience working with large datasets and distributed systems What's in it for you? Up to £70k Hybrid …
streaming data solutions Proficiency in Python, SQL, and data modelling tools (e.g. Erwin, Lucidchart) Strong knowledge of AWS services (Lambda, SNS, S3, EKS, API Gateway) Familiarity with Snowflake, Spark, Airflow, dbt, and data governance frameworks Preferred: Certifications in cloud/data technologies Experience with API/interface modelling and CI/CD (e.g. GitHub Actions) Knowledge of Atlan and …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow) Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices in …