Git), experiment tracking (e.g. MLflow), and containerisation (e.g. Docker). Familiarity with CI/CD tools (e.g. GitHub Actions), model/data versioning (e.g. DVC), and orchestration frameworks (e.g. Airflow, Dagster). Skilled in testing (unit, integration, end-to-end) and visualising outcomes with tools like seaborn. Ability to translate complex business problems into data science solutions and effectively …
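As a rough, non-authoritative sketch of the experiment-tracking side of this skill set, the snippet below logs a run with MLflow; the experiment name, parameters, and metric value are hypothetical placeholders.

```python
# Minimal MLflow tracking sketch; experiment name, params and metric are placeholders.
import mlflow

mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

with mlflow.start_run(run_name="baseline"):
    # Record the configuration used for this run.
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_param("regularisation_C", 1.0)

    # In a real workflow this value would come from model evaluation.
    mlflow.log_metric("validation_auc", 0.87)
```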
to talk to you if: You've led technical delivery of data engineering projects in a consultancy or client-facing environment You're experienced with Python, SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure) You have strong knowledge of data architecture patterns - including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks) You know …
Experience with Python or other scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process if someone's circumstances or timescales require it but our general …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
architectures, data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to …
record migrating large-scale systems (e.g., BigQuery Redshift) Infrastructure as Code - Experience with tools like Terraform Data Engineering: ELT pipeline mastery - Experience with tools like Fivetran, Dataform, dbt, and Airflow for building reliable data workflows Custom integrations - Strong Python skills for building data ingestion from third-party APIs, and developing cloud functions Data governance - Experience implementing RBAC, data masking …
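The "custom integrations" requirement above typically amounts to a small extract-and-land script; here is a hedged sketch using requests and boto3, where the API endpoint, bucket, and object key are hypothetical.

```python
# Sketch of ingesting data from a third-party API into S3.
# The endpoint, bucket, and key below are hypothetical placeholders.
import json

import boto3
import requests

API_URL = "https://api.example.com/v1/events"  # hypothetical endpoint
BUCKET = "raw-landing-zone"                    # hypothetical bucket

def ingest(run_date: str) -> str:
    """Fetch one day of events and land the raw JSON in S3."""
    resp = requests.get(API_URL, params={"date": run_date}, timeout=30)
    resp.raise_for_status()

    key = f"events/{run_date}.json"
    boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=json.dumps(resp.json()))
    return key

if __name__ == "__main__":
    ingest("2024-01-01")
```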
under pressure. Skills: Programming Languages: Strong proficiency in Python and PySpark. Database Management: Expertise in SQL for data manipulation and querying. Data Orchestration: Experience with orchestration tools such as Apache Airflow or Dagster. Containerization: Familiarity with containerization technologies, specifically Kubernetes and Docker. Data Pipelines: Proven experience in designing and implementing data pipelines, working with big data technologies and …
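Several of these listings ask for orchestration experience with Apache Airflow or Dagster; as one hedged illustration, a minimal Airflow 2.x TaskFlow DAG is sketched below, with a hypothetical DAG id, schedule, and task logic.

```python
# Minimal Apache Airflow DAG sketch (TaskFlow API, Airflow 2.x).
# DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_events_pipeline():
    @task
    def extract() -> list:
        # In practice this would call an API or read from object storage.
        return [{"id": 1, "value": 42}]

    @task
    def load(rows: list) -> None:
        # In practice this would write to a warehouse table.
        print(f"loaded {len(rows)} rows")

    load(extract())

daily_events_pipeline()
```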
and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving skills and attention to detail Inclusive and curious, continuously seeks to build knowledge …
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Jaguar & Land Rover
data extraction, transformation, analysis, and process automation Hands-on experience with Google Cloud Platform (GCP) or Amazon Web Services (AWS) Proficient in Data Engineering and Orchestration tools such as Apache Airflow, Glue or Dataform Skilled in creating impactful data visualisations using Tableau, Power BI or Python Background in engineering sectors such as automotive, aerospace, or transport BENEFITS This …
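For the "data visualisations using Tableau, Power BI or Python" line above, a small pandas + seaborn sketch follows; the dataframe and column names are invented for illustration.

```python
# Small visualisation sketch with pandas and seaborn; the data is invented.
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

df = pd.DataFrame({
    "week": [1, 1, 2, 2, 3, 3],
    "plant": ["Plant A", "Plant B"] * 3,      # hypothetical sites
    "defect_rate": [0.012, 0.015, 0.010, 0.013, 0.009, 0.012],
})

ax = sns.lineplot(data=df, x="week", y="defect_rate", hue="plant", marker="o")
ax.set_title("Defect rate by plant (illustrative data)")
plt.tight_layout()
plt.savefig("defect_rate.png")
```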
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
secure use of machine learning. Key Focus Areas Own and execute enterprise data strategy Build and lead a multi-disciplinary data & AI team Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI) Deliver business-critical analytics and reporting Support responsible AI/ML initiatives Define data governance, privacy, and compliance frameworks What We're Looking For …
and secure data handling Requirements: 5+ years in data engineering or Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake Proficiency in SQL, Python, and tools like dbt and Airflow Familiarity with DevOps practices in a data context Benefits: Work on impactful, enterprise-wide data projects Collaborate with architects, analysts, and data scientists Be part of a supportive, innovative …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
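As a hedged illustration of the batch ETL work these listings describe, a minimal PySpark job is sketched below; the input path, column names, and output location are hypothetical.

```python
# Minimal PySpark batch ETL sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_demo").getOrCreate()

# Extract: read raw CSV files from a hypothetical landing bucket.
orders = spark.read.option("header", True).csv("s3://raw-bucket/orders/")

# Transform: cast types and aggregate revenue per day.
daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write partitioned Parquet to a hypothetical curated bucket.
daily_revenue.write.mode("overwrite").parquet("s3://curated-bucket/daily_revenue/")
```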
South West London, London, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
Manchester, North West, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
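The A/B-testing support mentioned above often comes down to comparing a metric between two variants; below is a minimal sketch with SciPy, using invented conversion data (in practice a proportions test is often preferred for binary outcomes).

```python
# Minimal A/B comparison sketch; the conversion data is invented.
from scipy import stats

# 0/1 conversion outcomes per visitor for each variant (placeholder numbers).
control = [0] * 970 + [1] * 30   # ~3.0% conversion
variant = [0] * 955 + [1] * 45   # ~4.5% conversion

t_stat, p_value = stats.ttest_ind(control, variant)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```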
London, South East, England, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
quality data assets Strong architectural acumen and software engineering fundamentals Experience driving adoption of data governance and improving data platform usage across internal teams Stack including: Snowflake, AWS, DBT, Airflow, Python, Kinesis, Terraform, CI/CD tools BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000 Hybrid working: 2 days per week …
in software engineering principles. Expertise with AWS services, including Lambda, ECS/EC2, S3 and RDS. Deep experience with Terraform and infrastructure-as-code practices. Familiarity with tools like Airflow or DBT, and data platforms such as Snowflake or Databricks. Solid experience with CI/CD, observability, and platform reliability practices in cloud-native environments. Understanding of distributed …
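For the Lambda and S3 experience listed here, the basic shape of a Python Lambda handler responding to S3 object-created events is sketched below; the event handling is illustrative only.

```python
# Minimal AWS Lambda handler sketch for S3 "ObjectCreated" events (illustrative only).
import json
import urllib.parse

def handler(event, context):
    # S3 notifications list affected objects under "Records".
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        print(f"new object: s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```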
applications to the Cloud (AWS) We'd love to hear from you if you Have strong experience with Python & SQL Have experience developing data pipelines using dbt, Spark and Airflow Have experience with data modelling (building optimised and efficient data marts and warehouses in the cloud) Work with Infrastructure as code (Terraform) and containerising applications (Docker) Work with AWS, S3 …
Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Business Intelligence - Looker Experience and Attributes we'd like to see Platform Engineering Expertise Extensive experience in platform engineering; designing, building, and …
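As one hedged sketch of the AWS streaming pieces named in this stack (Kinesis, SQS, and so on), below is a minimal Kinesis producer written with boto3; the stream name and payload are hypothetical.

```python
# Minimal Kinesis producer sketch with boto3; stream name and payload are hypothetical.
import json

import boto3

kinesis = boto3.client("kinesis")

def publish_event(event: dict) -> None:
    """Send one analytics event to a (hypothetical) Kinesis stream."""
    kinesis.put_record(
        StreamName="analytics-events",            # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("user_id", "anonymous")),
    )

publish_event({"user_id": 123, "action": "page_view"})
```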
concepts to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of our amazing offices …
transformation. Deep understanding of cloud-based data architecture, particularly with GCP (BigQuery, Cloud Functions, Pub/Sub, etc.) or AWS equivalents. Hands-on experience with orchestration tools such as Airflow or DBT. 3+ years in data engineering, preferably including at least one role supporting a live or F2P game. Experience with analytics and marketing APIs (e.g. AppsFlyer, AppLovin, ironSource …
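For the GCP/BigQuery experience described above, a minimal query sketch using the google-cloud-bigquery client is shown below; the project, dataset, and table names are hypothetical.

```python
# Minimal BigQuery query sketch; project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-game-analytics")  # hypothetical project

query = """
    SELECT event_date, COUNT(DISTINCT user_id) AS dau
    FROM `my-game-analytics.telemetry.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.dau)
```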
Power markets. Expert in Python and SQL; strong experience with data engineering libraries (e.g., Pandas, PySpark, Dask). Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open …
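To close, a small pandas transformation sketch illustrating the ETL/ELT skills listed in this last excerpt; the file name, columns, and threshold are invented.

```python
# Small pandas transformation sketch; file name, columns, and threshold are invented.
import pandas as pd

# Load half-hourly power prices from a placeholder CSV.
prices = pd.read_csv("power_prices.csv", parse_dates=["timestamp"])

# Resample to daily averages and flag unusually expensive days.
daily = (
    prices.set_index("timestamp")["price_gbp_mwh"]
    .resample("D")
    .mean()
    .to_frame("avg_price")
)
daily["high_price_day"] = daily["avg_price"] > daily["avg_price"].quantile(0.95)

print(daily.tail())
```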