…time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan). Strong communication, stakeholder management, and documentation skills. Preferred (but not essential): AWS or Snowflake certifications; knowledge of Apache Airflow, dbt, and GitHub Actions; experience with Iceberg tables and data product thinking. Why apply? Work on high-impact, high-scale client projects and join a technically elite team with a strong culture of …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
…an Agile environment. Technical Proficiency: Deep technical expertise in software and data engineering and in programming languages (Python, Java, etc.). Understanding of orchestration (Composer, DAGs), data processing (Kafka, Flink, Dataflow, dbt), and database capabilities (e.g. BigQuery, Cloud SQL, Bigtable). Container technologies (Docker, Kubernetes), IaC (Terraform), and experience with cloud platforms such as GCP. CI/CD: Detailed understanding of working with automated …
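For readers unfamiliar with the orchestration tooling named above, the following is a minimal sketch of an Airflow DAG of the kind Cloud Composer runs, assuming Airflow 2.x; the dag_id, table, and callable names are invented for illustration and are not taken from the listing.

```python
# Minimal Airflow 2.x DAG sketch (Cloud Composer runs Airflow under the hood).
# All names below (dag_id, tables, the callable) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_bigquery(**context):
    # Placeholder task body: in a real pipeline this might read from Kafka,
    # Dataflow output, or GCS and write into a BigQuery table.
    print("loading partition for", context["ds"])


with DAG(
    dag_id="daily_ingest",             # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword; older 2.x uses schedule_interval
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
```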
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
Key Responsibilities: Design and maintain scalable data pipelines across diverse sources including CMS, analytics, ad tech, and social platforms. Lead engineering efforts to automate workflows using tools like Airflow, dbt, and Spark. Build robust data models to support dashboards, A/B testing, and revenue analytics. Collaborate with cross-functional teams to deliver actionable insights and support strategic initiatives. Drive …
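As a hedged illustration of the Spark side of the Airflow/dbt/Spark workflows mentioned above, here is a minimal PySpark sketch that builds the kind of daily aggregate a dashboard or A/B-test readout could sit on; the paths and column names are hypothetical.

```python
# Minimal PySpark sketch: aggregate raw pageview events into a daily table
# that a dashboard or A/B-test readout could be built on.
# The input path, column names, and output location are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_pageviews").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/pageviews/")  # hypothetical path

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "experiment_variant")
    .agg(
        F.countDistinct("user_id").alias("unique_users"),
        F.count("*").alias("pageviews"),
    )
)

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/marts/daily_pageviews/"
)
```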
…experience in data engineering or a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL …
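The streaming-architecture requirement above might, in practice, start with something as small as the following consumer sketch; it assumes the kafka-python client, and the topic, brokers, and sink are placeholders.

```python
# Minimal streaming-ingest sketch using kafka-python (one of several client
# libraries; the listing does not name one). Topic, brokers, and the sink are
# illustrative placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "user-events",                          # hypothetical topic
    bootstrap_servers=["localhost:9092"],   # hypothetical brokers
    group_id="etl-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real ELT pipeline this is where the event would be validated and
    # written to object storage or a warehouse staging table.
    print(message.topic, message.partition, message.offset, event.get("type"))
```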
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
…data pipelines, and ETL processes. Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services. Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt). Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.). Strong understanding of data modeling, data governance, and data quality principles. Excellent communication skills, with the ability to translate complex …
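As one hedged example of the data-quality principles listed above, a pipeline step might run a simple null-count check against BigQuery; this sketch uses the google-cloud-bigquery client, and the project, dataset, and column names are hypothetical.

```python
# Minimal data-quality check sketch against BigQuery, using the
# google-cloud-bigquery client. The project, dataset, and table names are
# hypothetical; credentials are assumed to come from the environment.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT
      COUNT(*) AS row_count,
      COUNTIF(customer_id IS NULL) AS null_customer_ids
    FROM `example-project.analytics.orders`
"""

row = next(iter(client.query(sql).result()))
if row.null_customer_ids > 0:
    raise ValueError(f"{row.null_customer_ids} of {row.row_count} rows have a NULL customer_id")
```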
Our products are recognised in industry analyst reports such as Gartner's Magic Quadrant, the Forrester Wave, and the Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, dbt, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be …
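To make the S3-backed storage layer above concrete, here is a minimal, assumed sketch of landing a batch extract in S3 as Parquet with pandas and boto3; the bucket, key, and DataFrame contents are placeholders.

```python
# Minimal sketch of landing a batch extract in S3 as Parquet, with pandas and
# boto3. Bucket, key, and the source DataFrame are illustrative placeholders.
import boto3
import pandas as pd

df = pd.DataFrame({"user_id": [1, 2, 3], "plan": ["free", "pro", "pro"]})

local_path = "/tmp/users.parquet"
df.to_parquet(local_path, index=False)   # requires pyarrow or fastparquet

s3 = boto3.client("s3")
s3.upload_file(local_path, "example-data-lake", "staging/users/users.parquet")
```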
Wandsworth, Greater London, UK Hybrid / WFH Options
Count Technologies Ltd
…reliable data-focused backend services. Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s), and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement, and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for the delivery of large technical projects. Are eager to …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Starling Bank Limited
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow Pipelines or Airflow. Familiarity with dbt or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the level of fairness and …
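For the SHAP-based explainability mentioned above, a minimal sketch might look like the following, using a scikit-learn model on synthetic data; the features, target, and model choice are illustrative only.

```python
# Minimal SHAP explainability sketch on synthetic data: train a small
# gradient-boosted model and inspect per-feature contributions.
# Feature names and the model choice are illustrative, not from the listing.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "monthly_spend": rng.gamma(2.0, 50.0, size=500),
    "tenure_months": rng.integers(1, 60, size=500),
    "support_tickets": rng.poisson(1.5, size=500),
})
y = (X["support_tickets"] > 2).astype(int).to_numpy()  # toy target

model = GradientBoostingClassifier().fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Mean absolute SHAP value per feature gives a rough global importance ranking.
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False))
```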
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
…for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP preferred) and …
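As a small, assumed illustration of the A/B-testing support mentioned above, a readout step might run a two-proportion z-test with statsmodels; the counts below are made up.

```python
# Minimal A/B-test readout sketch: a two-proportion z-test on conversion
# counts, of the kind a dashboard pipeline might surface. The numbers and the
# choice of test are illustrative only.
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 468]     # variant A, variant B (made-up counts)
visitors = [10_000, 10_050]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

rate_a, rate_b = (c / n for c, n in zip(conversions, visitors))
print(f"conversion: A {rate_a:.2%} vs B {rate_b:.2%}")
```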
…and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing others through mentorship, feedback, and knowledge sharing. Pragmatic Problem …
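For the Snowflake part of the stack above, a minimal query sketch using snowflake-connector-python might look like the following; the account, credentials, warehouse, and table names are placeholders.

```python
# Minimal Snowflake query sketch using snowflake-connector-python.
# Account, credentials, and the table name are placeholders; a real pipeline
# would pull these from a secrets manager rather than the environment directly.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # hypothetical
    database="ANALYTICS",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT order_date, SUM(amount) FROM orders GROUP BY order_date LIMIT 10")
    for order_date, total in cur.fetchall():
        print(order_date, total)
finally:
    conn.close()
```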
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…office once a week. Requirements: Proven experience as an Analytics Engineer or in a similar data engineering/BI role. Advanced SQL skills with hands-on experience using dbt for data modeling. Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow. Experience with version control tools (e.g., Git). Ability to design, build, and …
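One common (assumed) pattern for the dbt modelling work described above is to drive the dbt CLI from an orchestration script; the sketch below uses the standard `dbt build` command and `--select` option, while the selector and project directory are hypothetical.

```python
# Minimal sketch of driving dbt from an orchestration script: build the
# staging models and run the tests defined against them. The selector name
# and project directory are hypothetical; `dbt build` and `--select` are
# standard dbt CLI options.
import subprocess

PROJECT_DIR = "analytics_dbt"  # hypothetical dbt project directory

subprocess.run(
    ["dbt", "build", "--select", "staging", "--project-dir", PROJECT_DIR],
    check=True,  # raise if any model or test fails, so the scheduler marks the task failed
)
```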
…timely decisions in the face of many nuanced trade-offs and varied opinions. Experience in a range of tool sets comparable with our own: Database technologies: SQL, Redshift, Postgres, dbt, Dask, Airflow, etc. AI feature development: LangChain, LangSmith, pandas, NumPy, scikit-learn, SciPy, Hugging Face, etc. Data visualization tools such as Plotly, seaborn, Streamlit, etc. You are able to …
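As a hedged illustration of the pandas/scikit-learn toolset above, here is a minimal feature-and-fit sketch on synthetic data; the column names, target, and model are placeholders, not the team's actual features.

```python
# Minimal feature-engineering and model-fit sketch with pandas and
# scikit-learn on a synthetic frame. Columns, target, and model are
# illustrative placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "sessions_last_30d": rng.poisson(12, size=1_000),
    "avg_order_value": rng.normal(40, 12, size=1_000),
    "churned": rng.integers(0, 2, size=1_000),
})

X = df[["sessions_last_30d", "avg_order_value"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")  # near 0.5 on random labels
```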
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…and secure use of machine learning. Key Focus Areas: Own and execute enterprise data strategy. Build and lead a multi-disciplinary data & AI team. Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI). Deliver business-critical analytics and reporting. Support responsible AI/ML initiatives. Define data governance, privacy, and compliance frameworks. What We're Looking For …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…high-quality data assets. Strong architectural acumen and software engineering fundamentals. Experience driving adoption of data governance and improving data platform usage across internal teams. Stack including: Snowflake, AWS, dbt, Airflow, Python, Kinesis, Terraform, and CI/CD tools. BENEFITS: The successful Principal Data Engineer will receive the following benefits: salary up to £107,000; hybrid working: 2 days per week …
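For the Kinesis component of the stack above, a minimal publishing sketch with boto3 might look like the following; the region, stream name, and payload are hypothetical.

```python
# Minimal event-publishing sketch for the Kinesis part of a stack like the one
# above, using boto3. Stream name and payload are illustrative placeholders.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-2")  # hypothetical region

event = {"event_type": "order_created", "order_id": "12345", "amount": 42.0}

kinesis.put_record(
    StreamName="orders-events",           # hypothetical stream
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["order_id"],       # keeps a given order on one shard
)
```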
London, South East, England, United Kingdom Hybrid / WFH Options
Salt Search
…customer insights. Comfortable working with large datasets from sources like CRM, web analytics, product telemetry, etc. Exposure to cloud platforms (AWS, GCP, Azure) and modern data pipelines (e.g. Airflow, dbt) is a plus. Soft Skills: Business-oriented thinker with strong communication skills. Able to clearly explain complex models to non-technical audiences. Skilled in stakeholder engagement and translating analytics into …
…technical teams, with excellent people development skills. Strong project management skills, with experience running complex data initiatives. Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/dbt, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience working in agile environments …
…Experience: Strong SQL and Python skills for building and optimising data pipelines. Experience working with cloud platforms (e.g., AWS, GCP, or Azure). Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery). Understanding of data modelling and warehousing principles. Experience working with large datasets and distributed systems. What's in it for you? Up to £70k, hybrid …