strategic impact. You will provide technical leadership and mentorship within the team. Core Responsibilities Advanced Data Modelling & Transformation: Lead the design, development, and optimization of complex data pipelines using dbt to transform data from the bronze layer into curated gold-layer models, strictly adhering to and evolving Kimball methodologies for diverse analytical needs. Architect and build sophisticated aggregation layers to … Leadership & Excellence: Act as a subject-matter expert in SQL (Postgres, Cloud SQL, BigQuery, Redshift), driving performance optimization and complex query development. Drive the adoption of best practices for dbt development, including modularity, testing, and documentation, across the team. Influence the selection and implementation of workflow orchestration tools, leveraging tools like Airflow to build highly reliable, automated, and scalable data … on and contribute to Infrastructure as Code (IaC) principles (e.g. Terraform) to manage the analytics environment. Oversee and optimize CI/CD pipelines via GitHub for efficient and reliable dbt model deployments. Contribute significantly to the design and implementation of comprehensive data quality and governance frameworks using tools like DataHub and Great Expectations. Proactively identify and address performance bottlenecks in …
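The bronze-to-gold, Kimball-style modelling the listing above describes can be sketched in miniature. This is an illustrative sketch only, using the stdlib sqlite3 module in place of dbt and a cloud warehouse; all table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze layer: raw records as they land, everything stored as text.
cur.execute(
    "CREATE TABLE bronze_orders "
    "(order_id TEXT, customer TEXT, amount TEXT, ordered_at TEXT)"
)
cur.executemany(
    "INSERT INTO bronze_orders VALUES (?, ?, ?, ?)",
    [
        ("o1", "alice", "10.50", "2024-01-01"),
        ("o2", "alice", "4.25", "2024-01-02"),
        ("o3", "bob", "7.00", "2024-01-02"),
    ],
)

# Gold layer: a Kimball-style dimension and fact table with typed columns.
cur.execute(
    "CREATE TABLE dim_customer AS "
    "SELECT DISTINCT customer AS customer_name FROM bronze_orders"
)
cur.execute(
    "CREATE TABLE fct_orders AS "
    "SELECT order_id, customer AS customer_name, "
    "CAST(amount AS REAL) AS amount, ordered_at FROM bronze_orders"
)

# An aggregation layer on top of the fact table.
revenue = dict(cur.execute(
    "SELECT customer_name, SUM(amount) FROM fct_orders GROUP BY customer_name"
))
print(revenue)  # {'alice': 14.75, 'bob': 7.0}
```

In dbt, each `CREATE TABLE AS` above would instead be a version-controlled model file with its own schema tests and documentation; the SQL shape of the transformation is the same.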
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
Warehousing: Knowledge of tools like Snowflake, Databricks, ClickHouse, and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, and Dagster. Cloud providers: Proficiency in Microsoft Azure or AWS. Programming and Scripting: Programming Languages: Strong skills in Python and SQL. Data Modeling and Query Optimization: Data Modeling: Designing star/…
love to talk to you if: You've led technical delivery of data engineering projects in a consultancy or client-facing environment You're experienced with Python, SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure) You have strong knowledge of data architecture patterns - including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks) You know …
London, England, United Kingdom Hybrid / WFH Options
Ziff Davis
attention to detail and a commitment to quality. What we need from you: At least 3 years of relevant data engineering experience Strong Python and SQL skills Experience with dbt Experience with AWS Experience working with a columnar database such as Redshift Extensive experience with ETL/ELT and managing data pipelines Familiarity with Snowplow Experience integrating data from various …
or large-scale martech/data projects Comfortable leading client-facing technical engagements, from roadmap to delivery Strong working knowledge of SQL, Python, and modern ETL tooling (e.g. Airflow, dbt, Spark) Familiar with cloud platforms (AWS, GCP, or Azure) and modern data stack components Experience in agile delivery environments, comfortable managing sprints and backlogs A curious mindset with a passion …
Manchester, England, United Kingdom Hybrid / WFH Options
Dept Agency
data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex …
Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave, and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, dbt, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be …
strong commitment to data integrity. Excellent problem-solving skills and ability to work in a fast-paced environment. Preferred Qualifications Experience with Databricks and Databricks Unity Catalog. Familiarity with dbt and Airflow. Experience with data quality frameworks. Understanding of ML requirements and experience working with ML teams. Experience in robotics or a related field. Familiarity with cloud-based data storage …
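Several listings here ask for experience with data quality frameworks. A minimal sketch of the kind of column-level checks such a framework (e.g. Great Expectations) automates, written in plain Python rather than any real framework's API, with hypothetical column names:

```python
# Sample rows standing in for a loaded table.
rows = [
    {"order_id": "o1", "amount": 10.5},
    {"order_id": "o2", "amount": 4.25},
    {"order_id": "o3", "amount": 7.0},
]

def check_not_null(rows, column):
    """Every row has a non-null value in the column."""
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    """No duplicate values in the column."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_range(rows, column, low, high):
    """All values fall inside [low, high]."""
    return all(low <= r[column] <= high for r in rows)

# A "suite" of named expectations, evaluated against the data.
results = {
    "order_id_not_null": check_not_null(rows, "order_id"),
    "order_id_unique": check_unique(rows, "order_id"),
    "amount_in_range": check_range(rows, "amount", 0, 1000),
}
print(results)  # every check passes for this sample
```

Real frameworks add the scheduling, reporting, and failure-alerting around checks of exactly this shape.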
experience building and maintaining data pipelines Knowledge of ETL/ELT practices Understanding of cloud infrastructure (AWS, GCP) Nice to Have Terraform or infrastructure-as-code tools Docker, Kubernetes, dbt BI tools like Looker or Power BI Data quality tools Media, advertising, or audio industry background Personal & Behavioral Traits Curious: Eager to learn new technologies Solution-Oriented: Focused on practical …
detail and care about the features they implement. What we need from you: At least 3 years of relevant data engineering experience Strong Python and SQL skills Experience with dbt Experience with AWS Experience working with a columnar database such as Redshift Strong experience with ETL/ELT and the management of data pipelines Familiarity with Snowplow Experience with Data …
London, England, United Kingdom Hybrid / WFH Options
Count
reliable data-focused backend services. Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s), and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement, and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for large technical projects. Are eager to learn from …
London, England, United Kingdom Hybrid / WFH Options
Count Technologies Ltd
reliable data-focused backend services Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s) and data pipelines (SQL, dbt, Airbyte) Love automation, process improvement and finding ways to help others work efficiently Are comfortable working autonomously and taking responsibility for the delivery of large technical projects Are eager to …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Canonical
of popular, open-source machine learning tools, such as Kubeflow, MLflow, DVC, and Feast. You may also work on workflow, ETL, data governance and visualization tools like Apache Superset, dbt, and Temporal, or data warehouse solutions such as Trino or ClickHouse. Your team will own a solution from the analytics and machine learning space, and integrate with the solutions …
thrive in a dynamic startup environment. Passionate about knowledge sharing and mentoring. Eligibility to work in London. Bonus Experience Hands-on experience working with cloud technologies. Familiarity with Airflow, dbt, REST API, Kubernetes, Istio and Docker. Experience delivering simple web based UIs to visualize data. Experience working with petabyte datasets. Experience with middleware such as Chronicle Queue, Aeron, RabbitMQ and …
They are also proficient in Azure Event Hubs and Stream Analytics, Managed Streaming for Apache Kafka, Azure Databricks with Spark, and other open-source technologies like Apache Airflow and dbt, Spark/Python, or Spark/Scala. Preferred Education Bachelor's Degree Required Technical And Professional Expertise Commercial experience as a Data Engineer or similar role, with a strong emphasis …
better if you’ve got experience doing this in Looker/Superset. You are proficient with SQL and know your way around data pipeline management tools such as Snowflake, dbt, Git, Airflow and Python (all analysts at Wise are expected to be full-stack and should be comfortable owning analytics from ingestion to insights). Nice To Have But Not …
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and …
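The fairness quantification this listing mentions is often done with group metrics. A sketch of one common metric, demographic parity difference (the gap in positive-decision rates between two groups), on synthetic model outputs; the data and function names here are illustrative only:

```python
def positive_rate(preds):
    """Fraction of decisions that are positive (1 = approved)."""
    return sum(preds) / len(preds)

def demographic_parity_difference(preds_a, preds_b):
    """Absolute gap in positive-decision rates between two groups."""
    return abs(positive_rate(preds_a) - positive_rate(preds_b))

group_a = [1, 0, 1, 1]  # synthetic model decisions for group A
group_b = [1, 0, 0, 1]  # synthetic model decisions for group B

dpd = demographic_parity_difference(group_a, group_b)
print(round(dpd, 2))  # 0.25
```

A value of 0 means both groups receive positive decisions at the same rate; larger gaps flag candidates for the interpretability work (e.g. SHAP attributions) the listing also asks for.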
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Starling Bank Limited
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Starling Bank
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and …