large datasets. Proven experience developing and maintaining production-level codebases. Experience with any public cloud provider (GCP, AWS, or Azure). Desirable Experience: Experience with Kubeflow, Airflow, or dbt. Hands-on experience with Google Cloud Platform (GCP). Knowledge of containerisation (Docker, Kubernetes) and CI/CD practices. Familiarity with data warehouse technologies (BigQuery, Redshift, Snowflake, etc.) …
Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, DataHub). Knowledge of cloud infrastructure (AWS or GCP), including services such as S3, RDS, EMR, ECS, IAM. Experience …
South East London, London, United Kingdom Hybrid / WFH Options
Corriculo Ltd
performance. Collaborate with development and data teams to support data-driven features and insights. Experience Required: Snowflake expertise. Solid data engineering expertise including strong SQL and Python. Proficiency in DBT for transformation tasks. Experience with Dagster or a similar orchestration tool. Familiarity with Azure. So What's Next? If you are an experienced Snowflake-focused Data Engineer and are available …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus. Company: Rapidly growing, cutting-edge AI organisation. A solid understanding of data best practices, version control (Git), and CI/CD. Remote working - offices in London if …
South East London, London, United Kingdom Hybrid / WFH Options
Corriculo Ltd
on tribal knowledge and manual interventions. Enabling future scalability by establishing robust, modular foundations. Experience Required: Snowflake expertise. Solid data engineering expertise including strong SQL and Python. Proficiency in DBT for transformation tasks. Experience with Dagster or a similar orchestration tool. Familiarity with Azure. So What's Next? If you are an experienced Snowflake-focused Lead Data Engineer/Tech …
… month contract. 1-2 days per week. Fully remote. Outside IR35. Immediate start. 12 month contract. Essential: Been to school in the UK. Data ingestion of APIs. GCP based (Google Cloud Platform). Snowflake. BigQuery. DBT. Semantic layer (Cube/Looker). Desirable: Airflow (Apache Airflow) …
Months + UMBRELLA only - Inside IR35. Key Responsibilities: Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in DBT running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets. Implement data quality checks (dbt tests, monitoring) and ensure governance standards. Manage and monitor Databricks clusters & SQL Warehouses to support workloads. Contribute to CI/CD practices for data pipelines (version control, testing, deployments). Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges. Document workflows, transformations, and data models for knowledge sharing. Required Skills & Qualifications: 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: DBT (dbt-core, dbt-databricks adapter, testing & documentation); Apache Airflow (DAG design, operators, scheduling, dependencies); Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modelling (Kimball, Data Vault, or similar). Proficiency in Python for scripting and pipeline development. Experience …
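The dbt tests this role refers to are typically declared in a `schema.yml` file alongside the models; a minimal sketch, where the model and column names are hypothetical:

```yaml
# Hypothetical dbt schema file: declares tests that dbt compiles into SQL
# assertions and runs against the warehouse (here, Databricks/Delta Lake).
version: 2

models:
  - name: fct_orders          # hypothetical model name
    description: "Cleaned order facts in Delta Lake"
    columns:
      - name: order_id
        tests:
          - not_null
          - unique
      - name: customer_id
        tests:
          - relationships:
              to: ref('dim_customers')   # hypothetical dimension model
              field: customer_id
```

Running `dbt test` (or `dbt build`) executes each declaration as a SQL query and fails the run if any rows violate it.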
is not essential but would be beneficial: Data Orchestration Tools: Familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster. Modern Data Transformation: Experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse. BI Tool Familiarity: An understanding of how BI tools like AWS QuickSight consume data, and the ability to structure datasets …
an initial 6 months, based out of their London office. You will play a key role in designing and managing our Snowflake data warehouse and leveraging dbt (Data Build Tool) to transform raw data into reliable, analysis-ready datasets that support regulatory compliance, operational efficiency, and innovation. Responsibilities: Design, develop, and maintain scalable data pipelines to support manufacturing, quality, and supply chain data workflows. Implement and manage data transformation models using dbt to standardise and validate datasets. Optimise and monitor performance of Snowflake data warehouse environments. Collaborate with cross-functional teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions. Develop and maintain ETL/ELT workflows using modern orchestration and integration tools (e.g. …). … years of experience as a Data Engineer, ideally in a pharmaceutical, biotech, or regulated manufacturing environment. Strong hands-on experience with: Snowflake (architecture, performance tuning, cost optimisation); DBT (model development, testing, documentation, deployment); SQL (advanced query optimisation and debugging). Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar). Familiarity with cloud data platforms (AWS, Azure …
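The dbt transformation models described here are SQL files with Jinja configuration; a minimal sketch of an incremental model on Snowflake, where the source, table, and column names are all invented for illustration:

```sql
-- Hypothetical dbt model (models/stg_batch_records.sql) sketching the kind of
-- standardise-and-validate transformation described above.
{{ config(materialized='incremental', unique_key='batch_id') }}

select
    batch_id,
    product_code,
    upper(trim(site))                  as site,        -- standardise site codes
    cast(recorded_at as timestamp_ntz) as recorded_at  -- Snowflake timestamp type
from {{ source('mes', 'raw_batch_records') }}
{% if is_incremental() %}
  -- on incremental runs, only process rows newer than the target table
  where recorded_at > (select max(recorded_at) from {{ this }})
{% endif %}
```

The `incremental` materialisation is one common choice for high-volume manufacturing data, since full-refresh rebuilds drive up Snowflake compute cost.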
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Data Engineer (Azure, Snowflake, DBT) | Insurance | London (Hybrid). Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced Data Engineer to join a major insurance client engagement. The role focuses on building out a Snowflake Data Warehouse established last year and scaling it to support multiple new data use cases across … (umbrella company admin fees). Start Date: Immediate / End of June. Role Overview: You'll be working within a growing data engineering function, focused on scaling a Snowflake + DBT platform to support multiple analytical and operational use cases. The team is looking for an experienced engineer with strong technical depth and an insurance background, capable of owning and extending the Azure and Snowflake stack. Key Skills & Experience: Strong hands-on experience with Snowflake Cloud Data Warehouse (schemas, RBAC, performance tuning, ELT best practices). Proven commercial experience with DBT for modular data modelling, testing, documentation, and CI/CD integration. Skilled in Azure Data Factory, Synapse, and Databricks for end-to-end data pipeline orchestration. Excellent SQL engineering capability …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
transform datasets from various sources, ensuring data quality and integrity.
* Optimise and maintain the organisation's Snowflake infrastructure, providing performance tuning, enhancements, and regular "MOT" checks.
* Develop and maintain DBT pipelines, mainly on Snowflake, to enable efficient transformation and modelling of data.
* Implement efficient data models and schemas for storage and retrieval, optimising processes for improved performance.
* Troubleshoot and resolve … in data engineering and cloud infrastructure, continuously improving skills and knowledge.
Profile:
* The Data Engineer will have proven experience working with Snowflake infrastructure, including optimisation and maintenance.
* Experience with DBT for data transformation and modelling, ideally in a Snowflake environment.
* Proficient in SQL and experienced in database design and administration.
* Familiarity with cloud platforms such as Azure and AWS.
* Excellent …
highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include: Designing and developing DBT models and Airflow pipelines within a modern data stack. Building robust data ingestion pipelines across multiple sources, including external partners, internal platforms, and APIs. Implementing automated testing and CI/CD … Collaborating on forecasting and predictive analytics initiatives. Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function. Tech Stack & Skills. Core skills: Strong experience with DBT, Airflow, Snowflake, and Python. Proven background in automated testing, CI/CD, and test-driven development. Experience building and maintaining data pipelines and APIs in production environments. Nice to have …
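Automated testing of pipeline output, as mentioned in this role, often starts with simple row-level assertions; a minimal standard-library sketch, where the column names and rules are hypothetical (in practice these checks would live in dbt schema tests or pytest cases):

```python
# Row-level data-quality checks of the kind dbt's not_null and unique
# tests perform, written in plain Python for illustration.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Hypothetical batch validated before loading downstream.
batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": 2, "amount": 5.0},
]
null_amounts = check_not_null(batch, "amount")    # one offending row
duplicate_ids = check_unique(batch, "order_id")   # order_id 2 repeats
```

In a CI/CD pipeline, checks like these run on every change and fail the build when offending rows are found, which is the behaviour dbt tests give you for free at the warehouse level.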
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Robert Half Technology are assisting a global pharmaceutical organisation to recruit a Data Engineer on a contract basis - hybrid working - London based - Outside IR35. Deep expertise in Snowflake and DBT is required to join a mission-driven analytics team. This role is critical to assessing and improving an established Snowflake environment, driving cost optimisation, enhancing operational rigour, and supporting the team's predictive analytics and ML Ops workflows. Role: The Data Engineer will own and assess the Snowflake platform, proposing and executing optimisation around performance, cost, scalability, and architecture improvements. Lead DBT development and maintenance, with emphasis on test frameworks, transformation efficiency, and eliminating redundant or resource-heavy flows (e.g., excessive temp table usage). Design, implement, and maintain CI/CD … drive alignment and adoption of data initiatives. Profile: The Data Engineer will have substantial experience as a Data Engineer, especially working with Snowflake in production environments. Advanced experience with DBT, including testing, modular modelling, and optimisation. Track record in building or maintaining CI/CD pipelines for data workflows. Strong familiarity with Snowflake infrastructure, performance tuning, and cost modelling. Excellent …
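CI/CD for dbt workflows, which several of these roles call for, commonly means running `dbt build` against an isolated schema on every pull request; a hypothetical GitHub Actions sketch, where the workflow name, target, and secret names are all invented:

```yaml
# Hypothetical CI workflow: installs the Snowflake adapter, then compiles,
# runs, and tests every dbt model touched by the pull request.
name: dbt-ci
on: [pull_request]

jobs:
  dbt-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake   # adapter matching the warehouse
      - run: dbt deps                    # install package dependencies
      - run: dbt build --target ci       # models + tests in a disposable CI schema
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```

Because `dbt build` runs tests immediately after each model, a failing not_null or unique test blocks the merge before bad transformations reach production.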
SQL Analyst, CDP & CRM Segmentation. London based - hybrid working - 3 days on site. 3-6 month contract - Inside IR35. Why this role exists: We're standing up a Customer Data Platform (CDP) and need a hands-on SQL analyst to …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
We're looking for a DBT & BigQuery specialist to join a dynamic team focused on building and delivering high-value data use cases. If you're hands-on with data and enjoy turning business questions into actionable insights, this could be the role for you. What you'll do: Work on allocated pieces of work, primarily building out data use cases. Develop and maintain DBT models on BigQuery. Collaborate with the wider team to ensure high-quality, scalable solutions. What we're looking for: Strong DBT and BigQuery expertise - technical skills are more important than consulting experience. Experience working in a fast-paced, hands-on data environment. UK-based and able to work remotely as needed. Why this role is …