Remote (mostly remote, with occasional travel to Hemel Hempstead). Contract: Outside IR35. Day rate: up to £550 per day. Duration: 6 months. Start date: ASAP. Key skills: Snowflake, dbt, SQL, Python, AWS and Kimball.
The client is in the process of migrating to Snowflake and therefore requires extra support. As a result, they need someone from a strong SQL … Python development background with excellent working knowledge of dbt (Data Build Tool). You will undertake aspects of the development lifecycle and be experienced in data modelling, process design, development, and testing. And while this company is going through a large-scale migration, this will present you with an opportunity to be at the cutting edge of data engineering. … have experience in the following:
- Advanced SQL knowledge
- Snowflake (ideally certified)
- Python development
- AWS cloud experience (essential), relating to data tooling and development
- Working knowledge of dbt (Data Build Tool):
o Develop staging, intermediate, and mart models in dbt to meet analytics requirements (a hedged sketch follows this listing)
o Optimise existing models to make them more reusable by following dbt best practices
o Spot opportunities …
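As a concrete illustration of the staging-to-mart layering this listing describes: dbt models are most commonly SQL files, but dbt also supports Python models (on Snowflake via Snowpark, from dbt 1.3 onwards), which keeps every example in this document in one language. The model and column names below are hypothetical, not taken from the client's project.

```python
# models/marts/fct_daily_orders.py -- a minimal dbt Python model.
# Assumes a hypothetical staging model "stg_orders" exposing
# ORDER_ID, ORDER_DATE and AMOUNT columns.
from snowflake.snowpark.functions import col, count, sum as sum_


def model(dbt, session):
    dbt.config(materialized="table")

    # dbt.ref() resolves the staging model and records the dependency
    orders = dbt.ref("stg_orders")

    # Roll staging rows up to a daily grain for the mart layer
    return orders.group_by(col("ORDER_DATE")).agg(
        count(col("ORDER_ID")).alias("ORDER_COUNT"),
        sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"),
    )
```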
Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, dbt, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience …
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
… modern data architectures (e.g. Databricks, Snowflake). Collaborating with multidisciplinary teams to deliver real business value. What we’re looking for: Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow. Proven background in data modelling, warehousing, and performance optimisation. Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.). A consultancy mindset – adaptable, collaborative, and delivery …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, dbt, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, DataHub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS, IAM. Experience …
… to note that this Data Engineer position will require 2 days per week in Leeds city centre. The key skills required for this Data Engineer position are: Snowflake, Python, AWS, dbt, Airflow. If you have the relevant experience for this Data Engineer position, please do apply.
South East London, London, United Kingdom Hybrid / WFH Options
Corriculo Ltd
… performance. Collaborate with development and data teams to support data-driven features and insights. Experience Required: Snowflake expertise; solid data engineering expertise, including strong SQL and Python; proficiency in dbt for transformation tasks; experience with Dagster or a similar orchestration tool; familiarity with Azure. So What's Next? If you are an experienced Snowflake-focused Data Engineer and are available …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus. A solid understanding of data best practices, version control (Git), and CI/CD. Company: rapidly growing, cutting-edge AI organisation. Remote working - offices in London if …
South East London, London, United Kingdom Hybrid / WFH Options
Corriculo Ltd
… on tribal knowledge and manual interventions. Enabling future scalability by establishing robust, modular foundations. Experience Required: Snowflake expertise; solid data engineering expertise, including strong SQL and Python; proficiency in dbt for transformation tasks; experience with Dagster or a similar orchestration tool; familiarity with Azure. So What's Next? If you are an experienced Snowflake-focused Lead Data Engineer/Tech …
12 month contract. 1-2 days per week. Fully remote. Outside IR35. Immediate start. Essential: been to school in the UK; data ingestion of APIs; GCP-based (Google Cloud Platform); Snowflake; BigQuery; dbt; semantic layer (Cube/Looker). Desirable: Airflow (Apache Airflow) …
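To make the "data ingestion of APIs" requirement concrete, here is a hedged sketch of pulling a paginated JSON API into BigQuery. The endpoint, destination table, and the `results`/`next` payload fields are illustrative assumptions, not details from the contract.

```python
# Hypothetical API-to-BigQuery ingestion sketch (google-cloud-bigquery).
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/events"  # hypothetical endpoint
TABLE_ID = "my-project.raw.events"             # hypothetical destination


def fetch_rows(url: str) -> list[dict]:
    """Pull every page from the API, following a simple `next` cursor."""
    rows, next_url = [], url
    while next_url:
        resp = requests.get(next_url, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload["results"])
        next_url = payload.get("next")  # None on the last page
    return rows


def load_rows(rows: list[dict]) -> None:
    client = bigquery.Client()
    job = client.load_table_from_json(
        rows,
        TABLE_ID,
        job_config=bigquery.LoadJobConfig(
            write_disposition="WRITE_APPEND",
            autodetect=True,  # pin an explicit schema in production
        ),
    )
    job.result()  # block until the load job finishes


if __name__ == "__main__":
    load_rows(fetch_rows(API_URL))
```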
… Months+, Umbrella only - Inside IR35.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling (a hedged DAG sketch follows this listing).
- Build and manage data transformation workflows in dbt running on Databricks.
- Optimize data models in Delta Lake for performance, scalability, and cost efficiency.
- Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets.
- Implement data quality checks (dbt tests, monitoring) and ensure governance standards.
- Manage and monitor Databricks clusters and SQL Warehouses to support workloads.
- Contribute to CI/CD practices for data pipelines (version control, testing, deployments).
- Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges.
- Document workflows, transformations, and data models for knowledge sharing.
Required Skills & Qualifications
- 3-6 years of experience as a Data Engineer (or similar).
- Hands-on expertise with: dbt (dbt-core, dbt-databricks adapter, testing & documentation); Apache Airflow (DAG design, operators, scheduling, dependencies); Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses).
- Strong SQL skills and understanding of data modeling (Kimball, Data Vault, or similar).
- Proficiency in Python for scripting and pipeline development.
- Experience …
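A minimal sketch of the Airflow-orchestrates-dbt-on-Databricks pattern these responsibilities describe. The project path, profile target name, and schedule are assumptions; the `schedule` argument needs Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Illustrative Airflow DAG: run a scheduled dbt build against Databricks.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_databricks_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # daily at 06:00
    catchup=False,
) as dag:
    # `dbt build` runs models, tests, seeds and snapshots in dependency order
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/my_project && dbt build --target databricks",
    )

    dbt_docs = BashOperator(
        task_id="dbt_docs_generate",
        bash_command="cd /opt/dbt/my_project && dbt docs generate",
    )

    dbt_build >> dbt_docs  # regenerate docs only after a successful build
```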
… Engineering Consultant, key responsibilities: work cross-functionally with non-technical stakeholders to understand data requirements; expert knowledge of SQL to query, analyse, and model data; experience with Snowflake; using dbt for data transforms and modelling; data ingestion using Python; build foundations for business functions to be able to access the correct data and insights; experience working in large development teams …
… Power BI. Proficiency in SQL and data profiling for test design and validation. Hands-on experience with test automation frameworks such as Python/PySpark, Great Expectations, Pytest, or dbt tests. Practical understanding of CI/CD integration (Azure DevOps, GitHub Actions, or similar). Strong problem-solving skills and the ability to work independently as the lead testing specialist.
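By way of example, a hedged pytest/PySpark data-test sketch of the kind this role would automate. The DataFrame is built inline so the tests run standalone; in a real suite the fixture would read the curated table (e.g. `spark.table(...)`), and the column names are assumptions.

```python
# Standalone pytest + PySpark data-quality checks.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()


@pytest.fixture(scope="session")
def orders(spark):
    # Inline sample data; a real suite would read the table under test
    return spark.createDataFrame(
        [(1, "2024-01-01", 120.0), (2, "2024-01-01", 80.5)],
        ["order_id", "order_date", "amount"],
    )


def test_no_null_keys(orders):
    assert orders.filter(F.col("order_id").isNull()).count() == 0


def test_keys_unique(orders):
    assert orders.count() == orders.select("order_id").distinct().count()


def test_amounts_non_negative(orders):
    assert orders.filter(F.col("amount") < 0).count() == 0
```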
… is not essential but would be beneficial: Data Orchestration Tools: familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster. Modern Data Transformation: experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse. BI Tool Familiarity: an understanding of how BI tools like AWS QuickSight consume data, and the ability to structure datasets …
… pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience with workflow tools like Airflow or Kubeflow. Familiarity with dbt (Data Build Tool). Please send your CV for more information on these roles. Reasonable Adjustments: respect and equality are core values to us. We are proud of the diverse and inclusive …
… an initial 6 months, based out of their London office. You will play a key role in designing and managing our Snowflake data warehouse and leveraging dbt (Data Build Tool) to transform raw data into reliable, analysis-ready datasets that support regulatory compliance, operational efficiency, and innovation.
Responsibilities:
- Design, develop, and maintain scalable data pipelines to support manufacturing, quality, and supply chain data workflows.
- Implement and manage data transformation models using dbt to standardise and validate datasets (a hedged sketch of driving dbt from Python follows this listing).
- Optimise and monitor performance of Snowflake data warehouse environments.
- Collaborate with cross-functional teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions.
- Develop and maintain ETL/ELT workflows using modern orchestration and integration tools (e.g. Airflow, Dagster, or Prefect).
Requirements:
- Experience as a Data Engineer, ideally in a pharmaceutical, biotech, or regulated manufacturing environment.
- Strong hands-on experience with: Snowflake (architecture, performance tuning, cost optimisation); dbt (model development, testing, documentation, deployment); SQL (advanced query optimisation and debugging).
- Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar).
- Familiarity with cloud data platforms (AWS, Azure …
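One way the dbt transformation-and-testing step could be driven from an orchestrator task, sketched with dbt's programmatic runner (available in dbt-core 1.5+). The project directory is a placeholder.

```python
# Sketch: invoke dbt programmatically from Python (dbt-core >= 1.5).
from dbt.cli.main import dbtRunner, dbtRunnerResult

PROJECT = "/opt/dbt/manufacturing_project"  # placeholder project path


def run_dbt(args: list[str]) -> None:
    res: dbtRunnerResult = dbtRunner().invoke(args + ["--project-dir", PROJECT])
    if not res.success:
        raise RuntimeError(f"dbt {' '.join(args)} failed: {res.exception}")


# Build models, run schema/data tests, then regenerate the docs site
run_dbt(["run"])
run_dbt(["test"])
run_dbt(["docs", "generate"])
```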
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Data Engineer (Azure, Snowflake, dbt) | Insurance | London (Hybrid). Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced Data Engineer to join a major insurance client engagement. The role focuses on building out a Snowflake Data Warehouse established last year and scaling it to support multiple new data use cases across … umbrella company admin fees). Start Date: Immediate/End of June. Role Overview: You'll be working within a growing data engineering function, focused on scaling a Snowflake + dbt platform to support multiple analytical and operational use cases. The team is looking for an experienced engineer with strong technical depth and an insurance background, capable of owning and extending the Azure and Snowflake stack. Key Skills & Experience: Strong hands-on experience with Snowflake Cloud Data Warehouse (schemas, RBAC, performance tuning, ELT best practices). Proven commercial experience with dbt for modular data modelling, testing, documentation, and CI/CD integration. Skilled in Azure Data Factory, Synapse, and Databricks for end-to-end data pipeline orchestration. Excellent SQL engineering capability …
Galway, Galway County, Republic of Ireland Hybrid / WFH Options
CompuStaff
… join a high-growth, innovation-led team shaping the next generation of cloud security solutions. What You’ll Do: Design, build, and maintain scalable data pipelines and models using dbt and Snowflake. Develop automated workflows and ETL processes in Python to ensure high reliability and performance. Collaborate with analytics and engineering teams to turn business needs into robust data … help build a best-in-class data culture. What You’ll Bring: Proven experience as a Data Engineer in a fast-moving tech or SaaS environment. Strong expertise in dbt Core, Snowflake, Python, and SQL. Experience with ETL/ELT pipelines, workflow orchestration (ideally Prefect), and AWS (S3, ECS). Familiarity with BI governance, compliance, and CI/CD …
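A hedged sketch of the Python/Prefect workflow shape described above (Prefect 2.x API). The API endpoint, payload fields, and the stubbed Snowflake load are illustrative assumptions.

```python
# Illustrative Prefect 2 flow: extract from an API, transform, load.
import requests
from prefect import flow, task


@task(retries=3, retry_delay_seconds=60)
def extract() -> list[dict]:
    # Hypothetical source endpoint
    resp = requests.get("https://api.example.com/v1/usage", timeout=30)
    resp.raise_for_status()
    return resp.json()["records"]


@task
def transform(records: list[dict]) -> list[dict]:
    # Drop malformed records; real logic would be richer
    return [r for r in records if r.get("account_id") is not None]


@task
def load(rows: list[dict]) -> None:
    # Stub: a real task would write to Snowflake (e.g. via snowflake-connector)
    print(f"would load {len(rows)} rows into Snowflake")


@flow(log_prints=True)
def usage_pipeline():
    load(transform(extract()))


if __name__ == "__main__":
    usage_pipeline()
```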
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… transform datasets from various sources, ensuring data quality and integrity.
* Optimise and maintain the organisation's Snowflake infrastructure, providing performance tuning, enhancements, and regular "MOT" checks (a hedged health-check sketch follows this listing).
* Develop and maintain dbt pipelines, mainly on Snowflake, to enable efficient transformation and modelling of data.
* Implement efficient data models and schemas for storage and retrieval, optimising processes for improved performance.
* Troubleshoot and resolve … in data engineering and cloud infrastructure, continuously improving skills and knowledge.
Profile
* The Data Engineer will have proven experience working with Snowflake infrastructure, including optimisation and maintenance.
* Experience with dbt for data transformation and modelling, ideally in a Snowflake environment.
* Proficient in SQL and experienced in database design and administration.
* Familiarity with cloud platforms such as Azure and AWS.
* Excellent …
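As an example of a Snowflake "MOT"-style health check, a hedged sketch that lists the slowest queries of the past week from the ACCOUNT_USAGE share (which requires elevated privileges and lags real time). Credentials are placeholders.

```python
# Illustrative Snowflake health check via the Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder credentials
    user="data_engineer",
    password="***",
    warehouse="ADMIN_WH",
)
try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT query_id,
               warehouse_name,
               total_elapsed_time / 1000 AS elapsed_s,
               bytes_scanned
        FROM snowflake.account_usage.query_history
        WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 20
        """
    )
    for query_id, warehouse, elapsed_s, scanned in cur.fetchall():
        print(f"{query_id} on {warehouse}: {elapsed_s:.1f}s, {scanned} bytes scanned")
finally:
    conn.close()
```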
… highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include: Designing and developing dbt models and Airflow pipelines within a modern data stack. Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs. Implementing automated testing and CI/CD … Collaborating on forecasting and predictive analytics initiatives. Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function. Tech Stack & Skills - Core skills: Strong experience with dbt, Airflow, Snowflake, and Python. Proven background in automated testing, CI/CD, and test-driven development. Experience building and maintaining data pipelines and APIs in production environments. Nice to have: …
… shortened version of the full JD. What you'll do: Translate business requirements into scalable, well-structured data models and dashboards. Build and maintain Business Layers and data transformations using dbt, SQL, and BigQuery (GCP). Develop and refine Tableau dashboards with a strong UX and storytelling focus. Own data documentation, lineage, and governance using Atlan. Review and mentor on coding standards, testing, and best practice. Collaborate with Product, BI, and Data Architecture teams in an Agile setup (Jira). Tech environment: dbt | SQL | BigQuery (GCP) | Tableau | Atlan | Git | Jira. Ideal profile: Expert in SQL and dbt with solid data modelling and warehousing experience. Proven experience delivering production-grade dashboards in Tableau. Strong understanding of data governance and data mesh principles. …