forward-thinking team where your growth is genuinely encouraged, this could be the perfect next step!

Why This Role?
✅ Work with modern tech - Snowflake, dbt, Python, Azure
✅ Be part of a friendly, collaborative, remote-first team
✅ Get real investment in your training & development
✅ Take your career to the next level

… native tech
Shape reusable data engineering patterns
Communicate clearly with both technical and non-technical stakeholders
Scope and evolve solutions alongside clients

Tech: Snowflake, dbt, Airflow, Azure Data Factory, Python, Spark, SQL, cloud (Azure, AWS, or GCP), Power BI, Tableau, Looker, modern architecture concepts

Why Join? Remote-first, with a …
of scalable data pipelines on GCP and Snowflake. What You'll Do Design, develop, and optimize robust data pipelines using technologies such as dbt, Fivetran, and Airflow. Lead data integration, modeling, and governance initiatives in close collaboration with product, analytics, and data science teams. Act as a technical … leading data engineering projects in cloud environments, ideally GCP and Snowflake. Proficiency in Python and advanced SQL, with experience designing ELT architectures using dbt. Strong knowledge of data modeling, Snowpipe Streaming, warehouse optimization, and orchestration tools like Apache Airflow. Experience working in distributed, cross-functional teams with …
Data Engineer and other members of the Data Engineering team to design and deliver our new strategic enterprise data platform based on Snowflake and dbt, while also maintaining our legacy data platform. Key Responsibilities: Data warehouse design and implementation working towards the creation of a single source of truth. Contributing … to the architectural design of the new Enterprise Data Platform. Contributing to development best practices. Development of data ingestion/transformation pipelines using Fivetran, dbt and GitLab. Creation of management information dashboards. Work with business analysts and end-users to plan and implement feature enhancements and changes to existing systems.
Role Title: Data Modeler Duration: contract to run until 31/12/2025 Location: London, Hybrid 2-3 days per week onsite Rate: up to £644 p/d Umbrella inside IR35 Role purpose/summary We are seeking …
the Design and Evolution of Data Models: Architecting and overseeing the development of sophisticated, high-performance data models within our cloud data warehouse (BigQuery, dbt), ensuring they are scalable, maintainable, and effectively serve the complex analytical and operational needs of the entire organisation. You will be instrumental in defining and … performant data models in enterprise-grade cloud environments (e.g., BigQuery, Snowflake, Redshift), consistently optimising for scale and cost-efficiency. Demonstrate mastery of SQL and dbt, exhibiting expertise in advanced SQL techniques and extensive experience in developing, optimising, and troubleshooting data transformations within dbt, including more advanced features such as macros … materialisation configurations. Have strong proficiency in ELT processes and data orchestration, evidenced by a thorough understanding of methodologies and significant hands-on experience using dbt, complemented by practical experience with orchestration tools such as Dagster or Airflow. Have a proven track record of successful semantic layer implementation, showcasing experience in …
Modelling & Engineering: Gather requirements from stakeholders and design scalable data solutions Build and maintain robust data models and exposures in the data warehouse using dbt, Snowflake, and Looker Document architectural decisions, modelling challenges, and outcomes in a clear and structured way Wrangle and integrate data from multiple third-party sources … You will have proven experience in designing and maintaining data models for warehousing and business intelligence You will have advanced SQL skills; experience with dbt and/or Looker is strongly preferred You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow) You will …
london, south east england, united kingdom Hybrid / WFH Options
twentyAI
CD pipelines for Snowflake data workflows Automate AWS infrastructure with Terraform Deploy containers via Docker & Kubernetes (EKS) Ensure observability, performance & security Integrate tooling like dbt, Airflow, and support platform evolution Requirements 3+ years in DevOps/cloud/SRE roles Expert in AWS (EC2, IAM, Lambda, etc.) Strong in Terraform … in Python, Bash, or similar Strong team collaboration & Agile mindset Nice to Have Background in commodity trading, finance, or energy Experience with data governance, dbt, or Airflow
team in the fashion & retail sector. This role is focused on hands-on development and ownership of key data architecture initiatives, especially around dbt and Airflow. What we're looking for: 2-3+ years of solid, hands-on experience with dbt (must-have) 4+ years of experience …
maintain ETL/ELT pipelines to transform raw data into actionable insights Build and optimize data models for scalability and performance in tools like dbt Collaborate with analysts and product teams to deliver reliable datasets for reporting and analysis Monitor and improve data quality using validation frameworks. Proactively identify data … alignment across teams Experience Experience in a similar analytics engineering role Strong SQL skills and experience with data warehouses (e.g., Redshift, Snowflake, BigQuery) Proficiency with dbt or similar data transformation tools Experience with BI tools (e.g., Tableau, Looker) for creating reports and dashboards Knowledge of data governance and best practices for …
strategically, mentor others, and bring experience in implementing and scaling modern data workflows. Projects you might lead or shape include: Expanding and evolving our dbt data models to accelerate the work of the Analytics Team by creating scalable, well-documented data transformations and core data assets. Designing and overseeing a … audiences. Deep curiosity and knowledge of the modern data stack and industry trends. Fluent English (written and spoken) Nice to have Extensive experience with dbt in production environments. Proficiency in Python and version control systems like Git. Familiarity with Jinja templating and macros. Experience with BI tools like Looker or …
London, England, United Kingdom Hybrid / WFH Options
Harnham
for ingesting, transforming, and delivering data to critical internal systems. This includes designing scalable AWS-based pipelines, integrating external APIs, and orchestrating transformations using dbt and Airflow. You'll also support the transition of R-based data streams into more maintainable Python workflows. Your responsibilities will include: Building and maintaining … third-party APIs into the data platform and transforming data for CRM delivery. Migrating R-based data streams into modern Airflow-managed Python/dbt pipelines. Ensuring observability and reliability using CloudWatch and automated monitoring. Supporting both BAU and new feature development within the data engineering function. KEY SKILLS AND …
CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering teams and support architectural decisions as a recognised Databricks … Catalog. Excellent communication, leadership, and problem-solving skills. Desirable: Databricks certifications (e.g., Data Engineer Associate/Professional or Solutions Architect). Familiarity with MLflow, dbt, and BI tools such as Power BI or Tableau. Exposure to MLOps practices and deploying ML models within Databricks. Experience working within Agile and DevOps …
Job Title: Principal Software Engineer Location: San Francisco Bay Area Job Type: Open to remote for highly experienced candidates (6+ years minimum) Summary: My client is a biotech startup leveraging AI/ML technologies to develop precision cancer therapeutics. We …