London, South East England, United Kingdom Hybrid / WFH Options
Trust In SODA
and energy. What’s on offer: Remote-first setup (just one day a month in London or at a client site); high-impact projects using Snowflake, dbt, Airflow, Terraform, and AWS; small, agile teams led by Principal Architects; a flat structure, collaborative environment, and real influence on technical direction; perks include annual data … conference trips, a bonus, and private medical. Your profile: deep hands-on experience with Snowflake, SQL, and dbt; a strong understanding of data orchestration tools like Airflow; a cloud-first mindset (AWS preferred, Azure also welcome); comfort with IaC (Terraform), containers (Docker/K8s), and CI/CD; a strong communicator with a consultancy …
of Data Analytics, Software Development, and Data Engineering. You'll play a pivotal role in designing complex data pipelines, managing data warehouses using tools like dbt and Airflow, and enabling predictive analytics solutions for our diverse group of clients. As part of a consultancy, you'll have the opportunity to work … Work in a dynamic consultancy environment where creativity, autonomy, and teamwork are valued. Cutting-Edge Tools: Gain hands-on experience with leading tools like dbt, Airflow, Snowflake, and cloud platforms. Key Responsibilities: Design and manage data warehouses using SQL, NoSQL, and cloud platforms. Develop ETL/ELT pipelines using Airflow …
Job Title: Contract Analytics Engineer (dbt & Snowflake Specialist) Location: Hybrid (at least 1 day per week in London) Contract Length: 6 Months Rate: Up to £575 per day (Outside IR35) We are seeking an experienced Analytics Engineer with deep expertise in dbt (Data Build Tool) Core and Snowflake to join a fast-moving data team on a 6-month contract. This is an exciting opportunity to work on … leveraging your skills in data modelling, data analysis, and Snowflake to drive actionable insights for a dynamic organisation. Key Responsibilities: Design, develop, and optimise dbt models to transform raw data into valuable analytics-ready datasets. Utilise your in-depth knowledge of dbt Core to upskill, train, and mentor the team.
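For context only: a dbt model is simply a SQL SELECT statement saved as a file, which dbt materialises as a table or view. The kind of raw-to-analytics-ready transformation described above can be sketched in plain Python (the `raw_orders` schema and field names here are hypothetical, not taken from the posting):

```python
from collections import defaultdict

def build_orders_summary(raw_orders):
    """Sketch of a dbt-style transformation: aggregate raw order rows
    into an analytics-ready per-customer summary (hypothetical schema)."""
    totals = defaultdict(lambda: {"order_count": 0, "revenue": 0.0})
    for row in raw_orders:
        agg = totals[row["customer_id"]]
        agg["order_count"] += 1
        agg["revenue"] += row["amount"]
    # Sort by customer id for a deterministic, dashboard-friendly output.
    return [{"customer_id": cid, **agg} for cid, agg in sorted(totals.items())]

raw = [
    {"customer_id": "c1", "amount": 10.0},
    {"customer_id": "c2", "amount": 5.0},
    {"customer_id": "c1", "amount": 7.5},
]
print(build_orders_summary(raw))
# [{'customer_id': 'c1', 'order_count': 2, 'revenue': 17.5},
#  {'customer_id': 'c2', 'order_count': 1, 'revenue': 5.0}]
```

In a real dbt project the same logic would live in a `.sql` model file and be run with `dbt run`; the Python version is only meant to make the shape of the transformation concrete.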
company Experience building or maintaining third-party or in-house data quality and cataloguing solutions; experience with documentation of system architecture; Pandas, Jupyter, Plotly; dbt, Kafka; BI tools such as Tableau, Metabase, and Superset. The current tech stack: Airflow, ClickHouse, dbt, Python, MongoDB, PostgreSQL, MariaDB, Kafka, K8s, AWS. FXC Intelligence …
the latest tools and trends, especially in AWS and open-source data tech. What you bring: Expert-level SQL and strong scripting (Python/dbt); deep knowledge of ETL/ELT processes, CI/CD, and data performance tuning; strong communication skills, with a knack for translating complex tech … mentoring experience. Nice to have: Hands-on experience with Snowflake, Airflow, AWS Glue, Spark, and S3; familiarity with open-source data libraries (e.g., Pandas, dbt); experience with modern data stacks and AWS cloud services. This is your chance to shape the future of our data ecosystem from the ground up.
Drive. What will I be doing? Design, build, and maintain scalable and reliable data pipelines. Manage Zeelo's serverless centralized data architecture (Fivetran, BigQuery, dbt, and other tools) that supports analytical functions across the business. Design, build, and maintain ETL, ELT, and other data pipelines to support analytics … degrees are a plus. 3+ years of data engineering experience in a commercial environment. Proficiency in SQL. Experience building SQL-based transformation flows in dbt or similar tools. Good understanding of cloud platforms such as GCP, AWS, or Azure. Experience configuring orchestration of SQL and Python via Airflow or similar …
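The orchestration work mentioned above (Airflow or similar) boils down to running tasks in dependency order. A minimal sketch of that idea using only the standard library, with hypothetical task names rather than anything from the posting:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical pipeline in the style of an Airflow DAG:
# each task maps to the set of tasks it depends on.
deps = {
    "extract": set(),
    "load": {"extract"},
    "transform": {"load"},             # e.g. a dbt run
    "dashboard_refresh": {"transform"},
}

# A valid execution order: every task appears after its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'load', 'transform', 'dashboard_refresh']
```

Airflow adds scheduling, retries, and operators on top, but the dependency-resolution core is exactly this topological ordering.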
the UK. Responsibilities Technical Leadership Design and implement scalable data architectures and pipelines Work across cloud platforms (Azure, GCP, AWS) and orchestration tools (e.g. dbt, Airflow) Build AI-ready data infrastructure for analytics and data science use cases Lead technical delivery across multiple client engagements Client Engagement & Strategy Act as … with Azure and Databricks (GCP/AWS also valued) Skilled in data governance, ingestion/transformation, and metadata management Familiarity with tools like Airflow, dbt, Power BI, Tableau Exposure to machine learning or advanced analytics techniques Strategic & Leadership Skills Ability to translate technical solutions into business value for senior stakeholders …
the Lead Data Engineer and other members of the Data Engineering team to deliver our new strategic enterprise data platform based on Snowflake and dbt, while also maintaining our legacy data platform. Key Responsibilities: Data warehouse design and implementation working towards the creation of a single source of truth. Development of data ingestion/transformation pipelines using Fivetran, dbt, and GitLab. Creation of management information dashboards. Work with business analysts and end-users to plan and implement feature enhancements and changes to existing systems, processes and data warehouses. Working with internal staff and third parties (suppliers and partners) to plan …
Legacy Pipeline Migration (Months 2-3) Analyze and understand existing R-based data pipelines created by data scientists. Migrate these pipelines into Airflow, dbt, and Terraform workflows. Modernize and scale legacy infrastructure running on AWS. Collaborate with engineering teams to ensure a smooth transition and system stability. Languages & Scripting: Python … interpreting existing scripts) Cloud & Infrastructure: AWS services including Lambda, API Gateway, S3, CloudWatch, Kinesis Firehose. Terraform for infrastructure as code. Orchestration & Transformation: Apache Airflow, dbt. CRM & Marketing Tools: Braze (preferred). Familiarity with other CRM/marketing automation tools such as Iterable or Salesforce Marketing Cloud is a plus. Candidate Profile …
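To make the R-to-Python migration concrete: a very common R pipeline step is a grouped aggregation such as `aggregate(value ~ group, data, mean)`. A hedged sketch of its plain-Python equivalent, using hypothetical column names not taken from the posting:

```python
from collections import defaultdict
from statistics import mean

def group_mean(rows, key, value):
    """Python equivalent of R's aggregate(value ~ key, data, mean):
    the mean of `value` per distinct `key` (hypothetical columns)."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row[value])
    return {k: mean(v) for k, v in groups.items()}

rows = [
    {"region": "uk", "value": 2.0},
    {"region": "uk", "value": 4.0},
    {"region": "us", "value": 10.0},
]
print(group_mean(rows, "region", "value"))  # {'uk': 3.0, 'us': 10.0}
```

In the workflow described above, logic like this would typically land as a dbt SQL model or an Airflow task rather than a standalone script; the snippet only illustrates the kind of translation involved.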
forward-thinking team where your growth is genuinely encouraged, this could be the perfect next step! Why This Role? ✅ Work with modern tech - Snowflake, dbt, Python, Azure ✅ Be part of a friendly, collaborative, remote-first team ✅ Get real investment in your training & development ✅ Take your career to the next level … native tech Shape reusable data engineering patterns Communicate clearly with both technical and non-technical stakeholders Scope and evolve solutions alongside clients Tech: Snowflake, dbt, Airflow, Azure Data Factory, Python, Spark, SQL, cloud (Azure, AWS, or GCP), Power BI, Tableau, Looker, and modern architecture concepts. Why Join? Remote-first, with a …
of scalable data pipelines on GCP and Snowflake. What You'll Do Design, develop, and optimize robust data pipelines using technologies such as dbt, Fivetran, and Airflow. Lead data integration, modeling, and governance initiatives in close collaboration with product, analytics, and data science teams. Act as a technical … leading data engineering projects in cloud environments, ideally GCP and Snowflake. Proficiency in Python and advanced SQL, with experience designing ELT architectures using dbt. Strong knowledge of data modeling, Snowpipe Streaming, warehouse optimization, and orchestration tools like Apache Airflow. Experience working in distributed, cross-functional teams with …
Data Engineer and other members of the Data Engineering team to design and deliver our new strategic enterprise data platform based on Snowflake and dbt, while also maintaining our legacy data platform. Key Responsibilities: Data warehouse design and implementation working towards the creation of a single source of truth. Contributing to the architectural design of the new Enterprise Data Platform. Contributing to development best practices. Development of data ingestion/transformation pipelines using Fivetran, dbt, and GitLab. Creation of management information dashboards. Work with business analysts and end-users to plan and implement feature enhancements and changes to existing systems …
Role Title: Data Modeler Duration: contract to run until 31/12/2025 Location: London, Hybrid 2-3 days per week onsite Rate: up to £644 p/d Umbrella inside IR35 Role purpose/summary We are seeking …
the Design and Evolution of Data Models: Architecting and overseeing the development of sophisticated, high-performance data models within our cloud data warehouse (BigQuery, dbt), ensuring they are scalable, maintainable, and effectively serve the complex analytical and operational needs of the entire organisation. You will be instrumental in defining and … performant data models in enterprise-grade cloud environments (e.g., BigQuery, Snowflake, Redshift), consistently optimising for scale and cost-efficiency. Demonstrate mastery of SQL and dbt, exhibiting expertise in advanced SQL techniques and extensive experience in developing, optimising, and troubleshooting data transformations within dbt, including more advanced features such as macros … materialisation configurations. Have strong proficiency in ELT processes and data orchestration, evidenced by a thorough understanding of methodologies and significant hands-on experience using dbt, complemented by practical experience with orchestration tools such as Dagster or Airflow. Have a proven track record of successful semantic layer implementation, showcasing experience in …
Modelling & Engineering: Gather requirements from stakeholders and design scalable data solutions. Build and maintain robust data models and exposures in the data warehouse using dbt, Snowflake, and Looker. Document architectural decisions, modelling challenges, and outcomes in a clear and structured way. Wrangle and integrate data from multiple third-party sources … You will have proven experience in designing and maintaining data models for warehousing and business intelligence. You will have advanced SQL skills; experience with dbt and/or Looker is strongly preferred. You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow). You will …