London (City of London), South East England, United Kingdom
Capgemini
unlock the value of technology and build a more sustainable, more inclusive world. Your Role: Capgemini Financial Services is seeking a Data Engineer with deep expertise in dbt (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. You will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics … and data science initiatives. Key Responsibilities: Design and implement scalable data models and transformation pipelines using dbt on Snowflake. Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and … Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. Maintain and enhance CI/CD pipelines for dbt projects. Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with dbt (modular SQL development, testing, documentation). Proficiency in Snowflake (data …
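The data-quality responsibilities above ("testing, documentation, and monitoring") usually take the form of column-level assertions. A minimal plain-Python sketch of the two checks dbt ships as built-in schema tests, `not_null` and `unique`; no dbt is required to run it, and the table and column names are hypothetical:

```python
def not_null(rows, column):
    """Return rows where `column` is missing; dbt's not_null test passes iff this is empty."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return duplicated values in `column`; dbt's unique test passes iff this is empty."""
    seen, dupes = set(), []
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.append(value)
        seen.add(value)
    return dupes

# Hypothetical staging data with one quality problem of each kind.
accounts = [
    {"account_id": 1, "region": "EMEA"},
    {"account_id": 2, "region": None},
    {"account_id": 2, "region": "APAC"},
]

assert not_null(accounts, "region") == [{"account_id": 2, "region": None}]
assert unique(accounts, "account_id") == [2]
```

In a real dbt project the same intent is declared in a model's YAML (`tests: [not_null, unique]`) and compiled to SQL against the warehouse; the sketch only shows the semantics.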
an initial 6 months, based out of their London office. You will play a key role in designing and managing our Snowflake data warehouse and leveraging dbt (Data Build Tool) to transform raw data into reliable, analysis-ready datasets that support regulatory compliance, operational efficiency, and innovation. Responsibilities: Design, develop, and maintain scalable data pipelines to support manufacturing, quality, and supply chain data workflows. Implement and manage data transformation models using dbt to standardise and validate datasets. Optimise and monitor performance of Snowflake data warehouse environments. Collaborate with cross-functional teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions. Develop and maintain ETL/ELT workflows using modern orchestration and integration tools (e.g. … years of experience as a Data Engineer, ideally in a pharmaceutical, biotech, or regulated manufacturing environment. Strong hands-on experience with: Snowflake (architecture, performance tuning, cost optimisation); dbt (model development, testing, documentation, deployment); SQL (advanced query optimisation and debugging). Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar). Familiarity with cloud data platforms (AWS, Azure …
engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role, suited to someone who … automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing, documentation, and version control. Key Responsibilities: Design, build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain … Analyst to Data Engineer. Proven Fintech or Payments industry experience - strong understanding of the data challenges and regulatory context within the sector. Deep proficiency in Python, Snowflake, SQL, and dbt. Excellent communication and collaboration skills, with the ability to work effectively across data, product, and business teams. Solid grasp of modern data modelling techniques (star/snowflake schemas, data …
London (City of London), South East England, United Kingdom
Kubrick Group
transformation, and consumption layers. Design and implement secure, performant Snowflake environments including RBAC, data masking, and policies/entitlement understanding. Build and optimise ELT pipelines (using tools such as dbt, Airflow, Fivetran, or native Snowflake tasks) to support batch and real-time use cases. Collaborate with Kubrick and client stakeholders to inform delivery planning, data strategy, and architecture decisions. Promote … cloud data warehouses (like BigQuery, Redshift, Synapse) including data modelling, performance tuning and cost management. Familiarity with SQL best practices, ELT patterns, and modern data transformation frameworks such as dbt. Competence in at least one programming language (Python preferred) for automation. Experience with cloud platforms (AWS, Azure, or GCP), including security, IAM, and storage services. Experience deploying and maintaining production …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
Server solutions and collaborate closely with senior stakeholders on complex, high-impact data initiatives. What You'll Be Doing: Design, develop and maintain data pipelines and models using Snowflake, dbt and Python. Enhance and support Power BI dashboards and data models, ensuring performance, accuracy and user accessibility. Contribute to the migration of legacy SQL Server (SSIS/SSAS/SSRS … Skills & Experience: 6+ years working with Snowflake. Expertise in data modelling, query optimisation and Snowflake-specific features (virtual warehouses, time travel, etc.). Hands-on experience with dbt (Data Build Tool) for data transformations. Proficient in SQL and Python for data manipulation and automation. Knowledge of Azure Data Factory or similar orchestration tools. Exposure to large-scale or regulated environments …
Central London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Senior Data Engineer - FinTech Unicorn - Python, SQL, DBT, Airflow, AWS/GCP. OB have partnered with a UK FinTech Unicorn whose Data function is undergoing rapid growth, and in turn they are looking to grow their Data team with 2 highly skilled Data Engineers. You'll be working on shaping the company's Data function, driving Data best practices, and collaborating with a variety of stakeholders to ensure the company is making data-driven decisions. Tech Stack: Python & SQL; DBT, Airflow & BigQuery; AWS/GCP; ETL Pipelines. Base salary of £90k-£115k depending on skills and experience. Excellent overall package including a sizeable bonus and stock options. Hybrid working … in Central London with 1-2 days a week required. You must be UK based, and sadly sponsorship is unavailable.
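Both dbt and Airflow, named in the stack above, work the same way at their core: they resolve a dependency graph (DAG) of models or tasks and run nodes only after everything they depend on has finished. A minimal sketch of that idea using only the Python standard library (`graphlib`, available since 3.9); the model names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical dbt-style dependency graph: staging models feed a fact
# model, which feeds a reporting layer. Each key maps to the set of
# models it depends on (the equivalent of dbt's ref() calls or an
# Airflow task's upstream list).
deps = {
    "stg_payments": set(),
    "stg_customers": set(),
    "fct_revenue": {"stg_payments", "stg_customers"},
    "reporting": {"fct_revenue"},
}

# static_order yields a valid execution order, or raises CycleError
# if the graph has a cycle (which both dbt and Airflow also reject).
order = list(TopologicalSorter(deps).static_order())

assert order.index("fct_revenue") > order.index("stg_payments")
assert order.index("fct_revenue") > order.index("stg_customers")
assert order[-1] == "reporting"
```

Real orchestrators add scheduling, retries, and parallelism on top, but the topological ordering shown here is the contract an engineer in this role designs against.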
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Data Engineer (Azure, Snowflake, DBT) | Insurance | London (Hybrid). Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced Data Engineer to join a major insurance client engagement. The role focuses on building out a Snowflake Data Warehouse established last year and scaling it to support multiple new data use cases across … umbrella company admin fees) Start Date: Immediate/End of June. Role Overview: You'll be working within a growing data engineering function, focused on scaling a Snowflake + DBT platform to support multiple analytical and operational use cases. The team is looking for an experienced engineer with strong technical depth and an insurance background, capable of owning and extending … the Azure and Snowflake stack. Key Skills & Experience: Strong hands-on experience with Snowflake Cloud Data Warehouse (schemas, RBAC, performance tuning, ELT best practices). Proven commercial experience with DBT for modular data modelling, testing, documentation, and CI/CD integration. Skilled in Azure Data Factory, Synapse, and Databricks for end-to-end data pipeline orchestration. Excellent SQL engineering capability …
London (City of London), South East England, United Kingdom
Northreach
in product, finance, and technology to translate complex business questions into accessible, trustworthy datasets and visualisations. What You'll Do: Build and optimise robust data models in SQL and dbt. Develop automated data pipelines and ensure strong data governance standards. Partner with analysts, product managers, and engineers to develop data solutions and dashboards. Create impactful reports and visualisations using tools … accuracy and accessibility. About You: 4+ years' experience in data or analytics engineering, ideally within fintech, SaaS, or digital platforms. Deep proficiency in SQL and hands-on experience with dbt. Experience working with modern data warehouses (Snowflake, BigQuery, or Redshift). Familiarity with software engineering principles (Git, CI/CD, testing frameworks). Strong understanding of data visualisation and BI tools. Confident …
City of London, England, United Kingdom Hybrid / WFH Options
iO Associates
customers. The right candidate will be well versed in Snowflake (including SnowPro certifications) as a Data Engineer, with strong SQL and Python skills, as well as tools such as dbt and Airflow. Additional skills in … The successful candidate should have the following skills: Extensive hands-on experience with the Snowflake data platform. Proficiency in SQL and ETL/ELT processes. Strong … programming skills in Python. Extended skills across AWS, Azure, dbt, Airflow, etc. Could this be of interest? If so, please get in touch with Alex at iO Associates. On this occasion, we cannot accept applications from candidates outside of the UK, nor from those without existing right to work in the UK.
North London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery. OB have partnered with a leading scale-up business in the AdTech space, where Data is at the forefront of everything they do, and they are currently hiring for a Lead Data Engineer to join their team and work on the development of their Data platform. This will be … hands-on experience working in Data-heavy environments, with Real-Time Data Pipelines, Distributed Streaming Pipelines and strong knowledge of Cloud Environments. Key Skills and Experience: Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery. Prior experience working as a Lead Engineer or Tech Lead. Pays £100k-£130k + bonus … week in Central London. To be considered, you must be UK based and unfortunately visa sponsorship is unavailable. 2-stage interview process!
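The real-time pipeline experience this role asks for typically centres on windowed aggregation: grouping a stream of timestamped events into fixed time buckets. A framework-free sketch of a tumbling window in plain Python, with hypothetical event data; production systems (Beam, Flink, or BigQuery streaming) add watermarks and late-data handling on top of the same idea:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_s):
    """Count (timestamp, key) events per fixed-size window.

    Each event lands in exactly one window starting at the largest
    multiple of window_s not greater than its timestamp - the core of
    real-time aggregation, shown here without any streaming framework.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_s) * window_s
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click-stream events as (timestamp_seconds, event_type).
events = [(0, "click"), (3, "click"), (5, "view"), (12, "click")]

assert tumbling_window_counts(events, 10) == {
    (0, "click"): 2,
    (0, "view"): 1,
    (10, "click"): 1,
}
```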
for senior stakeholders. Work closely with both technical and non-technical teams to gather requirements and translate them into compelling data stories. Cleanse, profile, and model data using SQL, DBT, and Snowflake. Contribute to projects involving cloud data infrastructure (AWS preferred, especially S3; experience with other platforms also valued). Support the evolution of data processes - potential to get involved … in broader SQL, DBT, and AWS financial data projects. What we're looking for: Proven expertise in Tableau (development and design), with a strong portfolio or Tableau Public profile. Advanced SQL skills; hands-on experience with Snowflake and DBT is a plus. Experience working with large datasets and performing data profiling and cleansing. Familiarity with AWS (especially S3; Lambda less …