recent experience working with Suppliers, AR/AP invoices, and General Ledger. Working knowledge of SQL and the ability to write your own code essential; experience working with Snowflake, dbt and GitHub strongly preferred. Experience working with US data and requirements strongly preferred. If you can demonstrate the above knowledge from recent assignments and are soon available for an outside …
Leicester, Leicestershire, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
exciting time to be part of a high-impact data transformation. The Role: Support the migration from Tableau to Lightdash (similar to Looker). Use Lightdash/Looker, LookML, and dbt for data modelling and reporting. Drive self-service analytics across the organisation. Deliver actionable insights through engaging dashboards. Collaborate closely with product, analytics, and BI teams. About You: 5+ years …
Horsforth, Leeds, West Yorkshire, England, United Kingdom
TPP (The Phoenix Partnership)
product and service becomes and remains fully compliant with current standards, policies and regulations. Utilising your broad knowledge of health and social care processes to provide support for the DBT approach to new software production. Applicants should be clinically qualified and able to show a breadth of knowledge in relation to the role, as well as a demonstrable interest in clinical safety …
pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience with workflow tools like Airflow or Kubeflow. Familiarity with dbt (Data Build Tool). Please send your CV for more information on these roles. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive …
London (City of London), South East England, United Kingdom
Capgemini
unlock the value of technology and build a more sustainable, more inclusive world. Your Role: Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The successful candidate will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics … and data science initiatives. Key Responsibilities: Design and implement scalable data models and transformation pipelines using DBT on Snowflake. Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and … Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. Maintain and enhance CI/CD pipelines for DBT projects. Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with DBT (modular SQL development, testing, documentation). Proficiency in Snowflake (data …
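The dbt "testing" this ad asks for refers to declarative data tests (most commonly not_null and unique) that dbt compiles into SQL and runs against the warehouse. As a hedged illustration only, with function and column names of our own invention rather than dbt's API, those two checks reduce to:

```python
# Plain-Python sketch of dbt's two most common schema tests.
# In dbt these are declared in YAML and executed as SQL in the
# warehouse; this stand-alone version is illustrative only.

def not_null(rows, column):
    """Return rows where `column` is NULL; dbt's not_null test fails if any exist."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values of `column` appearing more than once; dbt's unique test fails if any exist."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": 11},
]

print(not_null(orders, "customer_id"))  # one failing row
print(unique(orders, "order_id"))       # [2]
```

A model passes only when both lists come back empty, which is exactly the pass/fail contract dbt reports per test.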
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
data engineering capabilities. Looking at our current pipeline of work, we can also consider those with an Analytics Engineering lean; experience with BigQuery (GCP), data modelling in dbt, and the mobile/telecoms industry would be beneficial. About You: As much as we love working with great, fun people, there are some required skills and experience we are … Storage, Medallion Architecture, and data formats such as JSON, CSV, and Parquet. Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. Exposure to Apache Airflow and dbt is a bonus. Familiarity with agile principles and practices. Experience with Azure DevOps pipelines. The "Nice to Haves": Certification in Azure or related technologies. Experience with other cloud platforms (e.g. …
London, South East, England, United Kingdom Hybrid / WFH Options
CV TECHNICAL LTD
Lead Data Engineer London, UK (Hybrid) Innovative FinTech About the Company: We are partnered with a high-growth FinTech company headquartered in London that is redefining how technology and data can transform financial services. Their mission is to deliver smarter More ❯
an initial 6 months, based out of their London office. You will play a key role in designing and managing our Snowflake data warehouse and leveraging dbt (Data Build Tool) to transform raw data into reliable, analysis-ready datasets that support regulatory compliance, operational efficiency, and innovation. Responsibilities: Design, develop, and maintain scalable data pipelines to support manufacturing, quality … and supply chain data workflows. Implement and manage data transformation models using dbt to standardise and validate datasets. Optimise and monitor performance of Snowflake data warehouse environments. Collaborate with cross-functional teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions. Develop and maintain ETL/ELT workflows using modern orchestration and integration tools (e.g. … years of experience as a Data Engineer, ideally in a pharmaceutical, biotech, or regulated manufacturing environment. Strong hands-on experience with: Snowflake (architecture, performance tuning, cost optimisation); dbt (model development, testing, documentation, deployment); SQL (advanced query optimisation and debugging). Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar). Familiarity with cloud data platforms (AWS, Azure …
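To make "standardise and validate datasets" concrete: a dbt staging model typically renames columns, casts types, and rejects malformed rows before anything downstream touches the data. A minimal plain-Python sketch of that staging pattern (the batch/QC field names are invented for illustration, not taken from any real schema):

```python
from datetime import date

# Sketch of a staging-layer transform: rename, cast, validate.
# In dbt this would be a SQL model over a raw source table;
# the BATCH_ID / QC_RESULT fields here are illustrative only.
RAW = [
    {"BATCH_ID": "B001", "QC_RESULT": "pass", "TEST_DATE": "2024-03-01"},
    {"BATCH_ID": "B002", "QC_RESULT": "fail", "TEST_DATE": "2024-03-02"},
    {"BATCH_ID": "",     "QC_RESULT": "pass", "TEST_DATE": "2024-03-03"},
]

def stg_qc_results(raw_rows):
    """Standardise column names/types and drop rows failing basic validation."""
    out = []
    for r in raw_rows:
        if not r["BATCH_ID"]:          # reject rows missing the business key
            continue
        out.append({
            "batch_id": r["BATCH_ID"],
            "passed": r["QC_RESULT"] == "pass",           # string -> boolean
            "test_date": date.fromisoformat(r["TEST_DATE"]),  # string -> date
        })
    return out

print(stg_qc_results(RAW))  # two valid, standardised rows survive
```

In a regulated environment the rejected rows would normally be routed to a quarantine table and audited rather than silently dropped, as they are in this sketch.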
a hands-on role which requires delivery expertise using Google Cloud Platform data technologies, particularly focussing on Cloud Storage, BigQuery, containerisation (GKE), and additionally Data Build Tool (DBT) for pipeline processing. Responsibilities: Design and develop scalable, reliable, efficient and cost-effective data pipelines and ELT processes on the GCP platform. GCP Professional Data Engineer certification essential. Optimise and …
London (City of London), South East England, United Kingdom
83data
more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role, suited to someone who's … automation, data processing, and ETL/ELT development. Snowflake: scalable data architecture, performance optimisation, and governance. SQL: expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool): modular data modelling, testing, documentation, and version control. Key Responsibilities: Design, build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain … Analyst to Data Engineer. Proven Fintech or Payments industry experience: strong understanding of the data challenges and regulatory context within the sector. Deep proficiency in Python, Snowflake, SQL, and dbt. Excellent communication and collaboration skills, with the ability to work effectively across data, product, and business teams. Solid grasp of modern data modelling techniques (star/snowflake schemas, data …
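The "star/snowflake schemas" the ad mentions split flat records into a fact table that references dimension tables by surrogate key. A hedged stdlib-Python sketch of that split, using invented payment-domain table and column names purely for illustration:

```python
# Split flat payment records into a tiny star schema: one fact
# table plus a customer dimension with surrogate keys. The names
# are illustrative, not a real model.
flat = [
    {"payment_id": 1, "customer": "Acme Ltd",  "country": "GB", "amount": 120.0},
    {"payment_id": 2, "customer": "Acme Ltd",  "country": "GB", "amount": 80.0},
    {"payment_id": 3, "customer": "Birch plc", "country": "US", "amount": 50.0},
]

def to_star(rows):
    """Return (dim_customer, fact_payment); each distinct customer gets one surrogate key."""
    dim, fact = {}, []
    for r in rows:
        key = (r["customer"], r["country"])
        if key not in dim:                       # new dimension member
            dim[key] = {"customer_key": len(dim) + 1,
                        "name": r["customer"], "country": r["country"]}
        fact.append({"payment_id": r["payment_id"],
                     "customer_key": dim[key]["customer_key"],  # FK to dimension
                     "amount": r["amount"]})
    return list(dim.values()), fact

dim_customer, fact_payment = to_star(flat)
print(len(dim_customer), len(fact_payment))  # 2 3
```

In dbt the same shape would be two models (a dimension and a fact) joined by the surrogate key; a snowflake schema simply normalises the dimension further, e.g. splitting country into its own table.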
Lead Data Engineer | Python, DBT, Airflow, Terraform, GCP/AWS | London (Hybrid 1-2 Days) | £120K-£140K. OB is partnered with an exciting Unicorn FinTech that is on a mission to transform lending with a central focus on data. As a Lead Engineer, you will set the technical direction for our data platform and shape how data enables every part … the bar across the function. Stay hands-on when needed: you build it, you run it. What you'll work on: Building robust ELT pipelines and data models using DBT and BigQuery. Scaling a cloud-native data platform with strong automation and monitoring. Improving data lineage, governance, reliability, and security. Enabling self-service capabilities for product and analytics teams. Shaping … fit if you: Have 7+ years in data or backend engineering, including 2+ years in a lead or technical decision-making role. Are fluent in the modern data stack: DBT, BigQuery, Airflow, Terraform, GCP or AWS. Bring strong software engineering skills: Python, SQL, CI/CD, DevOps mindset. Understand data warehousing, ETL/ELT, orchestration, and streaming pipelines. Thrive in …
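The common thread between the dbt and Airflow experience asked for here is DAG-based orchestration: both tools resolve model/task dependencies into a directed acyclic graph and run nodes in topological order. The scheduling idea can be sketched with the standard library alone (the model names below are made up in a dbt naming style):

```python
from graphlib import TopologicalSorter

# Dependency graph in the style of dbt's ref() or an Airflow DAG:
# each key may run only after everything in its value set has run.
deps = {
    "stg_payments": set(),
    "stg_customers": set(),
    "fct_payments": {"stg_payments", "stg_customers"},
    "mart_revenue": {"fct_payments"},
}

# static_order() yields a valid execution order: predecessors first.
order = list(TopologicalSorter(deps).static_order())
print(order)  # staging models first, the mart last
```

Real orchestrators add retries, backfills, and parallel execution of independent nodes (TopologicalSorter's incremental `get_ready()` API supports the latter), but topological ordering is the core contract both tools share.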
London (City of London), South East England, United Kingdom
Digital Waffle
shape how data flows through everything they build. What You'll Do: Build and own data pipelines that connect product, analytics, and operations. Design scalable architectures using tools like dbt, Airflow, and Snowflake/BigQuery. Work with engineers and product teams to make data easily accessible and actionable. Help evolve their data warehouse and ensure high data quality and reliability … and ML-ready datasets. Influence technical decisions across the stack (they love new ideas). What You Bring: Hands-on experience in Python and SQL. Experience with modern data tools (dbt, Airflow, Prefect, Dagster, etc.). Knowledge of cloud platforms like AWS, GCP, or Azure. An understanding of data modelling and ETL best practices. Curiosity, creativity, and a mindset that thrives in …