London (City of London), South East England, United Kingdom
Xcede
Strong SQL skills with experience optimising queries Hands-on experience with Looker (or similar tools such as Tableau, Power BI, or QlikView) Experience in data modelling and familiarity with dbt Excellent communication skills in English Comfortable working autonomously in a fast-paced, international environment What’s on Offer: Competitive compensation, including equity Generous holiday allowance Health and wellbeing benefits Flexible …
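For illustration only (not part of the advert above): a minimal sketch of the kind of query optimisation such roles typically involve, rewriting a correlated subquery as a single pre-aggregated join. The customers and orders tables are hypothetical.

    -- Before: the subquery is evaluated once per customer row
    SELECT c.customer_id,
           (SELECT COUNT(*) FROM orders o WHERE o.customer_id = c.customer_id) AS order_count
    FROM customers c;

    -- After: aggregate once, then join, so each table is scanned a single time
    SELECT c.customer_id,
           COALESCE(o.order_count, 0) AS order_count
    FROM customers c
    LEFT JOIN (SELECT customer_id, COUNT(*) AS order_count
               FROM orders
               GROUP BY customer_id) o
           ON o.customer_id = c.customer_id;

Many optimisers can decorrelate the first form themselves, but the second makes the single-pass plan explicit and tends to behave more predictably on large tables.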
Leicester, Leicestershire, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
exciting time to be part of a high-impact data transformation. The Role: Support the migration from Tableau to Lightdash (similar to Looker) Use Lightdash/Looker, LookML, and DBT for data modelling and reporting Drive self-service analytics across the organisation Deliver actionable insights through engaging dashboards Collaborate closely with product, analytics, and BI teams About You: 5+ years …
Horsforth, Leeds, West Yorkshire, England, United Kingdom
TPP (The Phoenix Partnership)
product and service becomes and remains fully compliant with current standards, policies and regulations • Utilising your broad knowledge of health and social care processes to provide support for the DBT approach to new software production. Applicants should be clinically qualified and able to show a breadth of knowledge in relation to the role, as well as a demonstrable interest in clinical safety …
pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience with workflow tools like Airflow or Kubeflow. Familiarity with dbt (Data Build Tool). Please send your CV for more information on these roles. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive …
London (City of London), South East England, United Kingdom
Capgemini
unlock the value of technology and build a more sustainable, more inclusive world. Your Role Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The person will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics … and data science initiatives. Key Responsibilities: Design and implement scalable data models and transformation pipelines using DBT on Snowflake. Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and … Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. Maintain and enhance CI/CD pipelines for DBT projects. Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with DBT (modular SQL development, testing, documentation). Proficiency in Snowflake (data…
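Purely as a hedged sketch of what "modular SQL development" in dbt on Snowflake looks like (the model and column names below are hypothetical, not taken from the advert): a staging model cleans a raw source, and a downstream mart model builds on it via ref(), which is how dbt wires models into a dependency graph.

    -- models/staging/stg_payments.sql  (hypothetical raw source)
    SELECT
        payment_id,
        customer_id,
        CAST(amount AS NUMBER(18, 2)) AS amount,
        created_at
    FROM {{ source('raw', 'payments') }}

    -- models/marts/fct_daily_payments.sql  (depends on the staging model via ref())
    SELECT
        DATE_TRUNC('day', created_at) AS payment_date,
        customer_id,
        SUM(amount) AS total_amount,
        COUNT(*)    AS payment_count
    FROM {{ ref('stg_payments') }}
    GROUP BY 1, 2

Tests and documentation for such models would normally sit alongside them in a schema.yml file, which is typically where the "testing, documentation" part of a requirement like this is handled.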
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
data engineering capabilities. Looking at our current pipeline of work, we can also consider those with an Analytics Engineering lean; experience with BigQuery (GCP), data modelling in dbt, and mobile/telecoms industry experience would be beneficial. About You As much as we love working with great, fun people, there are some required skills and experience we are … Storage, Medallion Architecture, and data formats such as JSON, CSV, and Parquet. Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. Exposure to Apache Airflow and DBT is a bonus. Familiarity with agile principles and practices. Experience with Azure DevOps pipelines. The "Nice to Haves" Certification in Azure or related technologies. Experience with other cloud platforms (e.g. …
Senior Data Engineer (Python AWS SQL DBT) Cheshire to £90k Are you a data technologist looking for an opportunity to progress your career in a hands-on, impactful role with lots of ownership? You could be joining a Cybersecurity technology company and enjoying a huge range of perks and benefits from continual learning and self-development opportunities (including "buy any … services You have an in-depth knowledge of SQL and Python You have strong hands-on experience of building scalable data pipelines in cloud-based environments using tools such as DBT, AWS Glue, AWS Lake Formation, Apache Spark and Amazon Redshift You have a good knowledge of data modelling, ELT design patterns, data governance and security best practices You're collaborative … and prescriptions Flexible working hours 25 days holiday Charitable donations matching scheme and much more Apply now to find out more about this Senior Data Engineer (Python AWS SQL DBT) opportunity. At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. We're an equal opportunities employer whose people come …
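As an illustrative sketch only (names are hypothetical and not from the advert), one common ELT design pattern with dbt on a warehouse such as Redshift or Snowflake is an incremental model, which transforms only the rows that have arrived since the last run instead of rebuilding the whole table:

    -- models/fct_events.sql  (hypothetical incremental model)
    {{ config(materialized='incremental', unique_key='event_id') }}

    SELECT
        event_id,
        user_id,
        event_type,
        event_ts
    FROM {{ ref('stg_events') }}
    {% if is_incremental() %}
      -- On incremental runs, only pick up rows newer than what is already loaded
      WHERE event_ts > (SELECT MAX(event_ts) FROM {{ this }})
    {% endif %}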
London, South East, England, United Kingdom Hybrid / WFH Options
CV TECHNICAL LTD
Lead Data Engineer London, UK (Hybrid) Innovative FinTech About the Company: We are partnered with a high-growth FinTech company headquartered in London that is redefining how technology and data can transform financial services. Their mission is to deliver smarter …
an initial 6 months, based out of their London office. You will play a key role in designing and managing our Snowflake data warehouse and leveraging dbt (Data Build Tool) to transform raw data into reliable, analysis-ready datasets that support regulatory compliance, operational efficiency, and innovation. Responsibilities: Design, develop, and maintain scalable data pipelines to support manufacturing, quality … and supply chain data workflows. Implement and manage data transformation models using dbt to standardise and validate datasets. Optimise and monitor performance of Snowflake data warehouse environments. Collaborate with cross-functional teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions. Develop and maintain ETL/ELT workflows using modern orchestration and integration tools (e.g. … years of experience as a Data Engineer, ideally in a pharmaceutical, biotech, or regulated manufacturing environment. Strong hands-on experience with: Snowflake (architecture, performance tuning, cost optimisation) DBT (model development, testing, documentation, deployment) SQL (advanced query optimisation and debugging) Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar). Familiarity with cloud data platforms (AWS, Azure …
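For illustration only (the table and column names are hypothetical, not from the advert), Snowflake performance and cost tuning of the sort described often starts by clustering a large table on the column most queries filter by, so those queries prune micro-partitions instead of scanning the whole table:

    -- Cluster the hypothetical batch_records table on its most-filtered column
    ALTER TABLE manufacturing.batch_records CLUSTER BY (batch_date);

    -- Inspect how well the table is clustered on that key
    SELECT SYSTEM$CLUSTERING_INFORMATION('manufacturing.batch_records', '(batch_date)');

    -- Filters on the clustering key let Snowflake skip unrelated micro-partitions
    SELECT batch_id, qc_status
    FROM manufacturing.batch_records
    WHERE batch_date >= DATEADD(day, -7, CURRENT_DATE());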
technical solutions that support critical client outcomes. Analytics Engineering Leaders: Architect and optimise data pipelines, ELT workflows, and cloud warehouse platforms (Snowflake, BigQuery, Redshift). Lead teams working with dbt, SQL, Python, and Airflow to drive data transformation at scale. Ensure data governance, quality, and modelling standards are upheld across solutions. Work closely with data scientists and stakeholders to turn … analytics engineering. Have led engineering teams and mentored technical talent in high-performance environments. Are proficient in either modern software stacks (Python, React, cloud-native) or analytics tooling (SQL, dbt, Airflow, cloud warehouses). Bring a strategic mindset, with the ability to connect technical execution to business value. Are committed to innovation, collaboration, and data-driven transformation. Meet eligibility requirements …
a hands-on role which requires delivery expertise using Google Cloud Platform data technologies, particularly focussing on Cloud Storage, BigQuery, containerisation (GKE) and, additionally, Data Build Tool (dbt) for pipeline processing. Responsibilities Design and develop scalable, reliable, efficient and cost-effective data pipelines and ELT processes on the GCP platform. GCP Professional Data Engineer certification essential. Optimise and …
London (City of London), South East England, United Kingdom
83data
engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role, suited to someone who … automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing, documentation, and version control. Key Responsibilities Design, build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain … Analyst to Data Engineer. Proven Fintech or Payments industry experience - strong understanding of the data challenges and regulatory context within the sector. Deep proficiency in Python, Snowflake, SQL, and dbt. Excellent communication and collaboration skills, with the ability to work effectively across data, product, and business teams. Solid grasp of modern data modelling techniques (star/snowflake schemas, data…
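As a hedged illustration of the star-schema modelling mentioned above (all table and column names are hypothetical), a typical query joins a central fact table to its dimension tables on surrogate keys and aggregates the measures:

    -- Hypothetical star schema: payments fact joined to date and merchant dimensions
    SELECT
        d.calendar_month,
        m.merchant_name,
        SUM(f.amount)        AS total_amount,
        COUNT(f.payment_key) AS payment_count
    FROM fct_payments AS f
    JOIN dim_date     AS d ON f.date_key = d.date_key
    JOIN dim_merchant AS m ON f.merchant_key = m.merchant_key
    GROUP BY d.calendar_month, m.merchant_name;

In a dbt project, the fact and dimension tables themselves would usually be built as mart models, with keys and relationships covered by schema tests.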
City of London, London, United Kingdom Hybrid / WFH Options
Adecco
Cloud Data & Full Stack Engineer - SC Cloud, Data Location: City of London (Hybrid) Salary: £72,000 - £100,000 (depending on experience) + attractive benefits Flexibility: Happy to consider candidates who can commute into London from other UK locations Security: Essential …