London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
…and performance. Ensure compliance with data governance policies, including RBAC, encryption, and audit logging. Collaborate with data engineering teams to streamline provisioning and reduce overhead. Stakeholder Engagement: liaise with DBT’s platform, data, and finance teams to align technical decisions with business objectives; present cost-saving recommendations and technical roadmaps to senior stakeholders; document architecture decisions and maintain technical artefacts.
Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and … Git. Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionality. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms …
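As context for the dbt responsibilities above: a dbt model is essentially a version-controlled SELECT statement that dbt compiles and materialises in the warehouse. A minimal sketch; all file, source, table, and column names here are illustrative, not taken from the advert:

```sql
-- models/staging/stg_orders.sql (illustrative file name)
-- dbt compiles source() to the fully qualified warehouse table,
-- materialises the model as a view or table, and records the dependency.
select
    order_id,
    customer_id,
    cast(order_ts as date) as order_date,
    amount
from {{ source('shop', 'raw_orders') }}
where order_id is not null
```

Tests and documentation for such a model are declared alongside it in YAML (for example, `unique` and `not_null` tests on `order_id`), which is what the qualifications mean by dbt testing and documentation.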
London (City of London), South East England, United Kingdom
HCLTech
Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and … Git. Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionality. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms …
Horsforth, Leeds, West Yorkshire, England, United Kingdom
TPP (The Phoenix Partnership)
product and service becomes and remains fully compliant with current standards, policies and regulations • Utilising your broad knowledge of health and social care processes to provide support for the DBT approach to new software production. Applicants should be clinically qualified and able to show a breadth of knowledge in relation to the role, as well as a demonstrable interest in clinical safety …
role where you'll be at the heart of data-driven decisions. Day to day you will be transforming raw data into clean, reliable and performant data models using dbt within Google Cloud Platform (GCP). Your primary goal is to empower our clients' business users by making data accessible, understandable and ready for reporting, dashboarding and ad-hoc analysis. We … and business intelligence. Experience in the mobile/telecoms industry would be a bonus! Key outputs for the role:
• Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies.
• Translate business requirements from stakeholders into robust, well-documented and tested dbt models.
• Develop and own workflows within Google … quality, optimised SQL for data transformation and analysis.
• Develop and maintain scalable data pipelines within the Google Cloud Platform, ensuring efficient and cost-effective data transformations.
• Take ownership of dbt project health, monitoring daily runs and proactively resolving any data, model, or scheduling issues in collaboration with other project owners.
• Use version control (Git) and CI/CD to manage …
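To illustrate the Kimball and One Big Table (OBT) modelling mentioned above: an OBT denormalises a fact table by joining in its dimensions to produce one wide, BI-ready table. A minimal sketch using the standard library's SQLite in place of BigQuery; all table and column names are invented for the example:

```python
# Illustrative Kimball-style star schema (one fact, two dimensions)
# collapsed into a One Big Table via joins. In dbt this SELECT would
# live in a model file with ref() calls instead of literal table names.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fct_orders   (order_id INTEGER, customer_id INTEGER,
                           product_id INTEGER, amount REAL);
INSERT INTO dim_customer VALUES (1, 'UK'), (2, 'HR');
INSERT INTO dim_product  VALUES (10, 'mobile'), (11, 'broadband');
INSERT INTO fct_orders   VALUES (100, 1, 10, 25.0), (101, 2, 11, 40.0);
""")

# The OBT: one wide, denormalised result ready for reporting tools.
rows = cur.execute("""
SELECT o.order_id, c.region, p.category, o.amount
FROM fct_orders o
JOIN dim_customer c USING (customer_id)
JOIN dim_product  p USING (product_id)
ORDER BY o.order_id
""").fetchall()
```

The trade-off is the usual one: the star schema stays normalised and reusable, while the OBT duplicates dimension values in exchange for simpler downstream queries.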
maintain data pipelines, transform raw datasets into usable insights, and make data accessible for analytics and decision-making. What you'll do: Build and optimize data pipelines with tools like dbt, Snowflake, and Airflow. Develop scalable data models for self-service BI (Tableau, Power BI). Collaborate with cross-functional teams to deliver data solutions. Apply best practices in software development and … DataOps. Ensure compliance with data security and governance standards. What we're looking for: Strong SQL skills with experience across multiple sources. Hands-on experience with dbt, Snowflake, and Airflow. Knowledge of BI tools and data modelling best practices. Solid Excel skills. Collaborative, proactive, and eager to work with evolving tech. Nice to have: Cloud platform experience (AWS, Azure, GCP). Familiarity …
will build, and lead the development of, scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena, AWS Lake Formation, data modelling, DBT, Airflow and Docker, and will be responsible for driving best practices in data engineering, governance, and DevOps. Key Responsibilities: • Lead the design and implementation of scalable, secure, and high-performance data … related field. • 10+ years of experience in data engineering. • Strong hands-on experience with AWS services: S3, Glue, Lake Formation, Athena, Redshift, Lambda, IAM, CloudWatch. • Proficiency in PySpark, Python, DBT, Airflow, Docker and SQL. • Deep understanding of data modeling techniques and best practices. • Experience with CI/CD tools and version control systems like Git. • Familiarity with data governance, security …
London (City of London), South East England, United Kingdom
HCLTech
will build, and lead the development of, scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena, AWS Lake Formation, data modelling, DBT, Airflow and Docker, and will be responsible for driving best practices in data engineering, governance, and DevOps. Key Responsibilities: • Lead the design and implementation of scalable, secure, and high-performance data … related field. • 10+ years of experience in data engineering. • Strong hands-on experience with AWS services: S3, Glue, Lake Formation, Athena, Redshift, Lambda, IAM, CloudWatch. • Proficiency in PySpark, Python, DBT, Airflow, Docker and SQL. • Deep understanding of data modeling techniques and best practices. • Experience with CI/CD tools and version control systems like Git. • Familiarity with data governance, security …
Manchester, North West, United Kingdom Hybrid / WFH Options
Birchwell Associates Ltd
DataOps culture of automation, reliability, and agility. Key Responsibilities: Design, build, and optimise data pipelines across a modern data platform. Ingest, clean, and transform data using tools such as dbt, Snowflake, and Airflow. Collaborate with cross-functional teams to deliver data products aligned to business priorities. Develop scalable data models that support BI and analytics platforms including Tableau and … including ISO 27001, BS 10012, ISO 50001, and ISO 22301. Skills & Experience: Strong SQL expertise, with the ability to write and optimise complex queries. Hands-on experience with dbt (including testing and layered modelling). Practical knowledge of Snowflake for loading, transforming, and exporting datasets. Experience building and managing Airflow DAGs for pipeline orchestration. Understanding of BI tool requirements …
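On the Airflow DAGs mentioned above: the core guarantee of a DAG-based orchestrator is that each task runs only after all of its upstream dependencies have completed. The standard library's `graphlib` can sketch that ordering guarantee; the task names are invented for illustration:

```python
# Toy model of DAG scheduling: map each task to the tasks it depends on,
# then ask for an execution order that respects every dependency.
from graphlib import TopologicalSorter

dag = {
    "ingest":    set(),                      # no upstream tasks
    "transform": {"ingest"},                 # runs after ingest
    "dbt_test":  {"transform"},              # runs after transform
    "publish":   {"transform", "dbt_test"},  # runs last
}

order = list(TopologicalSorter(dag).static_order())
```

Airflow layers scheduling, retries, backfills, and operators on top of this ordering guarantee, but the dependency graph is the same idea.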
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
Role: Principal Data Engineer (dbt lean)
Salary: £70,000 - £90,000 (dependent on experience)
Contracting and Permanent hires considered
Location: ALL locations considered; we have offices in Liverpool and Newbury (UK) for hybrid working, and Sibenik (Croatia)
Let us introduce ourselves: we're Intuita, a new kind of data partner. We're a collective of data-driven people ready … developing scalable cloud solutions in Azure, GCP or AWS. Data modelling using Kimball, 3NF or Dimensional methodologies. Analytics Engineering lean, with experience within BigQuery (GCP); data modelling in dbt and mobile/telecoms industry experience would be beneficial, with deep dbt experience being highly beneficial. Depth of knowledge and understanding across core orchestration tools and CI/CD pipelines …
standards and best practices. Key responsibilities include: Build and maintain ELT pipelines. Take full ownership of data ingestion. Support data modelling and architecture within Snowflake. Own and evolve the dbt layer, including governance and access controls. Collaborate across analytics, product, and engineering teams. Contribute to platform improvements, automation, and optimisation. YOUR SKILLS AND EXPERIENCE: A successful Senior Data Engineer will … bring: Strong SQL skills. Experience with dbt in a production environment. Snowflake experience is desirable. Exposure to AWS. Confident mentoring peers and contributing to a collaborative, high-impact team. Experience working in fast-paced, agile environments with modern data workflows. THE BENEFITS: You will receive a salary of up to £55,000 depending on experience, along with a comprehensive benefits …
and optimise existing data pipelines, reducing manual effort and improving reliability. Lead on structuring data to align with performance, delivery, and leadership metrics. Work with cloud-native tooling (AWS, dbt, Airflow) to build scalable data infrastructure. Collaborate with digital, performance, and portfolio teams to drive cross-functional delivery. Champion data quality, completeness, and usability at every stage of the data … Copilot and ChatGPT to enhance delivery. Skills & Experience - Senior Data Engineer: Strong hands-on experience building and maintaining automated data pipelines. Advanced proficiency in Python and SQL. Experience with dbt and strong understanding of data modelling (e.g., star schema). Proficient with orchestration tools such as Airflow. Comfortable working with AWS services (Glue, S3, etc.) or similar cloud platforms. Experience with …
Slough, South East England, United Kingdom Hybrid / WFH Options
Medialab Group
TO THE TEAM You bring a deep understanding of how to transform raw data into clean, reliable and analysis-ready datasets. You’re fluent in SQL and experienced with DBT, with a strong appreciation for scalable, well-structured models that encode business logic clearly and consistently. You think analytically and work methodically, ensuring your transformations enable reporting, analytics and activation … use cases with confidence. YOUR NEXT CHALLENGE As an Analytics Engineer at Medialab, you will: Design, build and maintain data models within our DBT codebase, transforming raw data into clean, tested, and documented datasets ready for analytics and activation. Onboard new data feeds, working closely with business and technical stakeholders to understand requirements and ensure smooth integration into our pipelines. … Support users and team members, sharing knowledge and solving problems effectively. Must Have Skills: SQL – proficient; confident in writing performant, well-structured SQL with full understanding of ANSI standards. DBT – strong working knowledge; experienced in building, testing, and documenting models as the central tool for transformation and semantic modelling. Python – some experience; able to use and adapt scripts and tooling …
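For readers unfamiliar with the dbt testing mentioned above: the two most common schema tests, `not_null` and `unique`, amount to simple row-level checks. A plain-Python sketch of what each test asserts; the data and column names are invented:

```python
# Plain-Python equivalents of dbt's not_null and unique schema tests.

def not_null(rows, column):
    """Pass when every row has a non-null value in `column`."""
    return all(row.get(column) is not None for row in rows)

def unique(rows, column):
    """Pass when no value in `column` appears more than once."""
    values = [row[column] for row in rows]
    return len(values) == len(set(values))

rows = [
    {"order_id": 1, "channel": "tv"},
    {"order_id": 2, "channel": "search"},
]
```

In dbt itself these checks are declared in YAML next to the model and run as generated SQL against the warehouse, failing the build when a row violates them.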
London, South East England, United Kingdom Hybrid / WFH Options
Medialab Group
TO THE TEAM You bring a deep understanding of how to transform raw data into clean, reliable and analysis-ready datasets. You’re fluent in SQL and experienced with DBT, with a strong appreciation for scalable, well-structured models that encode business logic clearly and consistently. You think analytically and work methodically, ensuring your transformations enable reporting, analytics and activation … use cases with confidence. YOUR NEXT CHALLENGE As an Analytics Engineer at Medialab, you will: Design, build and maintain data models within our DBT codebase, transforming raw data into clean, tested, and documented datasets ready for analytics and activation. Onboard new data feeds, working closely with business and technical stakeholders to understand requirements and ensure smooth integration into our pipelines. … Support users and team members, sharing knowledge and solving problems effectively. Must Have Skills: SQL – proficient; confident in writing performant, well-structured SQL with full understanding of ANSI standards. DBT – strong working knowledge; experienced in building, testing, and documenting models as the central tool for transformation and semantic modelling. Python – some experience; able to use and adapt scripts and tooling …
London (City of London), South East England, United Kingdom Hybrid / WFH Options
Medialab Group
TO THE TEAM You bring a deep understanding of how to transform raw data into clean, reliable and analysis-ready datasets. You’re fluent in SQL and experienced with DBT, with a strong appreciation for scalable, well-structured models that encode business logic clearly and consistently. You think analytically and work methodically, ensuring your transformations enable reporting, analytics and activation … use cases with confidence. YOUR NEXT CHALLENGE As an Analytics Engineer at Medialab, you will: Design, build and maintain data models within our DBT codebase, transforming raw data into clean, tested, and documented datasets ready for analytics and activation. Onboard new data feeds, working closely with business and technical stakeholders to understand requirements and ensure smooth integration into our pipelines. … Support users and team members, sharing knowledge and solving problems effectively. Must Have Skills: SQL – proficient; confident in writing performant, well-structured SQL with full understanding of ANSI standards. DBT – strong working knowledge; experienced in building, testing, and documenting models as the central tool for transformation and semantic modelling. Python – some experience; able to use and adapt scripts and tooling …
Define and uphold data quality, security, and governance standards. Collaborate with teams to establish KPIs and core business metrics. Innovation & Future Tools (10%): Explore and implement new tools (e.g. dbt, Fivetran, Airflow) to enhance data capabilities. Stay current with evolving trends in data engineering and BI. What They’re Looking For: Technical Experience: 7+ years’ experience across data engineering, analytics … similar). Strong grasp of data governance, quality assurance, and security best practices. Bonus: Experience with Microsoft Fabric, cloud data warehouses (Azure, Snowflake, BigQuery, Redshift), or orchestration tools like dbt or Airflow. Collaboration & Communication: Ability to communicate technical insights clearly to non-technical audiences. Experience partnering with leadership and cross-functional teams. Mindset & Growth: A proactive self-starter who thrives …
London (City of London), South East England, United Kingdom
TGS International Group
Define and uphold data quality, security, and governance standards. Collaborate with teams to establish KPIs and core business metrics. Innovation & Future Tools (10%): Explore and implement new tools (e.g. dbt, Fivetran, Airflow) to enhance data capabilities. Stay current with evolving trends in data engineering and BI. What They’re Looking For: Technical Experience: 7+ years’ experience across data engineering, analytics … similar). Strong grasp of data governance, quality assurance, and security best practices. Bonus: Experience with Microsoft Fabric, cloud data warehouses (Azure, Snowflake, BigQuery, Redshift), or orchestration tools like dbt or Airflow. Collaboration & Communication: Ability to communicate technical insights clearly to non-technical audiences. Experience partnering with leadership and cross-functional teams. Mindset & Growth: A proactive self-starter who thrives …
Manchester Area, United Kingdom Hybrid / WFH Options
Harnham
Data Analyst Location: Remote-first (monthly visit to Nottingham HQ – travel expensed) Salary: £30,000–£45,000 (DOE) The Company A fast-growing UK consumer lender recently acquired by a global fintech, focused on transparent, short-term credit that helps …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Data Analyst Location: Remote-first (monthly visit to Nottingham HQ - travel expensed) Salary: £30,000-£45,000 (DOE) The Company A fast-growing UK consumer lender recently acquired by a global fintech, focused on transparent, short-term credit that helps …
City of London, London, Coleman Street, United Kingdom
Deerfoot Recruitment Solutions Limited
Risk Reporting Data Engineering Lead Central London/Hybrid Financial Risk Data/Data Analytics/International Banking Base salary: c. £135k + bonus + comprehensive benefits. As a tech recruitment partner for this international bank, we're assisting in …
Employment Type: Permanent
Salary: £135000/annum bonus + good benefits package
Huddersfield, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Oscar Associates (UK) Limited
client base. Your responsibilities will cover: Develop data models to support company Business Intelligence. Write and optimize complex SQL queries. Build, maintain and improve data pipelines. Transform data using DBT, Snowflake and Airflow. Ensure data is handled correctly and to relevant standards. Collaborate with tech teams to collectively solve shared data challenges. Key Skills: SQL, DBT, Snowflake, Airflow (DAGs), Cloud …
Northampton, Northamptonshire, United Kingdom Hybrid / WFH Options
Experis
seeking a highly skilled and communicative Technical Data Engineer to join our team. The ideal candidate will have hands-on experience with modern data platforms and tools including Databricks, DBT, and Snowflake. You will play a key role in designing, developing, and optimizing data pipelines and analytics solutions that drive business insights and decision-making. Key Responsibilities: Design, build, and … data-related issues in a timely manner. Document processes, workflows, and technical specifications clearly and effectively. Required Skills & Experience: Proven hands-on experience with: Databricks (Spark, Delta Lake, notebooks); DBT (data modeling, transformations, testing); Snowflake (SQL, performance tuning, data warehousing). Strong understanding of data engineering principles and best practices. Excellent communication skills with the ability to explain technical concepts to …