role where you'll be at the heart of data-driven decisions. Day to day you will be transforming raw data into clean, reliable and performant data models using dbt within Google Cloud Platform (GCP). Your primary goal is to empower our clients' business users by making data accessible, understandable and ready for reporting, dashboarding and ad-hoc analysis. We … and business intelligence. Experience in the mobile/telecoms industry would be a bonus! Key outputs for the role: • Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies. • Translate business requirements from stakeholders into robust, well-documented and tested dbt models. • Develop and own workflows within Google … quality, optimised SQL for data transformation and analysis. • Develop and maintain scalable data pipelines within the Google Cloud Platform, ensuring efficient and cost-effective data transformations. • Take ownership of dbt project health, monitoring daily runs and proactively resolving any data, model, or scheduling issues in collaboration with other project owners. • Use version control (Git) and CI/CD to manage …
Senior Data Engineer - Dublin Atrium EMEA is looking for an accomplished Senior Data Engineer to support a client onsite in Dublin. You will build data pipelines with Databricks and DBT, design data models, and develop PostgreSQL stored procedures for data reconciliation, ensuring data accuracy, efficiency, and integrity across systems. Design and build data pipelines (Databricks/Spark/DBT) Orchestrate … Create Docker images for various applications and deploy them on Azure Design and build best-in-class processes to clean and standardize data. Design and build data tests using DBT tests Troubleshoot production issues in our Azure Environment Tuning and optimizing data processes Essential: Hands-on experience of DBT SQL models development Developing processes in Spark/Databricks Writing complex …
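The reconciliation work this role describes (stored procedures or DBT tests that verify data accuracy and integrity across systems) boils down to comparing a source and a target dataset on a key. A minimal, library-free sketch, with invented table and field names, might look like this:

```python
# Minimal sketch of a row-level reconciliation check between a source and a
# target dataset. In practice this logic would live in a PostgreSQL stored
# procedure or a dbt test; names here are illustrative only.

def reconcile(source, target, key="id"):
    """Compare two lists of row dicts and report discrepancies by key."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    return {
        # rows present in source but never loaded into target
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        # rows in target with no matching source row
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        # rows present in both but with differing values
        "mismatched": sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k]),
    }

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}, {"id": 3, "amount": 75}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]

report = reconcile(source, target)
print(report)
# {'missing_in_target': [3], 'unexpected_in_target': [], 'mismatched': [2]}
```

The same shape of check (set differences on keys, value comparison on the intersection) translates directly into SQL anti-joins when run inside the warehouse.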
will build, and lead the development of scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena, AWS LakeFormation, data modelling, DBT, Airflow, Docker and will be responsible for driving best practices in data engineering, governance, and DevOps. Key Responsibilities: • Lead the design and implementation of scalable, secure, and high-performance data … related field. • 10+ years of experience in data engineering. • Strong hands-on experience with AWS services: S3, Glue, Lake Formation, Athena, Redshift, Lambda, IAM, CloudWatch. • Proficiency in PySpark, Python, DBT, Airflow, Docker and SQL. • Deep understanding of data modeling techniques and best practices. • Experience with CI/CD tools and version control systems like Git. • Familiarity with data governance, security …
London (City of London), South East England, United Kingdom
HCLTech
Manchester, North West, United Kingdom Hybrid / WFH Options
Birchwell Associates Ltd
DataOps culture of automation, reliability, and agility. Key Responsibilities Design, build, and optimise data pipelines across a modern data platform. Ingest, clean, and transform data using tools such as dbt, Snowflake, and Airflow . Collaborate with cross-functional teams to deliver data products aligned to business priorities. Develop scalable data models that support BI and analytics platforms including Tableau and … including ISO 27001, BS 10012, ISO 50001, and ISO 22301 . Skills & Experience Strong SQL expertise, with the ability to write and optimise complex queries. Hands-on experience with dbt (including testing and layered modelling). Practical knowledge of Snowflake for loading, transforming, and exporting datasets. Experience building and managing Airflow DAGs for pipeline orchestration. Understanding of BI tool requirements …
standards and best practices. Key responsibilities include: Build and maintain ELT pipelines Take full ownership of data ingestion Support data modelling and architecture within Snowflake Own and evolve the dbt layer, including governance and access controls Collaborate across analytics, product, and engineering teams Contribute to platform improvements, automation, and optimisation YOUR SKILLS AND EXPERIENCE: A successful Senior Data Engineer will … bring: Strong SQL skills Experience with dbt in a production environment Snowflake experience is desirable Exposure to AWS Confident mentoring peers and contributing to a collaborative, high-impact team Experience working in fast-paced, agile environments with modern data workflows THE BENEFITS: You will receive a salary of up to £55,000 depending on experience, along with a comprehensive benefits …
and optimise existing data pipelines, reducing manual effort and improving reliability Lead on structuring data to align with performance, delivery, and leadership metrics Work with cloud-native tooling (AWS, dbt, Airflow) to build scalable data infrastructure Collaborate with digital, performance, and portfolio teams to drive cross-functional delivery Champion data quality, completeness, and usability at every stage of the data … Copilot and ChatGPT to enhance delivery Skills & Experience - Senior Data Engineer Strong hands-on experience building and maintaining automated data pipelines Advanced proficiency in Python and SQL Experience with dbt and strong understanding of data modelling (e.g., star schema) Proficient with orchestration tools such as Airflow Comfortable working with AWS services (Glue, S3, etc.) or similar cloud platforms Experience with …
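Several of these postings ask for star-schema modelling experience. The idea is a central fact table of events keyed to descriptive dimension tables, which BI tools then aggregate through simple joins. A toy in-memory illustration, with invented table and column names:

```python
import sqlite3

# Tiny illustration of a star schema: one fact table keyed to a dimension
# table, queried with the kind of join-and-aggregate a BI tool issues.
# Table and column names are invented for the example.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY, product_id INTEGER, qty INTEGER,
                              FOREIGN KEY (product_id) REFERENCES dim_product (product_id));
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales  VALUES (10, 1, 3), (11, 1, 2), (12, 2, 5);
""")

# Aggregate the fact table through the dimension: total quantity per product.
rows = conn.execute("""
    SELECT p.name, SUM(f.qty) AS total_qty
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 5), ('Widget', 5)]
```

In a warehouse like Snowflake or Redshift the same layout lets every report join narrow dimensions to one wide fact table, which is what makes the schema cheap to query at scale.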
assessment, product development, customer growth, and operational efficiency through data-driven insights, supporting our expansion efforts. Build & Scale: Own and evolve our modern data stack (centred on Snowflake and dbt), ensuring our analytics capabilities scale with Wayflyer's rapid growth and diversification. Lead & Mentor: Develop a high-performing team, fostering a culture of technical excellence, collaboration, and continuous learning within … prioritising initiatives that deliver maximum value to Wayflyer. Technical Oversight & Data Platform: Oversee the architecture, development, and maintenance of our analytics infrastructure, including our Snowflake Data Warehouse and our dbt-centric transformation layer. Ensure data quality, governance, scalability, and performance. Insight Generation & Delivery: Partner closely with stakeholders across Product, Risk, Sales, Marketing, and Operations to understand their needs, translate them …
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
/experience required: Strong experience designing, building and maintaining data pipelines using modern, cloud-based tools and practices Proficiency in Python and SQL for data engineering tasks Experience with DBT and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling) Familiarity with Airflow or similar orchestration tools Comfortable working with AWS services such as Glue and S3 … process, if there is a better way for us to communicate, please do let us know. Data, Database, Engineer, Lead, Manager, Data Science, Data Architect, Business Intelligence, Python, SQL, DBT, Data Model, Data Modelling, AWS, Security Check, Sc Level, Sc Cleared, Sc Clearance, Security Cleared, Security Clearance, Security Vetting Clearance, Active SC, SC Vetted, Cleared To A High Government Standard …
We are seeking a Senior Data Engineer to lead the design, implementation, and optimization of our modern data stack. You'll work with tools like Snowflake, dbt, Airflow, and Terraform to build scalable, reliable, and modular data systems. This role will have a strong focus on enabling analytics through clean data modelling, automation, and observability - empowering domain teams with trusted … containerized jobs on Docker and Kubernetes. Use Terraform to define infrastructure as code for consistent, version-controlled deployments. Pipeline Engineering & Automation Design robust ELT pipelines using Python, Spark, and dbt, processing structured and semi-structured data (e.g., JSON, Parquet). Automate ingestion and transformation layers while enforcing data contracts, quality rules, and schema validation. Build reusable, testable modules to reduce … development effort and improve standardization. Data Modelling & Analytics Enablement Own the semantic layer using dbt for transformation and modelling in Snowflake, including SCD and dimensional designs. Develop curated, documented, and test-covered data models that serve as trusted sources for analytics and reporting. Enable self-service analytics by partnering with analysts and product teams to understand data needs and deliver …
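The "SCD" in the posting above refers to slowly changing dimensions; the common Type 2 pattern (which dbt snapshots automate in warehouses such as Snowflake) closes out the current version of a record when an attribute changes and appends a new current version. A hand-rolled sketch with illustrative field names:

```python
from datetime import date

# Sketch of a Type 2 slowly changing dimension (SCD) update: on a changed
# attribute, the current record is closed out (is_current=False, valid_to set)
# and a new current record is appended. Field names are illustrative; in
# practice dbt snapshots generate this logic in the warehouse.

def scd2_apply(dim_rows, incoming, today):
    """Apply one incoming record to a Type 2 dimension (list of dicts)."""
    for row in dim_rows:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dim_rows              # no change: nothing to do
            row["is_current"] = False        # close out the old version
            row["valid_to"] = today
            break
    dim_rows.append({**incoming, "valid_from": today,
                     "valid_to": None, "is_current": True})
    return dim_rows

dim = [{"customer_id": 7, "city": "Leeds", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, {"customer_id": 7, "city": "Manchester"}, date(2024, 6, 1))

current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["city"])  # 2 Manchester
```

The full history stays queryable: filtering on `is_current` gives the present state, while the `valid_from`/`valid_to` range reconstructs the dimension as of any date.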
Define and uphold data quality, security, and governance standards. Collaborate with teams to establish KPIs and core business metrics. Innovation & Future Tools (10%) Explore and implement new tools (e.g. dbt, Fivetran, Airflow) to enhance data capabilities. Stay current with evolving trends in data engineering and BI. What They’re Looking For Technical Experience 7+ years’ experience across data engineering, analytics … similar). Strong grasp of data governance, quality assurance, and security best practices. Bonus: Experience with Microsoft Fabric, cloud data warehouses (Azure, Snowflake, BigQuery, Redshift), or orchestration tools like dbt or Airflow. Collaboration & Communication Ability to communicate technical insights clearly to non-technical audiences. Experience partnering with leadership and cross-functional teams. Mindset & Growth A proactive self-starter who thrives …
London (City of London), South East England, United Kingdom
TGS International Group
Manchester Area, United Kingdom Hybrid / WFH Options
Harnham
Data Analyst Location: Remote-first (monthly visit to Nottingham HQ – travel expensed) Salary: £30,000–£45,000 (DOE) The Company A fast-growing UK consumer lender recently acquired by a global fintech, focused on transparent, short-term credit that helps …
Risk Reporting Data Engineering Lead Central London/Hybrid Financial Risk Data/Data Analytics/International Banking Base salary: c. £135k + bonus + comprehensive bens. As a tech recruitment partner for this international bank, we're assisting in …
City of London, London, Coleman Street, United Kingdom
Deerfoot Recruitment Solutions Limited
Employment Type: Permanent
Salary: £135000/annum bonus + good benefits package