London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
week in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping the business deliver the best possible service offering to its customers. You'll work on … with strong financial backing, and the chance to make a real impact! We're looking for the following experience: extensive hands-on experience with Snowflake; extensive experience with dbt, Airflow, AWS and Terraform; excellent SQL scripting skills; experience developing solutions entirely from scratch; great communication skills, with the ability to understand and translate complex requirements into technical solutions More ❯
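As a rough illustration of how the Snowflake, dbt, Airflow and AWS pieces of a stack like this typically fit together, here is a minimal orchestration sketch. It is not taken from the role itself: the DAG name, S3 stage, table and dbt project path are hypothetical placeholders, and the Snowflake load is shown as a plain SnowSQL command to keep the example dependency-free.

```python
# Minimal orchestration sketch (assumed names throughout): an Airflow DAG that
# loads raw files from S3 into Snowflake, then runs the dbt project.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="raw_to_snowflake_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                    # 'schedule_interval' on older Airflow 2.x
    catchup=False,
) as dag:
    # Stage raw S3 files into Snowflake via a COPY INTO statement issued with
    # the SnowSQL CLI; in practice a Snowflake provider operator or Snowpipe
    # would be more typical.
    load_raw = BashOperator(
        task_id="load_raw_from_s3",
        bash_command=(
            'snowsql -q "COPY INTO raw.events '
            "FROM @raw_stage/events/ FILE_FORMAT = (TYPE = 'JSON')\""
        ),
    )

    # Transform the staged data with dbt once the load has finished.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="cd /opt/dbt_project && dbt run",
    )

    load_raw >> run_dbt
```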
new Snowflake Data Warehouse. They are looking for a candidate who has experience in... AWS data platforms; strong knowledge of Snowflake, S3, Lambda, data modelling, DevOps practices, Airflow, dbt, Data Vault, Redshift and ODS; strong SQL/Python. This role is an urgent requirement and there are limited interview slots left; if interested, send an up-to-date More ❯
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
Professional Data Engineer certification; exposure to Agentic AI systems or intelligent/autonomous data workflows; experience with BI tools such as Looker; exposure to Databricks, Snowflake, AWS, Azure or dbt; academic background in Computer Science, Mathematics or a related field. This is an opportunity to work in a forward-thinking environment with access to cutting-edge projects, ongoing learning, and More ❯
/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong More ❯
day. You will work with the Lead Data Engineer and other members of the Data Engineering team to deliver our new strategic enterprise data platform based on Snowflake and dbt, while also maintaining our legacy data platform. Key Responsibilities: Data warehouse design and implementation, working towards the creation of a single source of truth. Development of data ingestion/transformation … pipelines using Fivetran, dbt and GitLab. Creation of management information dashboards. Work with business analysts and end-users to plan and implement feature enhancements and changes to existing systems, processes and data warehouses. Work with internal staff and third parties (suppliers and partners) to plan and develop new databases, extracts and reports. Assist with the migration from legacy data platforms More ❯
coaching, leading and management skills, able to upskill a small team and help them transition to more technical work. Strong technical skills in Python, SQL and tools such as dbt, Snowflake, AWS S3, KDB and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs with deep knowledge in at least one asset class. More ❯
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Data Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as More ❯
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as More ❯
Employment Type: Permanent, Part Time, Work From Home
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
Data Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as More ❯
making. Work with stakeholders to understand data needs and provide insights through presentations and reports. Deliver data-driven recommendations to support business objectives. Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency … Information Systems, Finance, or a related field. Proven experience in a Data Analyst/Analytics Engineer role, preferably in the payments industry with issuer processors. Proven experience in SQL, dbt and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including More ❯
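To make the data-quality responsibility above concrete, here is a minimal validation sketch using the snowflake-connector-python package. It is illustrative only: the account, credentials, table and column names are hypothetical placeholders, and in a dbt-centric setup the same check would more commonly be expressed as a built-in not_null test.

```python
# Minimal data-quality check sketch: count rows in a mart table that are
# missing their business key. All connection details and names are placeholders.
import snowflake.connector

def count_null_keys(table: str, key_column: str) -> int:
    """Return how many rows in `table` have a NULL value in `key_column`."""
    conn = snowflake.connector.connect(
        account="my_account",       # hypothetical account identifier
        user="analytics_svc",       # hypothetical service user
        password="***",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="MARTS",
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
        (null_count,) = cur.fetchone()
        return null_count
    finally:
        conn.close()

if __name__ == "__main__":
    nulls = count_null_keys("FCT_PAYMENTS", "TRANSACTION_ID")
    if nulls:
        raise SystemExit(f"Data quality check failed: {nulls} rows missing TRANSACTION_ID")
    print("Data quality check passed")
```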
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
You'll play a key role in scaling analytics infrastructure, optimizing pipelines, and mentoring fellow engineers. Key Responsibilities: Build and optimize data models across bronze-to-gold layers using dbt and Kimball methodology; own and manage the semantic layer for BI tools like Looker and Power BI; implement rigorous data quality and testing frameworks; drive CI/CD practices with … like GitHub Actions and Terraform; lead technical decisions and mentor junior engineers; collaborate across engineering, data science, and product teams to deliver business impact. Skills & Experience: Expert in SQL, dbt, and cloud data warehouses (e.g., BigQuery, Redshift); strong experience with Airflow, Python, and multi-cloud environments (AWS/GCP); proven background in designing and scaling analytics solutions in agile environments More ❯
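As a rough sketch of how the CI/CD side of a dbt project like the one described can be wired up, dbt-core (1.5 and later) exposes a programmatic runner that a CI step, for example in GitHub Actions, can call to build and test the models and fail the pipeline on any error. The layer tags used in the selector are assumptions about how the bronze/silver/gold models might be tagged, not details from the role.

```python
# CI step sketch: build and test dbt models layer by layer and fail fast.
# Assumes dbt-core >= 1.5 and models tagged 'bronze', 'silver', 'gold'.
from dbt.cli.main import dbtRunner, dbtRunnerResult

def build_layer(selector: str) -> None:
    runner = dbtRunner()
    # 'build' runs models plus their tests, seeds and snapshots for the selection.
    result: dbtRunnerResult = runner.invoke(["build", "--select", selector])
    if not result.success:
        raise SystemExit(f"dbt build failed for selection: {selector}")

if __name__ == "__main__":
    # A single 'dbt build' would resolve dependencies on its own; splitting by
    # layer here just makes the CI output easier to read.
    for layer in ("tag:bronze", "tag:silver", "tag:gold"):
        build_layer(layer)
```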
West London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
Technical Requirements: Advanced proficiency in Python and modern software engineering practices. Experience architecting solutions using major cloud platforms (Azure, AWS, GCP). Familiarity with technologies such as Databricks, Airflow, dbt, Snowflake, GitHub CI/CD, and infrastructure-as-code. Strong background across at least several of the following areas: Cloud Engineering, Data Platform Architecture, DevOps, MLOps/LLMOps. Ideal Profile More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Technical Requirements: Advanced proficiency in Python and modern software engineering practices. Experience architecting solutions using major cloud platforms (Azure, AWS, GCP). Familiarity with technologies such as Databricks, Airflow, dbt, Snowflake, GitHub CI/CD, and infrastructure-as-code. Strong background across at least several of the following areas: Cloud Engineering, Data Platform Architecture, DevOps, MLOps/LLMOps. Ideal Profile More ❯
you will be doing Data Modelling & Engineering: Gather requirements from stakeholders and design scalable data solutions; build and maintain robust data models and exposures in the data warehouse using dbt, Snowflake, and Looker; document architectural decisions, modelling challenges, and outcomes in a clear and structured way; wrangle and integrate data from multiple third-party sources (e.g., Salesforce, Amplitude, Segment, Google … or in a similar role. You will have proven experience in designing and maintaining data models for warehousing and business intelligence. You will have advanced SQL skills; experience with dbt and/or Looker is strongly preferred. You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow). You will have experience with version control More ❯
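As a loose illustration of the "wrangle and integrate data from multiple third-party sources" point, the sketch below standardises and unions a few source extracts with pandas before they are staged in the warehouse. The file paths and column handling are hypothetical; in practice these feeds would usually arrive via managed connectors rather than CSV exports.

```python
# Illustrative wrangling sketch: standardise and union third-party extracts.
# File paths and the exact column handling are placeholders.
import pandas as pd

SOURCES = {
    "salesforce": "exports/salesforce_accounts.csv",
    "amplitude": "exports/amplitude_events.csv",
    "segment": "exports/segment_identifies.csv",
}

def load_and_standardise(source: str, path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    df.columns = [c.strip().lower() for c in df.columns]  # normalise headers
    df["source_system"] = source                          # keep lineage
    return df

# Union the standardised extracts into one frame, ready to stage in the warehouse.
combined = pd.concat(
    [load_and_standardise(name, path) for name, path in SOURCES.items()],
    ignore_index=True,
)
print(combined.groupby("source_system").size())
```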
experience in data collection, preprocessing, and integration from various sources, ensuring accuracy, consistency, and handling missing values or outliers. Proficient in designing and implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts, and data pipeline optimization. Skilled in SQL for data manipulation, analysis, query optimisation, and database design. Artificial Intelligence and Machine Learning More ❯