…role.

Desirable Experience:
- Hands-on experience with APIs from major SaaS platforms (e.g., Office 365, Salesforce, Workday, Oracle, SAP)
- Familiarity with our core data stack: DuckDB, Dagster, Postgres, Kafka, dbt, EKS, and Databricks
- Understanding of Identity and Access Management (IAM) concepts and APIs from providers like Okta, Entra ID, Ping
- Exposure to AI-enhanced low-code tools like Microsoft Copilot
…Modelling
- Be deploying applications to the Cloud (AWS)

We'd love to hear from you if you:
- Have strong experience with Python & SQL
- Have experience developing data pipelines using dbt, Spark and Airflow
- Have experience in data modelling (building optimised and efficient data marts and warehouses in the cloud)
- Work with Infrastructure as Code (Terraform) and containerising applications (Docker)
- Work with …
Due to the nature of some of the company's clients, you must have a minimum of 5 years' continuous UK residency.

Data Engineer, Data Platform, Azure, Google Cloud, Airflow, dbt, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics

Are you passionate about building scalable data solutions that drive real business impact … will work on a variety of projects, including the implementation of medallion structures for clients in different industries, data migrations, and designing and implementing dashboards using technologies such as Python and dbt on Azure and GCP platforms.

Key Skills the Senior Data Engineer will have:
- 3+ years Data Engineering experience
- Good experience with both Azure and GCP
- Excellent experience of dbt, SQL

…data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins@circlerecruitment.com

Circle Recruitment is acting as an Employment Agency in relation to this vacancy.
Experience:
- Strong SQL and Python skills for building and optimising data pipelines
- Experience working with cloud platforms (e.g., AWS, GCP, or Azure)
- Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery)
- Understanding of data modelling and warehousing principles
- Experience working with large datasets and distributed systems

What's in it for you?
- Up to £70k
- Hybrid …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
…week in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … with strong financial backing, and the chance to make a real impact!

We're looking for the following experience:
- Extensive hands-on experience with Snowflake
- Extensive experience with dbt, Airflow, AWS and Terraform
- Excellent scripting skills in SQL
- Experience developing solutions entirely from scratch
- Great communication skills, with the ability to understand and translate complex requirements into technical solutions
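Several of these listings centre on orchestrating dbt transformations with Airflow against a Snowflake warehouse. As a rough illustration of that workflow, here is a minimal sketch of an Airflow DAG that runs `dbt run` followed by `dbt test`; the project path, DAG id, and schedule are hypothetical placeholders, and it assumes Airflow 2.4+ with the dbt CLI installed on the worker.

```python
# Minimal sketch: orchestrating dbt with Airflow. The project directory,
# DAG id, and schedule below are illustrative placeholders, not details
# taken from any of the listings here.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/analytics"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build all models; connection details live in dbt's profiles.yml,
    # which in these stacks would typically point at Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR}",
    )

    # Run schema and data tests after the build so bad data fails the DAG.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test
```

A setup like this keeps orchestration concerns (retries, alerting, scheduling) in Airflow while dbt owns the SQL transformations, which matches the Snowflake/dbt/Airflow stacks these listings describe.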
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
- …Professional Data Engineer certification
- Exposure to Agentic AI systems or intelligent/autonomous data workflows
- Experience with BI tools such as Looker
- Exposure to Databricks, Snowflake, AWS, Azure or dbt
- Academic background in Computer Science, Mathematics or a related field

This is an opportunity to work in a forward-thinking environment with access to cutting-edge projects, ongoing learning, and …
…/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
…day. You will work with the Lead Data Engineer and other members of the Data Engineering team to deliver our new strategic enterprise data platform based on Snowflake and dbt, while also maintaining our legacy data platform.

Key Responsibilities:
- Data warehouse design and implementation, working towards the creation of a single source of truth.
- Development of data ingestion/transformation … pipelines using Fivetran, dbt and GitLab.
- Creation of management information dashboards.
- Work with business analysts and end-users to plan and implement feature enhancements and changes to existing systems, processes and data warehouses.
- Working with internal staff and third parties (suppliers and partners) to plan and develop new databases, extracts and reports.
- Assist with the migration from legacy data platforms …
…coaching, leading and management skills, able to upskill a small team and help them transition to more technical work. Strong technical skills in Python, SQL and tools such as dbt, Snowflake, AWS S3, KDB and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs with deep knowledge in at least one asset class.
Senior Data Engineer (3-Month Contract)
Trailmix is an award-winning studio based in the heart of London, backed by mobile giants Supercell. We are actively growing the talented team behind our hit game Love & Pies, and setting up new …
…culture of curiosity, inclusion and experimentation, with an eye on emerging AI opportunities.

What you'll need:
- Strong technical know-how, particularly in Python, SQL, modern data stacks (Snowflake, dbt, Spark, Airflow) and cloud platforms like AWS.
- Experience leading high-performing engineering teams with empathy and clarity.
- A knack for building scalable data solutions and making strategic tech decisions without …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
…Data Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
…making. Work with stakeholders to understand data needs and provide insights through presentations and reports. Deliver data-driven recommendations to support business objectives. Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency …

…Information Systems, Finance, or a related field. Proven experience in a Data Analyst/Analytics Engineer role, preferably in the payments industry with issuer processors. Proven experience in SQL, dbt and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including …
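This listing pairs dbt modelling in Snowflake with data-quality validation. As a minimal, assumption-laden sketch of the validation side using the snowflake-connector-python package: the connection parameters, the table name FCT_PAYMENTS, and the thresholds are all hypothetical, not taken from the ad.

```python
# Minimal sketch of a data-quality validation check against Snowflake:
# a row-count and null-rate assertion on a mart table. All names and
# thresholds here are hypothetical placeholders.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # illustrative names, not from the listing
    database="ANALYTICS",
    schema="MARTS",
)

def check_table(table: str, key_column: str, max_null_rate: float = 0.01) -> None:
    """Fail loudly if the table is empty or the key column is too often NULL."""
    cur = conn.cursor()
    try:
        # COUNT(col) skips NULLs, so the two counts give the null rate directly.
        # Identifiers are interpolated, so only pass trusted, hard-coded names.
        cur.execute(f"SELECT COUNT(*), COUNT({key_column}) FROM {table}")
        total, non_null = cur.fetchone()
        assert total > 0, f"{table} is empty"
        null_rate = 1 - non_null / total
        assert null_rate <= max_null_rate, (
            f"{table}.{key_column} null rate {null_rate:.2%} exceeds threshold"
        )
    finally:
        cur.close()

check_table("FCT_PAYMENTS", "TRANSACTION_ID")  # hypothetical mart table and key
```

In practice, checks like these would more often be expressed as dbt schema tests (not_null, unique, accepted ranges) run alongside the models; a standalone script is usually reserved for checks dbt doesn't cover.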
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
You'll play a key role in scaling analytics infrastructure, optimizing pipelines, and mentoring fellow engineers.

Key Responsibilities:
- Build and optimize data models across bronze to gold layers using dbt and Kimball methodology
- Own and manage the semantic layer for BI tools like Looker and Power BI
- Implement rigorous data quality and testing frameworks
- Drive CI/CD practices with … like GitHub Actions and Terraform
- Lead technical decisions and mentor junior engineers
- Collaborate across engineering, data science, and product teams to deliver business impact

Skills & Experience:
- Expert in SQL, dbt, and cloud data warehouses (e.g., BigQuery, Redshift)
- Strong experience with Airflow, Python, and multi-cloud environments (AWS/GCP)
- Proven background in designing and scaling analytics solutions in agile environments …
West London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
Technical Requirements:
- Advanced proficiency in Python and modern software engineering practices.
- Experience architecting solutions using major cloud platforms (Azure, AWS, GCP).
- Familiarity with technologies such as Databricks, Airflow, dbt, Snowflake, GitHub CI/CD, and infrastructure-as-code.
- Strong background across at least several of the following areas: Cloud Engineering, Data Platform Architecture, DevOps, MLOps/LLMOps.

Ideal Profile …
…you will be doing:

Data Modelling & Engineering:
- Gather requirements from stakeholders and design scalable data solutions
- Build and maintain robust data models and exposures in the data warehouse using dbt, Snowflake, and Looker
- Document architectural decisions, modelling challenges, and outcomes in a clear and structured way
- Wrangle and integrate data from multiple third-party sources (e.g., Salesforce, Amplitude, Segment, Google …)

…or in a similar role:
- You will have proven experience in designing and maintaining data models for warehousing and business intelligence
- You will have advanced SQL skills; experience with dbt and/or Looker is strongly preferred
- You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow)
- You will have experience with version control …
…experience in data collection, preprocessing, and integration from various sources, ensuring accuracy, consistency, and handling missing values or outliers. Proficient in designing and implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts, and data pipeline optimization. Skilled in SQL for data manipulation, analysis, query optimisation, and database design. Artificial Intelligence and Machine Learning …
…ready datasets. Promote CI/CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering teams and support architectural decisions as a recognised Databricks expert.

Essential Skills & Experience: Demonstrable … security, and compliance, including Unity Catalog. Excellent communication, leadership, and problem-solving skills.

Desirable:
- Databricks certifications (e.g., Data Engineer Associate/Professional or Solutions Architect).
- Familiarity with MLflow, dbt, and BI tools such as Power BI or Tableau.
- Exposure to MLOps practices and deploying ML models within Databricks.
- Experience working within Agile and DevOps-driven delivery environments.
…pragmatic approach to how we apply new technologies.

Our Tech Stack:
- TypeScript (full-stack)
- React + Next.js, Tailwind, Prisma, tRPC
- PostgreSQL, MongoDB, Redis
- Serverless, AWS, Google Cloud, GitHub Actions
- dbt, BigQuery
- Terraform
- Python

Requirements:
- You're opinionated and want to help us change the legal system for the better
- You have a track record of delivering exceptional work and can …