Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
data engineering capabilities. Looking at our current pipeline of work, we can also consider those with an Analytics Engineering lean; experience with BigQuery (GCP) and data modelling in DBT, along with mobile/telecoms industry experience, would be beneficial. A bit about YOU! As much as we just love working with great, fun people, there are some obvious required Skills … Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with other cloud platforms (e.g. …
of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB • Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have) • Experience in data warehouse design (Kimball and lakehouse, medallion and data vault) is a definite preference, as is knowledge of …
deliver value-focused data solutions We'd love to talk to you if: You've got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer) You have a good understanding of …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
with a London base (flexibility offered) High-impact role with a growing, values-driven data team Platform-focused, mission-led engineering Work with a modern cloud-native stack (Snowflake, DBT, Airflow, Terraform, AWS) What You'll Be Doing Serve as the technical lead for cross-functional data initiatives Define and champion best practices for building scalable, governed, high-quality data … across teams - product managers, analysts, ML engineers, and more What You'll Bring Extensive experience designing and building modern data platforms Strong skills in Python, SQL, and tools like DBT, Airflow, Fivetran Expertise in cloud services (ideally AWS) and IaC tools like Terraform Deep understanding of data architecture, ELT pipelines, and governance A background in software engineering principles (CI/… both technical and non-technical stakeholders A collaborative mindset and passion for coaching others Tech Environment Cloud: AWS (Kinesis, Lambda, S3, ECS, etc.) Data Warehouse: Snowflake Transformation & Orchestration: Python, DBT, Airflow IaC & DevOps: Terraform, GitHub Actions, Jenkins Monitoring & Governance: Monte Carlo, Collate Interested? If you're excited about platform-level ownership, technical influence, and building systems that help people tell …
of experience in data engineering or a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least …
reconciliation, and integration verification activities. Core skills and experience: Proven experience designing scalable data architectures in cloud and hybrid environments. Expertise in data modelling, SQL, and platforms like Snowflake, dbt, Power BI, and Databricks. Fluency in Python and knowledge of multiple cloud providers (AWS, Azure, GCP). Understanding of security principles including role-based access control. Experience with legacy-to …
love to talk to you if: You've led technical delivery of data engineering projects in a consultancy or client-facing environment You're experienced with Python, SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure) You have strong knowledge of data architecture patterns - including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks) You know …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Starling Bank Limited
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and …
There are opportunities for professional development, such as training programs, certifications, and career advancement paths. KEY RESPONSIBILITIES Design, develop, and maintain scalable data pipelines using SQL, Azure ADF, Azure Functions, and DBT Collaborate with analysts and stakeholders to understand their data needs, scoping and implementing solutions Optimise and clean the data warehouse, clean the existing codebase, and create documentation Monitor and troubleshoot data …
Luton, Bedfordshire, United Kingdom Hybrid / WFH Options
OAG Aviation Worldwide Limited
ABOUT THE COMPANY: OAG is a leading data platform for the global travel industry offering an industry-first single source for supply, demand, and pricing data. We empower the global travel industry with high-quality, relevant datasets covering the …
and optimising data pipelines, enabling analytics and AI, and integrating enterprise systems like CRM, HR, and Finance. Key Responsibilities Design and develop scalable data pipelines using Azure Data Factory, dbt, and Synapse Build enriched datasets for Power BI and AI/ML use cases Implement CI/CD workflows Ensure GDPR compliance and secure data handling Requirements: 5+ years in … data engineering or Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake Proficiency in SQL, Python, and tools like dbt and Airflow Familiarity with DevOps practices in a data context Benefits: Work on impactful, enterprise-wide data projects Collaborate with architects, analysts, and data scientists Be part of a supportive, innovative, and forward-thinking team Competitive salary And more Please …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
designing and analysing A/B tests Strong storytelling and stakeholder-management skills Full UK work authorization Desirable Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow) Experience with loyalty-programme analytics or CRM platforms Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring Technical Toolbox Data & modeling: SQL, Python/R, pandas, scikit-learn Dashboarding: Tableau or Power BI ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery Experimentation: A/B testing platforms (Optimizely, VWO …
to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of our amazing offices in Havant …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
high-quality data assets Strong architectural acumen and software engineering fundamentals Experience driving adoption of data governance and improving data platform usage across internal teams stack including: Snowflake AWS DBT Airflow Python Kinesis Terraform CI/CD tools BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000 Hybrid working: 2 days per week …
role Desirable Experience Hands-on experience with APIs from major SaaS platforms (e.g., Office 365, Salesforce, Workday, Oracle, SAP) Familiarity with our core data stack: DuckDB, Dagster, Postgres, Kafka, DBT, EKS, and Databricks Understanding of Identity and Access Management (IAM) concepts and APIs from providers like Okta, Entra ID, Ping Exposure to AI-enhanced low-code tools like Microsoft Copilot …
GenAI tools (e.g., Cursor, Gemini, Claude) into development workflows. Curate and manage datasets across structured and unstructured formats and diverse domains. Contribute to metadata enrichment, lineage, and discoverability using DBT, Airflow, and internal tooling. Skills & Experience We value both traditional and non-traditional career paths. You'll ideally bring: Technical Skills 3-5 years of experience in data or analytics …
Due to the nature of some of the company's clients, you must have a minimum of 5 years continuous UK residency. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics Are you passionate about building scalable data solutions that drive real business impact … will work on a variety of projects including the implementation of medallion structures for clients in different industries, data migrations, and designing and implementing dashboards using technologies such as Python and DBT on Azure and GCP platforms. Key Skills the Senior Data Engineer will have: 3+ years Data Engineering experience Good experience with both Azure and GCP Excellent experience of DBT, SQL … data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins@circlerecruitment.com Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics Circle Recruitment is acting as an Employment Agency in relation to this vacancy.
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
week in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … with a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate complex requirements into technical solutions …
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
Professional Data Engineer certification Exposure to Agentic AI systems or intelligent/autonomous data workflows Experience with BI tools such as Looker Exposure to Databricks, Snowflake, AWS, Azure or DBT Academic background in Computer Science, Mathematics or a related field This is an opportunity to work in a forward-thinking environment with access to cutting-edge projects, ongoing learning, and …
/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
Data Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Data Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
Employment Type: Permanent, Part Time, Work From Home