London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
with a London base (flexibility offered)
- High-impact role with a growing, values-driven data team
- Platform-focused, mission-led engineering
- Work with a modern cloud-native stack (Snowflake, DBT, Airflow, Terraform, AWS)

What You'll Be Doing
- Serve as the technical lead for cross-functional data initiatives
- Define and champion best practices for building scalable, governed, high-quality data … across teams: product managers, analysts, ML engineers, and more

What You'll Bring
- Extensive experience designing and building modern data platforms
- Strong skills in Python, SQL, and tools like DBT, Airflow, Fivetran
- Expertise in cloud services (ideally AWS) and IaC tools like Terraform
- Deep understanding of data architecture, ELT pipelines, and governance
- A background in software engineering principles (CI/…
- … both technical and non-technical stakeholders
- A collaborative mindset and a passion for coaching others

Tech Environment
- Cloud: AWS (Kinesis, Lambda, S3, ECS, etc.)
- Data Warehouse: Snowflake
- Transformation & Orchestration: Python, DBT, Airflow
- IaC & DevOps: Terraform, GitHub Actions, Jenkins
- Monitoring & Governance: Monte Carlo, Collate

Interested? If you're excited about platform-level ownership, technical influence, and building systems that help people tell …
reconciliation, and integration verification activities.

Core skills and experience:
- Proven experience designing scalable data architectures in cloud and hybrid environments.
- Expertise in data modelling, SQL, and platforms like Snowflake, dbt, Power BI, and Databricks.
- Fluency in Python and knowledge of multiple cloud providers (AWS, Azure, GCP).
- Understanding of security principles including role-based access control.
- Experience with legacy-to-…
There are opportunities for professional development, such as training programs, certifications, and career advancement paths.

KEY RESPONSIBILITIES
- Design, develop, and maintain scalable data pipelines (SQL, Azure ADF, Azure Functions, DBT)
- Collaborate with analysts and stakeholders to understand their data needs, scoping and implementing solutions
- Optimise and clean the data warehouse, tidy the existing codebase, and create documentation
- Monitor and troubleshoot data …
and optimising data pipelines, enabling analytics and AI, and integrating enterprise systems like CRM, HR, and Finance.

Key Responsibilities
- Design and develop scalable data pipelines using Azure Data Factory, dbt, and Synapse
- Build enriched datasets for Power BI and AI/ML use cases
- Implement CI/CD workflows
- Ensure GDPR compliance and secure data handling

Requirements:
- 5+ years in data engineering, or …
- Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake
- Proficiency in SQL, Python, and tools like dbt and Airflow
- Familiarity with DevOps practices in a data context

Benefits:
- Work on impactful, enterprise-wide data projects
- Collaborate with architects, analysts, and data scientists
- Be part of a supportive, innovative, and forward-thinking team
- Competitive salary, and more

Please …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
designing and analysing A/B tests
- Strong storytelling and stakeholder-management skills
- Full UK work authorization

Desirable
- Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow)
- Experience with loyalty-programme analytics or CRM platforms
- Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring

Technical Toolbox
- Data & modeling: SQL, Python/R, pandas, scikit-learn
- Dashboarding: Tableau or Power BI
- ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery
- Experimentation: A/B testing platforms (Optimizely, VWO …
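The A/B-testing skills listed above usually come down to comparing conversion rates between two variants. A minimal two-proportion z-test can be written with the standard library alone (an illustrative sketch, not a replacement for a platform like Optimizely; the sample counts below are made up):

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B conversion experiment.

    conv_* are conversion counts, n_* are sample sizes.
    Returns (z, p_value). A stdlib-only sketch of the standard pooled test.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF, expressed with erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 5.0% vs 6.25% conversion on 4,000 users per arm
z, p = two_proportion_z_test(200, 4000, 250, 4000)
```

With these made-up numbers the uplift is significant at the 5% level; in practice you would also pre-register the sample size and check for peeking.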
to diverse audiences and collaborate effectively across teams.

Bonus Points For:
- Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker).
- Experience with specific orchestration tools (e.g., Airflow, dbt).
- Experience working in Agile/Scrum development methodologies.
- Experience with big data technologies and frameworks.

Join Us! This role can be based in either of our amazing offices in Havant …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
high-quality data assets
- Strong architectural acumen and software engineering fundamentals
- Experience driving adoption of data governance and improving data platform usage across internal teams

Stack includes: Snowflake, AWS, DBT, Airflow, Python, Kinesis, Terraform, CI/CD tools

BENEFITS
The successful Principal Data Engineer will receive the following benefits:
- Salary up to £107,000
- Hybrid working: 2 days per week …
Due to the nature of some of the company's clients, you must have a minimum of 5 years' continuous UK residency.

Keywords: Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics

Are you passionate about building scalable data solutions that drive real business impact? … You will work on a variety of projects including the implementation of medallion structures for clients in different industries, data migrations, and designing and implementing dashboards using technologies such as Python and DBT on Azure and GCP platforms.

Key skills the Senior Data Engineer will have:
- 3+ years' data engineering experience
- Good experience with both Azure and GCP
- Excellent experience of DBT, SQL …

… data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins @ circlerecruitment.com

Circle Recruitment is acting as an Employment Agency in relation to this vacancy. …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
week in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … with strong financial backing, and the chance to make a real impact!

We're looking for the following experience:
- Extensive hands-on experience with Snowflake
- Extensive experience with dbt, Airflow, AWS and Terraform
- Excellent scripting skills in SQL
- Experience developing solutions entirely from scratch
- Great communication skills, with the ability to understand and translate complex requirements into technical solutions …
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
- Professional Data Engineer certification
- Exposure to Agentic AI systems or intelligent/autonomous data workflows
- Experience with BI tools such as Looker
- Exposure to Databricks, Snowflake, AWS, Azure or DBT
- Academic background in Computer Science, Mathematics or a related field

This is an opportunity to work in a forward-thinking environment with access to cutting-edge projects, ongoing learning, and …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
You'll play a key role in scaling analytics infrastructure, optimizing pipelines, and mentoring fellow engineers.

Key Responsibilities:
- Build and optimize data models across bronze to gold layers using dbt and Kimball methodology
- Own and manage the semantic layer for BI tools like Looker and Power BI
- Implement rigorous data quality and testing frameworks
- Drive CI/CD practices with … like GitHub Actions and Terraform
- Lead technical decisions and mentor junior engineers
- Collaborate across engineering, data science, and product teams to deliver business impact

Skills & Experience:
- Expert in SQL, dbt, and cloud data warehouses (e.g., BigQuery, Redshift)
- Strong experience with Airflow, Python, and multi-cloud environments (AWS/GCP)
- Proven background in designing and scaling analytics solutions in agile environments …
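The bronze-to-gold layering mentioned above can be illustrated with a toy example. In a real project each layer would be a dbt model materialised in the warehouse; the sample records and field names here are hypothetical:

```python
from collections import defaultdict

# Bronze: raw events as ingested, possibly duplicated and string-typed
# (hypothetical sample data standing in for a landing table).
bronze = [
    {"order_id": "1", "customer": "alice", "amount": "10.50"},
    {"order_id": "1", "customer": "alice", "amount": "10.50"},  # duplicate ingest
    {"order_id": "2", "customer": "bob", "amount": "7.25"},
]

def to_silver(rows):
    """Silver: deduplicate on the business key and cast to proper types."""
    seen = {}
    for r in rows:
        seen[r["order_id"]] = {
            "order_id": int(r["order_id"]),
            "customer": r["customer"],
            "amount": float(r["amount"]),
        }
    return list(seen.values())

def to_gold(rows):
    """Gold: aggregate to a reporting-ready fact (revenue per customer)."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["customer"]] += r["amount"]
    return dict(totals)

gold = to_gold(to_silver(bronze))
```

The same shape maps onto dbt directly: a staging model for the dedup and casts, then a mart model for the aggregation, with tests on the business key.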
experience in data collection, preprocessing, and integration from various sources, ensuring accuracy, consistency, and handling missing values or outliers. Proficient in designing and implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts, and data pipeline optimization. Skilled in SQL for data manipulation, analysis, query optimisation, and database design. Artificial Intelligence and Machine Learning …
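Handling missing values and outliers, as required above, might look like this in stdlib-only Python (a minimal sketch using median imputation and z-score clipping; production pipelines would more likely use pandas or warehouse-side transformations, and the threshold choice is an assumption):

```python
from statistics import median

def clean_series(values, z_thresh=3.0):
    """Impute missing values with the median, then clip outliers.

    values: list of numbers where None marks a missing value.
    Outliers beyond z_thresh standard deviations are clipped to the boundary.
    """
    present = [v for v in values if v is not None]
    med = median(present)
    imputed = [med if v is None else v for v in values]
    mean = sum(imputed) / len(imputed)
    # Population standard deviation of the imputed series
    std = (sum((v - mean) ** 2 for v in imputed) / len(imputed)) ** 0.5
    if std == 0:
        return imputed  # constant series: nothing to clip
    lo, hi = mean - z_thresh * std, mean + z_thresh * std
    return [min(max(v, lo), hi) for v in imputed]
```

For example, `clean_series([1, 2, None, 3])` fills the gap with the median 2, and a tighter threshold clips extreme values toward the bulk of the distribution.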
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS
4-MONTH CONTRACT | £450-550 PER DAY | OUTSIDE IR35

This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM initiatives. … contractor to accelerate delivery of critical pipelines and platform improvements.

THE ROLE
You'll join a skilled data team to lead the build and optimisation of scalable pipelines using DBT, Airflow, and Databricks. Working alongside data scientists and ML engineers, you'll support everything from raw ingestion to curated layers powering LLMs and advanced analytics. Your responsibilities will include:
- Building and maintaining production-grade ETL/ELT workflows with DBT and Airflow
- Collaborating with AI/ML teams to support data readiness for experimentation and inference
- Writing clean, modular SQL and Python code for use in Databricks
- Contributing to architectural decisions around pipeline scalability and performance
- Supporting the integration of diverse data sources into the platform
- Ensuring data quality, observability, and …
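Orchestrators like Airflow schedule tasks by topologically sorting a DAG of dependencies. The core idea can be sketched with Python's standard library (illustrative only: a hypothetical pipeline, not Airflow's actual API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG, mapping each task to the tasks it depends on:
# ingest -> stage -> (transform, quality_checks) -> publish
deps = {
    "stage": {"ingest"},
    "transform": {"stage"},
    "quality_checks": {"stage"},
    "publish": {"transform", "quality_checks"},
}

# static_order yields a valid execution order respecting every dependency;
# an orchestrator would additionally run independent tasks in parallel.
order = list(TopologicalSorter(deps).static_order())
```

In Airflow the same structure would be declared with operators and `>>` dependencies; the scheduler then resolves an order equivalent to this sort.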
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
strategy within ambitious software businesses. Specifically, you can expect to be involved in the following:
- Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure
- Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards
- Delivering scalable solutions that support internal use cases and extend directly to …
- … finance, sales) and building tools that serve business needs
- Background in startups or scale-ups with high adaptability and a hands-on approach
- Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure)
- Strong communication skills and a track record of credibility in high-pressure or client-facing settings

BENEFITS …
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and naturally curious
- Comfortable working across multiple business areas with varied responsibilities

Nice-to-Haves
- Exposure to tools like Prefect, Airflow, or Dagster
- Familiarity with Azure SQL, Snowflake, or dbt

Tech Stack/Tools
- Python
- SQL (on-prem + Azure SQL Data Warehouse)
- Git

Benefits
- £35,000 - £40,000 starting salary (up to £45,000 for the right candidate)
- Discretionary …
and APIs (RESTful/GraphQL), with solid experience in microservices and databases (SQL/NoSQL). You know your way around big data tools (Spark, Dask) and orchestration (Airflow, DBT). You understand NLP and have experience working with Large Language Models. You're cloud-savvy (AWS, GCP, or Azure) and comfortable with containerization (Docker, Kubernetes). You have strong …
Job Title: Data Engineer - Snowflake | DBT | SQL
£700/day (Umbrella) | London - 3 days per week on-site | 6 months (initial contract)

Are you ready to play a key role in shaping the future of a high-impact enterprise data platform? Join a fast-growing data team working on a greenfield data warehousing project at one of the world's … most respected financial institutions. This is an opportunity to work on cutting-edge data infrastructure using Snowflake, DBT, and Sigma, supporting business-critical data needs and enabling meaningful insights at scale. You'll be part of a collaborative team split across London and Ireland, contributing to a platform that directly supports secure, scalable data sharing and real-time decision-making.

What You'll Be Doing:
- Model and build robust, scalable data pipelines and warehouse structures using Snowflake and DBT
- Collaborate with stakeholders to ensure data solutions meet customer and business needs
- Develop and optimise data models to support real-time analytics and long-term scalability
- Contribute to platform architecture, data governance, and overall data strategy
- Use Sigma (or similar BI …
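Warehouse modelling of the kind described usually means joining fact tables to conformed dimensions. A toy star-schema aggregation in plain Python (hypothetical tables and field names; in Snowflake this would be a DBT SQL model):

```python
# Hypothetical dimension and fact tables for a star-schema sketch.
dim_customer = {
    1: {"name": "Acme Ltd", "region": "EMEA"},
    2: {"name": "Globex", "region": "AMER"},
}
fact_orders = [
    {"customer_key": 1, "amount": 120.0},
    {"customer_key": 2, "amount": 80.0},
    {"customer_key": 1, "amount": 50.0},
]

def revenue_by_region(facts, dim):
    """Join each fact row to the customer dimension via its surrogate key,
    then aggregate revenue per region (the SQL equivalent: JOIN + GROUP BY)."""
    out = {}
    for f in facts:
        region = dim[f["customer_key"]]["region"]
        out[region] = out.get(region, 0.0) + f["amount"]
    return out
```

The warehouse version pushes this JOIN and GROUP BY down into Snowflake, where DBT manages the model as versioned, testable SQL.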
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
with architects, analysts, and business stakeholders to unlock insights and enable innovation.

What You'll Be Doing:
- Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks.
- Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform.
- Cleanse, transform, and model data to support BI tools (e.g., Power BI) and AI …
New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
- Strong hands-on background in data engineering, with 5+ years working on modern data platforms
- Experience leading cloud data migrations; GCP and BigQuery strongly preferred
- Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling
- Excellent understanding of data architecture, governance, and DevOps best practices
- Proven leadership or team management experience within a regulated or mid-to-large tech …
or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, DBT, and handling large structured and unstructured datasets is essential. You're detail-oriented, self-motivated, and able to communicate complex technical concepts clearly. Experience with media/social platform data …
EXPERIENCE: The ideal Head of Data Platform will have:
- Extensive experience with Google Cloud Platform (GCP), particularly BigQuery
- Proficiency with a modern data tech stack, including SQL, Python, Airflow, dbt, Dataform, Terraform
- Experience in a mid-to-large-sized company within a regulated industry, with a strong understanding of data governance
- A strategic mindset, leadership skills, and a hands-on approach …
paced project environments
- Understanding of data governance and compliance (e.g., GDPR)
- Experience in industries with large consumer marketplaces (e.g., programmatic media, travel, financial services) is a plus

Our Tech Stack
- AWS Redshift
- Postgres
- DBT
- Power BI

Note: While our current environment is AWS, we are transitioning to Azure for new development.

Why Join Us? At Kantar, you'll be part of a global leader …
Brighton, Sussex, United Kingdom Hybrid / WFH Options
MJR Analytics
own or as the lead in a small team. Successful candidates will bring experience with Google Cloud, our strategic technology partner, along with modern data stack technologies such as dbt, Looker, Fivetran, Segment and Cube. You'll already have the Google Cloud Data Engineer certification and the Looker developer certification, or be committed to obtaining those certifications within your first …

- … provided to you using tools such as Google Cloud, Google BigQuery and Google Cloud SQL
- Transforming and modelling data into consumable information and business logic using tools such as dbt, Dataform and Cube
- Developing dashboards, explorations and data visualisations using tools such as Looker, Preset, Power BI, Superset
- Testing data to ensure it is of high quality
- Using project management … definitions
- Designing data flows and data models

Mentoring: Mentoring and guiding junior analytics and data engineers.

Knowledge: Obtaining, if you do not have them already, developer certifications for Looker, dbt, Segment and other modern data stack partners; working towards the Google Cloud Data Engineer certification; staying updated with the latest data engineering technologies and methodologies, and implementing them as appropriate.

Collaboration …
TensorFlow, Hugging Face)
- Proven MLOps, big data, and backend/API development experience
- Deep understanding of NLP and LLMs
- Proficient with cloud platforms (AWS/GCP/Azure), Airflow, DBT, Docker/Kubernetes
- Strong collaboration, problem-solving, and coding best practices

Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.