data models. Experience with Interface/API data modelling. Experience with CI/CD GitHub Actions (or similar). Knowledge of Snowflake/SQL. Knowledge of Apache Airflow. Knowledge of DBT. Familiarity with Atlan for data catalog and metadata management. Understanding of Iceberg tables. Who we are: We’re a business with a global reach that empowers local teams, and we …
The team you'll be working with: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in the full data modeling lifecycle, including designing, implementing, and …
reconciliation, and integration verification activities. Core skills and experience: Proven experience designing scalable data architectures in cloud and hybrid environments. Expertise in data modelling, SQL, and platforms like Snowflake, dbt, Power BI, and Databricks. Fluency in Python and knowledge of multiple cloud providers (AWS, Azure, GCP). Understanding of security principles including role-based access control. Experience with legacy-to …
data platform evolution. Has experience (or strong interest) in building real-time or event-driven architectures. Modern Data Stack includes: Python, SQL, Snowflake, Postgres, AWS (S3, ECS, Terraform), Airflow, dbt, Docker, Apache Spark, Iceberg. What they're looking for: Solid experience as a Senior/Lead/Principal Data Engineer, ideally with some line management or mentoring. Proven ability to …
London, South East England, United Kingdom Hybrid / WFH Options
Count
reliable data-focused backend services. Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s), and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement, and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for large technical projects. Are eager to learn from …
London, South East England, United Kingdom Hybrid / WFH Options
Count Technologies Ltd
reliable data-focused backend services. Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s), and data pipelines (SQL, dbt, Airbyte). Love automation, process improvement, and finding ways to help others work efficiently. Are comfortable working autonomously and taking responsibility for the delivery of large technical projects. Are eager to …
There are opportunities for professional development, such as training programs, certifications, and career advancement paths. KEY RESPONSIBILITIES Design, develop, and maintain scalable data pipelines using SQL, Azure ADF, Azure Functions, and DBT. Collaborate with analysts and stakeholders to understand their data needs, scoping and implementing solutions. Optimise and clean the data warehouse, clean up the existing codebase, and create documentation. Monitor and troubleshoot data …
London, South East England, United Kingdom Hybrid / WFH Options
Zego
and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgresDB, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing others through mentorship, feedback, and knowledge sharing. Pragmatic Problem …
and optimising data pipelines, enabling analytics and AI, and integrating enterprise systems like CRM, HR, and Finance. Key Responsibilities: Design and develop scalable data pipelines using Azure Data Factory, dbt, and Synapse. Build enriched datasets for Power BI and AI/ML use cases. Implement CI/CD workflows. Ensure GDPR compliance and secure data handling. Requirements: 5+ years in … data engineering or Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake. Proficiency in SQL, Python, and tools like dbt and Airflow. Familiarity with DevOps practices in a data context. Benefits: Work on impactful, enterprise-wide data projects. Collaborate with architects, analysts, and data scientists. Be part of a supportive, innovative, and forward-thinking team. Competitive salary and more. Please …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
designing and analysing A/B tests. Strong storytelling and stakeholder-management skills. Full UK work authorization. Desirable: Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow). Experience with loyalty-programme analytics or CRM platforms. Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring. Technical Toolbox: Data & modeling: SQL, Python/R, pandas, scikit-learn. Dashboarding: Tableau or Power BI. ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery. Experimentation: A/B testing platforms (Optimizely, VWO …
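For context on the "customer scoring" requirement above, a minimal scikit-learn sketch might look like the following. This is purely illustrative: the feature names, the synthetic data, and the choice of logistic regression are assumptions, not details from the listing.

```python
# Illustrative only: a minimal customer-scoring sketch with scikit-learn.
# Column names and the synthetic data are hypothetical, not from the role above.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
customers = pd.DataFrame({
    "orders_last_90d": rng.poisson(3, 1000),
    "avg_basket_value": rng.gamma(2.0, 25.0, 1000),
    "days_since_last_order": rng.integers(1, 180, 1000),
})
# Synthetic target: did the customer redeem a loyalty offer?
customers["redeemed_offer"] = (
    (customers["orders_last_90d"] > 2) & (customers["days_since_last_order"] < 60)
).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    customers.drop(columns="redeemed_offer"),
    customers["redeemed_offer"],
    test_size=0.2,
    random_state=42,
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]  # propensity scores used for targeting
print(f"AUC: {roc_auc_score(y_test, scores):.3f}")
```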
to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks. Join Us! This role can be based in either of our amazing offices in Havant …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
high-quality data assets. Strong architectural acumen and software engineering fundamentals. Experience driving adoption of data governance and improving data platform usage across internal teams. Stack including: Snowflake, AWS, DBT, Airflow, Python, Kinesis, Terraform, CI/CD tools. BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000. Hybrid working: 2 days per week …
Due to the nature of some of the company's clients, you must have a minimum of 5 years' continuous UK residency. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data Implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Are you passionate about building scalable data solutions that drive real business impact … will work on a variety of projects, including the implementation of medallion structures for clients in different industries, data migrations, and designing and implementing dashboards using technologies such as Python and DBT on Azure and GCP platforms. Key Skills the Senior Data Engineer will have: 3+ years' Data Engineering experience. Good experience with both Azure and GCP. Excellent experience of DBT, SQL … data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins @ circlerecruitment.com. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data Implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Circle Recruitment is acting as an Employment Agency in relation to this vacancy.
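For readers unfamiliar with the "medallion structures" this listing mentions, the term refers to layering data as bronze (raw), silver (cleaned) and gold (business-ready) tables. The sketch below illustrates the idea in PySpark purely as an example: the paths and column names are hypothetical, and the role itself references DBT and SQL on Azure and GCP rather than Spark specifically.

```python
# A rough sketch of medallion (bronze -> silver -> gold) layering.
# Table paths and column names are hypothetical; the pattern is the point.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw events landed as-is from the source system.
bronze = spark.read.json("/lake/bronze/orders/")

# Silver: cleaned, deduplicated, typed records.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_total") >= 0)
)
silver.write.mode("overwrite").parquet("/lake/silver/orders/")

# Gold: business-level aggregates ready for dashboards.
gold = (
    silver
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("order_total").alias("daily_revenue"),
         F.countDistinct("customer_id").alias("unique_customers"))
)
gold.write.mode("overwrite").parquet("/lake/gold/daily_revenue/")
```

In a dbt project the same layering is typically expressed as staging, intermediate and mart models rather than hand-written jobs.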
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
week in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … with a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake. Extensive experience with dbt, Airflow, AWS and Terraform. Excellent scripting skills in SQL. Experience developing solutions entirely from scratch. Great communication skills, with the ability to understand and translate complex requirements into technical solutions …
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
Professional Data Engineer certification. Exposure to Agentic AI systems or intelligent/autonomous data workflows. Experience with BI tools such as Looker. Exposure to Databricks, Snowflake, AWS, Azure or DBT. Academic background in Computer Science, Mathematics or a related field. This is an opportunity to work in a forward-thinking environment with access to cutting-edge projects, ongoing learning, and …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
You'll play a key role in scaling analytics infrastructure, optimizing pipelines, and mentoring fellow engineers. Key Responsibilities: Build and optimize data models across bronze to gold layers using dbt and Kimball methodology. Own and manage the semantic layer for BI tools like Looker and Power BI. Implement rigorous data quality and testing frameworks. Drive CI/CD practices with … like GitHub Actions and Terraform. Lead technical decisions and mentor junior engineers. Collaborate across engineering, data science, and product teams to deliver business impact. Skills & Experience: Expert in SQL, dbt, and cloud data warehouses (e.g., BigQuery, Redshift). Strong experience with Airflow, Python, and multi-cloud environments (AWS/GCP). Proven background in designing and scaling analytics solutions in agile environments …
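As an illustration of the "data quality and testing frameworks" responsibility above, here is a minimal, generic sketch of the sort of check a dbt test or CI job might enforce, written in plain Python with pandas. The column names and the failure handling are assumptions for illustration only, not the team's actual framework.

```python
# Minimal sketch of automated data-quality checks of the kind a dbt test
# or CI pipeline might enforce. Column names are hypothetical.
import pandas as pd

def check_not_null(df: pd.DataFrame, column: str) -> list[str]:
    n = int(df[column].isna().sum())
    return [f"{column}: {n} null values"] if n else []

def check_unique(df: pd.DataFrame, column: str) -> list[str]:
    n = int(df[column].duplicated().sum())
    return [f"{column}: {n} duplicate values"] if n else []

def run_checks(df: pd.DataFrame) -> list[str]:
    failures = []
    failures += check_not_null(df, "order_id")
    failures += check_unique(df, "order_id")
    failures += check_not_null(df, "order_date")
    return failures

if __name__ == "__main__":
    orders = pd.DataFrame({
        "order_id": [1, 2, 2, 4],
        "order_date": ["2024-01-01", "2024-01-02", None, "2024-01-03"],
    })
    problems = run_checks(orders)
    for p in problems:
        print("FAILED:", p)
    # In CI, a non-empty failure list fails the build.
    raise SystemExit(1 if problems else 0)
```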
of data from any source — whether databases, applications, or files — into lakehouses like Snowflake, Databricks, and Redshift. With pipelines that just work and features like advanced data transformation using dbt Core and end-to-end pipeline observability, we’re focused on making robust data pipelines accessible to everyone. London (in person). At Etleap, we’re redefining how data teams build … We are looking to add senior engineers to our core engineering team …
experience in data collection, preprocessing, and integration from various sources, ensuring accuracy, consistency, and handling missing values or outliers. Proficient in designing and implementing ELT pipelines using tools like dbt, with strong knowledge of data warehousing, data lake concepts, and data pipeline optimization. Skilled in SQL for data manipulation, analysis, query optimisation, and database design. Artificial Intelligence and Machine Learning …
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS 4-MONTH CONTRACT £450-550 PER DAY OUTSIDE IR35 This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM initiatives. … contractor to accelerate delivery of critical pipelines and platform improvements. THE ROLE You'll join a skilled data team to lead the build and optimisation of scalable pipelines using DBT, Airflow, and Databricks. Working alongside data scientists and ML engineers, you'll support everything from raw ingestion to curated layers powering LLMs and advanced analytics. Your responsibilities will include: Building and maintaining production-grade ETL/ELT workflows with DBT and Airflow. Collaborating with AI/ML teams to support data readiness for experimentation and inference. Writing clean, modular SQL and Python code for use in Databricks. Contributing to architectural decisions around pipeline scalability and performance. Supporting the integration of diverse data sources into the platform. Ensuring data quality, observability, and …
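To make "production-grade ETL/ELT workflows with DBT and Airflow" concrete, a minimal Airflow DAG that orchestrates dbt could look roughly like the sketch below. It assumes Airflow 2.4+ and a dbt project at a hypothetical path; it is not the organisation's actual pipeline.

```python
# Minimal sketch of an Airflow DAG orchestrating dbt. The project path,
# schedule, and target name are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/airflow/dbt/analytics"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # run each morning after source loads
    catchup=False,
    tags=["dbt", "elt"],
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --target prod",
    )
    dbt_run >> dbt_test  # only run tests once the models have built
```

Separating the build and test tasks keeps failures visible in the DAG view and lets tests gate any downstream consumers.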
ensuring quality, cataloging, and discoverability. Enforce data compliance standards, including PII masking, data obfuscation, and RBAC. Oversee the design, development, deployment, and maintenance of data products, including dashboards, reports, dbt models, and machine learning models. Ensure the quality, accuracy, and reliability of data analysis and modelling. Collaborate with the Software Engineering team to build and maintain a robust data infrastructure.
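As a generic illustration of the PII-masking requirement above (not the team's actual compliance tooling), one common approach is salted hashing of identifier columns before data is shared downstream. The column names and salt handling here are hypothetical.

```python
# Generic sketch of irreversible PII masking before data is exposed downstream.
# Column names and the salt handling are hypothetical placeholders.
import hashlib
import pandas as pd

SALT = "replace-with-a-secret-from-your-vault"  # never hard-code a real salt

def mask_value(value: str) -> str:
    """Return a stable, non-reversible token for a PII value."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

def mask_pii(df: pd.DataFrame, pii_columns: list[str]) -> pd.DataFrame:
    masked = df.copy()
    for col in pii_columns:
        masked[col] = masked[col].astype(str).map(mask_value)
    return masked

customers = pd.DataFrame({
    "customer_id": [101, 102],
    "email": ["a@example.com", "b@example.com"],
    "lifetime_value": [1200.0, 340.5],
})
print(mask_pii(customers, ["email"]))
```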
Azure SQL Data Warehouse), Azure Databricks. Experience with Python and Azure Functions to glue together data infrastructure. Strong experience in software engineering fundamentals including testing (e.g. Azure Test Plans, dbt), Git version control, agile development and sharing knowledge via code review, mentoring and documentation (e.g. Confluence, DevOps wiki). It would also be great if you are/have: Microsoft …
Proven ability to design and develop scalable data pipelines. Strong collaboration skills with business stakeholders and cross-functional teams. Experience with infrastructure as code (ideally Terraform). Desirable: Experience using dbt and implementing dbt best practices. Overview: Job Title: Senior Data Engineer – Data Modelling & Engineering. Location: London area. Duration: 6 months. Why Apply? This is a fantastic opportunity to contribute to …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
strategy within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure. Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards. Delivering scalable solutions that support internal use cases and extend directly to … finance, sales) and building tools that serve business needs. Background in startups or scale-ups with high adaptability and a hands-on approach. Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure). Strong communication skills and a track record of credibility in high-pressure or client-facing settings. BENEFITS …
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
data pipelines and systems. Expertise in cloud technologies (AWS) and integrating cloud services into data platforms. Strong proficiency in SQL, Python, and modern data engineering tools such as Snowflake, DBT, and Tableau. Experience with data warehousing, data lakes, and data modelling. In-depth knowledge of best practices for data quality, security, and performance optimisation. Ability to work closely with both …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and naturally curious. Comfortable working across multiple business areas with varied responsibilities. Nice-to-Haves: Exposure to tools like Prefect, Airflow, or Dagster. Familiarity with Azure SQL, Snowflake, or dbt. Tech Stack/Tools: Python, SQL (on-prem + Azure SQL Data Warehouse), Git. Benefits: £35,000 - £40,000 starting salary (up to £45,000 for the right candidate). Discretionary …