London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
a London base (flexibility offered) High-impact role with a growing, values-driven data team Platform-focused, mission-led engineering Work with a modern cloud-native stack (Snowflake, DBT, Airflow, Terraform, AWS) What You'll Be Doing Serve as the technical lead for cross-functional data initiatives Define and champion best practices for building scalable, governed, high-quality data … teams - product managers, analysts, ML engineers, and more What You'll Bring Extensive experience designing and building modern data platforms Strong skills in Python, SQL, and tools like DBT, Airflow, Fivetran Expertise in cloud services (ideally AWS) and IaC tools like Terraform Deep understanding of data architecture, ELT pipelines, and governance A background in software engineering principles (CI/… technical and non-technical stakeholders A collaborative mindset and a passion for coaching others Tech Environment Cloud: AWS (Kinesis, Lambda, S3, ECS, etc.) Data Warehouse: Snowflake Transformation & Orchestration: Python, DBT, Airflow IaC & DevOps: Terraform, GitHub Actions, Jenkins Monitoring & Governance: Monte Carlo, Collate Interested? If you're excited about platform-level ownership, technical influence, and building systems that help people tell …
and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving skills and attention to detail Inclusive and curious, continuously seeks to build knowledge …
office) Due to the nature of some of the company's clients, you must have a minimum of 5 years continuous UK residency. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics Are you passionate about building scalable data solutions that drive real … Excellent experience of DBT, SQL and Python Good customer-facing/pitching experience and being a self-sufficient person A proactive mindset with excellent problem-solving skills Experience of Airflow and Medallion is desirable A degree in computer science or a related degree is beneficial Benefits: Company bonus scheme (based on annual profit made by the company) Pension … with data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins @ circlerecruitment.com Circle Recruitment is acting as an Employment Agency in relation to …
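Several of these listings name the Medallion pattern: raw "bronze" data progressively refined into cleaned "silver" and aggregated "gold" layers. A minimal tool-agnostic sketch in plain Python (the field names and cleaning rules here are invented for illustration, not any client's actual pipeline):

```python
# Illustrative medallion-style refinement: bronze (raw) -> silver (cleaned) -> gold (aggregated).
# Field names and validation rules are hypothetical.

def to_silver(bronze_rows):
    """Clean raw rows: drop records missing an id, normalise amounts to floats."""
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in bronze_rows
        if r.get("id") is not None
    ]

def to_gold(silver_rows):
    """Aggregate cleaned rows into a reporting-ready total per id."""
    totals = {}
    for r in silver_rows:
        totals[r["id"]] = totals.get(r["id"], 0.0) + r["amount"]
    return totals

bronze = [
    {"id": "a", "amount": "10.5"},
    {"id": None, "amount": "3.0"},   # malformed row, dropped at the silver layer
    {"id": "a", "amount": "4.5"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'a': 15.0}
```

In a real warehouse (dbt on BigQuery or Snowflake, say) each layer would be a materialised model rather than a function, but the contract is the same: each layer only reads from the one below it.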
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate complex requirements into technical solutions …
and secure data handling Requirements: 5+ years in data engineering or … Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake Proficiency in SQL, Python, and tools like dbt and Airflow Familiarity with DevOps practices in a data context Benefits: Work on impactful, enterprise-wide data projects Collaborate with architects, analysts, and data scientists Be part of a supportive, innovative …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
quality data assets Strong architectural acumen and software engineering fundamentals Experience driving adoption of data governance and improving data platform usage across internal teams Stack including: Snowflake AWS DBT Airflow Python Kinesis Terraform CI/CD tools BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000 Hybrid working: 2 days per week …
concepts to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of our amazing offices …
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and …
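The ETL/ELT responsibilities above amount to composing extract, transform, and load steps into a pipeline. Stripped of the PySpark and Airflow specifics, the shape can be sketched in plain Python (the function names, sample data, and validation rule are hypothetical):

```python
# Minimal ELT-shaped pipeline: each stage is a pure function, so stages can be
# tested in isolation and re-run idempotently - the property orchestrators rely on.

def extract():
    # In practice this would read from a source system (API, object storage, DB).
    return [{"user": "alice", "clicks": "3"}, {"user": "bob", "clicks": "x"}]

def transform(rows):
    # Coerce types and drop rows that fail validation (a simple data-quality gate).
    out = []
    for r in rows:
        try:
            out.append({"user": r["user"], "clicks": int(r["clicks"])})
        except ValueError:
            pass  # a real pipeline would route these to a dead-letter store
    return out

def load(rows, sink):
    # Stand-in for a warehouse write; returns the row count for monitoring.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse)  # 1 [{'user': 'alice', 'clicks': 3}]
```

In the listing's stack, `extract`/`transform`/`load` would each become an Airflow task and the transform would run on Spark, but the separation of concerns is the same.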
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … desire to make a significant impact, we encourage you to apply! Job Responsibilities Data Engineering & Data Pipeline Development Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow Implement real-time and batch data processing using Spark Enforce best practices for data quality, governance, and security throughout the data lifecycle Ensure data availability, reliability and performance through … data processing workloads Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning …
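Both of these listings call out partitioning as a Spark tuning technique. The underlying idea - split rows by a key so each partition can be processed (or skipped) independently - can be shown without a cluster. This is a toy stdlib sketch, not Spark's `partitionBy`, and the key choice is an assumption for illustration:

```python
from collections import defaultdict

def partition_by(rows, key):
    """Group rows by a partition key, mimicking Spark-style partitioning at toy scale."""
    parts = defaultdict(list)
    for r in rows:
        parts[r[key]].append(r)
    return dict(parts)

events = [
    {"country": "UK", "value": 1},
    {"country": "FR", "value": 2},
    {"country": "UK", "value": 3},
]
parts = partition_by(events, "country")
print(sorted(parts))     # ['FR', 'UK']
print(len(parts["UK"]))  # 2
```

A query filtered to one country now only has to touch that partition's rows, which is the same pruning effect a well-chosen partition column gives Spark or a warehouse table.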
and evaluation through continuous monitoring and scaling. Build & Optimise AI models in Python: fine-tune state-of-the-art architectures on our in-house GPU cluster. Orchestrate Workflows with Apache Airflow: schedule, monitor, and maintain complex data and model pipelines. Engineer Cloud Services on AWS (Lambda, ECS/EKS, S3, Redshift, etc.) and automate deployments using GitHub Actions … testing, and monitoring. Startup mindset: proactive, resourceful, ambitious, driven to innovate, eager to learn, and comfortable wearing multiple hats in a fast-moving environment. Desirable: hands-on experience with Apache Airflow, AWS services (especially Redshift, S3, ECS/EKS), and IaC tools like Pulumi. Why Permutable AI? Hybrid Flexibility: Spend 2+ days/week in our Vauxhall hub.
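Airflow models pipelines like the one above as a DAG of dependent tasks: a task runs only once everything upstream has finished. The scheduling idea can be sketched with the standard library's topological sorter (the task names are invented; a real Airflow DAG would use operators instead):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline expressed as task -> set of upstream dependencies,
# mirroring how an Airflow DAG wires tasks together.
dag = {
    "train": {"ingest"},
    "validate": {"ingest"},
    "report": {"train", "validate"},
}

# A valid execution order: every task appears after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # e.g. ['ingest', 'train', 'validate', 'report']
```

The useful property is that "train" and "validate" have no edge between them, so a scheduler is free to run them in parallel - exactly what Airflow's executor does with independent tasks.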
Brighton, Sussex, United Kingdom Hybrid / WFH Options
Burns Sheehan
Lead Data Engineer £75,000-£85,000 | AWS, Python, SQL, Airflow Brighton, hybrid working Analyse customer behaviour using AI & ML We are partnered with a private equity backed company who provide an AI-powered, guided selling platform that helps businesses improve online sales and customer experience. They are looking for a Lead Data Engineer to lead a small team … experience in a Senior Data Engineering role. Comfortable owning and delivering technical projects end-to-end. Strong in Python, SQL, and cloud platforms (AWS or comparable). Experience with Airflow, Snowflake, Docker (or similar). Familiarity with coaching and mentoring more junior engineers, leading 1-1s and check-ins. Wider tech stack: AWS, Python, Airflow, Fivetran, Snowflake … Enhanced parental leave and pay If you are interested in finding out more, please apply or contact me directly! Lead Data Engineer £75,000-£85,000 | AWS, Python, SQL, Airflow Brighton, hybrid working Analyse customer behaviour using AI & ML Burns Sheehan Ltd will consider applications based only on skills and ability and will not discriminate on any grounds.
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
ll be involved in designing and building production-grade ETL pipelines, driving DevOps practices across data systems and contributing to high-availability architectures using tools like Databricks, Spark and Airflow - all within a modern AWS ecosystem. Responsibilities Architect and build scalable, secure data pipelines using AWS, Databricks and PySpark. Design and implement robust ETL/ELT solutions for both … structured and unstructured data. Automate workflows and orchestrate jobs using Airflow and GitHub Actions. Integrate data with third-party APIs to support real-time marketing insights. Collaborate closely with cross-functional teams including Data Science, Software Engineering and Product. Champion best practices in data governance, observability and compliance. Contribute to CI/CD pipeline development and infrastructure automation (Terraform …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards Delivering scalable solutions that support internal use cases and extend directly to … sales) and building tools that serve business needs Background in startups or scale-ups with high adaptability and a hands-on approach Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure) Strong communication skills and a track record of credibility in high-pressure or client-facing settings BENEFITS …
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
engineers Collaborate across engineering, data science, and product teams to deliver business impact Skills & Experience: Expert in SQL, dbt, and cloud data warehouses (e.g., BigQuery, Redshift) Strong experience with Airflow, Python, and multi-cloud environments (AWS/GCP) Proven background in designing and scaling analytics solutions in agile environments Proven experience as an Analytics Engineer Nice to Have: Experience …
PyTorch, TensorFlow, Hugging Face) Proven MLOps, big data, and backend/API development experience Deep understanding of NLP and LLMs Proficient with cloud platforms (AWS/GCP/Azure), Airflow, DBT, Docker/Kubernetes Strong collaboration, problem-solving, and coding best practices Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.
AND EXPERIENCE: The ideal Head of Data Platform will have: Extensive experience with Google Cloud Platform (GCP), particularly BigQuery Proficiency with a modern data tech stack, including SQL, Python, Airflow, dbt, Dataform, Terraform Experience in a mid-to-large sized company within a regulated industry, with a strong understanding of data governance A strategic mindset, leadership skills, and a hands …
New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
hands-on background in data engineering, with 5+ years working on modern data platforms Experience leading cloud data migrations - GCP and BigQuery strongly preferred Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling Excellent understanding of data architecture, governance, and DevOps best practices Proven leadership or team management experience within a regulated or mid-to-large tech …
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
best practices in testing, data governance, and observability. Lead roadmap planning and explore emerging technologies (e.g. GenAI). Ensure operational stability and support incident resolution. Tech Stack Python, SQL, Airflow, AWS, Fivetran, Snowflake, Looker, Docker (You don't need to tick every box - if you've worked with comparable tools, that's great too.) What We're Looking For …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and familiarity with Git Strong communicator, eager to learn, and naturally curious Comfortable working across multiple business areas with varied responsibilities Nice-to-Haves Exposure to tools like Prefect, Airflow, or Dagster Familiarity with Azure SQL, Snowflake, or dbt Tech Stack/Tools Python SQL (on-prem + Azure SQL Data Warehouse) Git Benefits £35,000 - £40,000 starting …
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
contexts. Bonus Experience (Nice to Have) Exposure to large language models (LLMs) or foundational model adaptation. Previous work in cybersecurity, anomaly detection, or behavioural analytics. Familiarity with orchestration frameworks (Airflow or similar). Experience with scalable ML systems, pipelines, or real-time data processing. Advanced degree or equivalent experience in ML/AI research or applied science. Cloud platform …
Cloud usage VMWare usage Technical Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (scaled agile) Processes Data Integration Focused: Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Apache NiFi, Airbyte, and Singer Message Brokers and streaming data processors like Apache Kafka Object Storage solutions such as S3, MinIO, LakeFS CI/CD …
VMWare General/Usage Technical Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (scaled agile) Processes Data Integration Focused Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Spark, NiFi, Airbyte and Singer. Message Brokers, streaming data processors, such as Apache Kafka Object Storage, such as S3, MinIO, LakeFS CI/CD Pipeline, Integration …
London, South East, England, United Kingdom Hybrid / WFH Options
Cathcart Technology
with new methodologies to enhance the user experience. Key skills: ** Senior Data Scientist experience ** Commercial experience in Generative AI and recommender systems ** Strong Python and SQL experience ** Spark/Apache Airflow ** LLM experience ** MLOps experience ** AWS Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working …
Basingstoke, Hampshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure) Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python) Strong understanding of software design patterns including microservices, cloud-native, and OO design Eligible and willing to undergo high-level security clearance