techniques You might also have: Familiarity with data visualization tools and libraries (e.g., Power BI) Background in database administration or performance tuning Familiarity with data orchestration tools, such as Apache Airflow Previous exposure to big data technologies (e.g., Hadoop, Spark) for large-scale data processing Experience with ServiceNow integration More ❯
with experience building pipelines and ETL processes. Hands-on experience with GCP (BigQuery, Cloud Functions, Dataflow). Knowledge of CRM systems and customer data flows. Familiarity with ETL frameworks (Airflow) and version control (Git/CI/CD). Experience working cross-functionally with Data Analysts, CRM, and operational teams. Desirable (not essential!): GCP certification (e.g. Professional Data Engineer More ❯
Cloud Platform (GCP) and cloud data tools Background in CRM systems and customer data structures Understanding of data warehousing concepts and cloud architecture Experience with ETL tools and frameworks: Airflow, Git, CI/CD pipelines Data insights reporting experience Competent with real-time data processing and streaming technologies Proficiency in Tableau or other data visualisation tools is desirable ** PLEASE More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
the role you will have: Creating scalable ETL jobs using Scala and Spark Strong understanding of data structures, algorithms, and distributed systems Experience working with orchestration tools such as Airflow Familiarity with cloud technologies (AWS or GCP) Hands-on experience with Gen AI tools for coding and debugging This is a remote-first role with flexibility to work from More ❯
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
best practices in testing, data governance, and observability. Lead roadmap planning and explore emerging technologies (e.g. GenAI). Ensure operational stability and support incident resolution. Tech Stack: Python, SQL, Airflow, AWS, Fivetran, Snowflake, Looker, Docker (You don't need to tick every box - if you've worked with comparable tools, that's great too.) What We're Looking For More ❯
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment Ltd
Bonus skills Experience in any of the following is a plus: Python, Scala, or data processing frameworks Event-driven or distributed system architectures Big data tools (e.g., Hadoop, HDFS, Airflow) Linux and containerisation Why join? This is an exciting opportunity to be at the forefront of large-scale, cloud-first transformation efforts. You'll help shape platforms that power More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
contexts. Bonus Experience (Nice to Have) Exposure to large language models (LLMs) or foundational model adaptation. Previous work in cybersecurity, anomaly detection, or behavioural analytics. Familiarity with orchestration frameworks (Airflow or similar). Experience with scalable ML systems, pipelines, or real-time data processing. Advanced degree or equivalent experience in ML/AI research or applied science. Cloud platform More ❯
our platform. Take ownership of specific ML features or components and drive them from concept to production and iteration. Work with leading tools such as GCP Vertex AI, BigQuery, Airflow, and emerging ML technologies. Be a part of a friendly, diverse, innovative, international team and workplace that encourages learning and growth. Who you are: Experience working in a Data More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
datasets into actionable insights for the business. In this role, you will: Develop and maintain a cloud-based data warehouse (BigQuery, GCP) Create and optimise ETL processes (SSIS, Talend, Airflow) Collaborate with BI teams to deliver key performance metrics Ensure data quality, security, and cost-efficient storage What you'll bring: Strong SQL and data modelling skills Data warehouse More ❯
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
McGregor Boyall
systems. Deploy scalable AI systems using AWS (Lambda, S3, SQS, EKS/ECS), CDK, and modern DevOps practices. Collaborate on infrastructure and data pipelines using SQLAlchemy, Boto3, Pandas, and Airflow. Contribute to real-time AI services, model versioning, and advanced fine-tuning (LoRA, QLoRA, etc.). AI Engineer requirements: Solid Python skills (3.9+) with deep knowledge of async More ❯
Background in high-volume, distributed processing systems. Familiarity with DevOps tools such as GitHub Actions or Jenkins. Solid grounding in modern engineering principles and full-stack development. Bonus Skills: Airflow, Kafka/Kafka Connect, Delta Lake, JSON/XML/Parquet/YAML, cloud-based data services. Why Apply? Work for a global payments innovator shaping the future of More ❯
My client within venture capital is looking for a data engineer to join their team. The role will involve maintaining and modernising data pipelines. Requirements: Python Azure, CI/CD Airflow, DBT Docker GitHub, Azure DevOps LLM experience desirable Contract: 12 months Rate: £500-600 via umbrella Location: London - 3 days per week in the office. If this role is More ❯
ideally with some prior management or lead responsibility. A real passion for coaching and developing engineers. Hands-on experience with their tech stack - any cloud, Snowflake (or equivalent), Python, Airflow, Docker Ability to juggle multiple products and effectively gather requirements. Experience with real-time data products is a big plus. Strong communication skills and a good academic background. HOW More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Experience in ELT processes and best practices for cloud-based data warehousing. Knowledge of performance tuning techniques for optimising BigQuery queries and costs. Familiarity with cloud services (GCP, Terraform, Airflow, etc.) and their integration with BigQuery. HOW TO APPLY Please register your interest by sending your CV via the apply link on this page. More ❯
sunbury, south east england, united kingdom Hybrid / WFH Options
bp
focus on practical application and performance trade-offs. Experience with cloud platforms (e.g., AWS, Azure) and containerisation (Docker, Kubernetes). Familiarity with MLOps practices and tools (e.g., MLflow, SageMaker, Airflow). Experience working with large-scale datasets and distributed computing frameworks (e.g., Spark). Strong communication skills and ability to work collaboratively in a team environment. MSc or PhD More ❯
Guildford, Surrey, United Kingdom Hybrid / WFH Options
BP Energy
focus on practical application and performance trade-offs. Experience with cloud platforms (e.g., AWS, Azure) and containerisation (Docker, Kubernetes). Familiarity with MLOps practices and tools (e.g., MLflow, SageMaker, Airflow). Experience working with large-scale datasets and distributed computing frameworks (e.g., Spark). Strong communication skills and ability to work collaboratively in a team environment. MSc or PhD More ❯
london (city of london), south east england, united kingdom
JSS Search
optimisation. This is an ideal role for someone looking to hit the ground running, work on complex challenges with autonomy. Role Requirements: Exceptional ability with tools such as Python, Airflow, SQL, and at least one Cloud provider Experience forecasting, customer, and propensity models Experience with building machine learning models and deploying at scale 2:1 or above in Mathematics More ❯
data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake) Expert knowledge of Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize More ❯