West Midlands, United Kingdom Hybrid / WFH Options
Experis
… data pipelines within enterprise-grade on-prem systems.

Key Responsibilities:
- Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure.
- Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing.
- Develop robust data engineering solutions using Python for automation and transformation.
- Collaborate with infrastructure and analytics teams to support … platform.
- Ensure compliance with enterprise security and data governance standards.

Required Skills & Experience:
- Minimum 5 years of experience in Hadoop and data engineering.
- Strong hands-on experience with Python, Apache Airflow, and Spark Streaming.
- Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments.
- Exposure to data analytics, preferably involving infrastructure or operational data.
- Experience …
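For a concrete flavour of the stack this role describes, here is a minimal sketch of an Airflow DAG submitting a PySpark job to an on-prem YARN cluster. The DAG id, application path, connection id, and queue are all hypothetical, not taken from the ad.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

# Illustrative defaults; retry policy would follow the team's standards.
default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="onprem_spark_pipeline",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    default_args=default_args,
    catchup=False,
) as dag:
    # Submit a PySpark job to the on-prem YARN cluster; the application
    # path, connection id, and queue are assumptions for this sketch.
    ingest = SparkSubmitOperator(
        task_id="ingest_to_hdfs",
        application="/opt/jobs/ingest_events.py",
        conn_id="spark_yarn",
        conf={"spark.yarn.queue": "etl"},
    )
```

A true Spark Streaming job would usually run long-lived outside the scheduler, with Airflow orchestrating the surrounding batch and housekeeping tasks.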
… Maintenance - Implement robust logging, alerting, and performance monitoring for integrations. Continuous Improvement - Champion enhancements to integration architectures and best practices. Skills & Experience Required: Experience with workflow orchestration tools (e.g., Apache Airflow). Proven track record in backend development (e.g., Node.js, Python, Java). Strong knowledge of API design, integration methods (REST, Webhooks, GraphQL), and authentication protocols (OAuth2, JWT) …
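As a hedged sketch of the kind of integration endpoint this role involves, the snippet below shows a webhook receiver that validates a JWT bearer token before accepting the payload. The route, secret source, and claim names are assumptions for illustration.

```python
import os

import jwt  # PyJWT
from fastapi import FastAPI, Header, HTTPException, Request

app = FastAPI()
SECRET = os.environ["WEBHOOK_JWT_SECRET"]  # assumed to be provisioned securely

@app.post("/webhooks/orders")  # hypothetical route
async def receive_order_event(request: Request, authorization: str = Header(...)):
    # Validate the bearer token before trusting the payload.
    token = authorization.removeprefix("Bearer ")
    try:
        claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        raise HTTPException(status_code=401, detail="invalid token")
    payload = await request.json()
    # Hand off to downstream processing and log for monitoring (not shown).
    return {"status": "accepted", "issuer": claims.get("iss")}
```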
… including LoRA, QLoRA, and parameter-efficient methods
- Multi-modal AI systems combining text, image, and structured data
- Reinforcement Learning from Human Feedback (RLHF) for model alignment

Production ML Systems:
- Apache Airflow/Dagster for ML workflow orchestration and ETL pipeline management
- Model versioning and experiment tracking (MLflow, Weights & Biases)
- Real-time model serving and edge deployment strategies
…
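A minimal sketch of the experiment-tracking side mentioned above, using MLflow's Python API; the experiment name, parameters, and metric values are purely illustrative.

```python
import mlflow

# Names, parameters, and values below are illustrative.
mlflow.set_experiment("llm-finetune-lora")

with mlflow.start_run(run_name="qlora-r16"):
    mlflow.log_params({"method": "qlora", "lora_rank": 16, "learning_rate": 2e-4})
    # ... training loop would run here ...
    mlflow.log_metric("eval_loss", 0.42)          # placeholder value
    mlflow.log_artifact("adapter_config.json")    # assumed output file
```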
… with Kimball methodology, dimensional modelling, and star schema design. Proven experience with Redshift or Snowflake. Strong background in cloud-based data environments (AWS preferred). Hands-on experience with Airflow for orchestration. (Nice-to-have) Python for data engineering tasks. (Nice-to-have) Optimisation for BI tools such as Power BI or Looker. Soft skills: strong collaboration with both …
Reading, Berkshire, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
… (e.g., Git), and strong problem-solving skills are vital. The ability to communicate technical concepts clearly to non-technical stakeholders is also important. Experience with orchestration tools such as Airflow, knowledge of Python or other scripting languages, and familiarity with data governance and security best practices are desirable. Exposure to agile development environments, performance tuning of data queries, and …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
McGregor Boyall
… Transformers, PyTorch, YOLO, OpenCV, and Pillow. Designing and deploying AI services via FastAPI, AWS Lambda, EKS/ECS, and S3. Creating scalable data pipelines with Pandas, NumPy, SQLAlchemy, and Airflow. Supporting real-time model inference, edge deployments, and API integrations. Working closely with a technical lead in a service-based and serverless architecture. Essential Python Engineer skills and experience …
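As a rough sketch of the "scalable data pipelines with Pandas and SQLAlchemy" part of this role, the snippet below reads from a relational source, aggregates, and writes back; the DSN and table names are invented for illustration.

```python
import pandas as pd
from sqlalchemy import create_engine

# DSN and table names are invented for this sketch.
engine = create_engine("postgresql+psycopg2://user:pass@db-host/analytics")

raw = pd.read_sql("SELECT id, ts, value FROM raw_readings", engine,
                  parse_dates=["ts"])
daily = (
    raw.assign(day=raw["ts"].dt.date)          # bucket readings by calendar day
       .groupby("day", as_index=False)["value"]
       .mean()
)
daily.to_sql("daily_readings", engine, if_exists="replace", index=False)
```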
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… ideally within consumer tech, logistics, or e-commerce. Strong proficiency in Python, SQL, and machine learning frameworks. Experience with cloud platforms (Azure, AWS, GCP) and tools like Databricks, DBT, Airflow, or Terraform. Familiarity with AI/ML applications and modern analytics tooling. Excellent communication skills and ability to work independently in a fast-paced environment. Why Join? Be part …
… Google Cloud Platform (GCP) and cloud data tools. Background in CRM systems and customer data structures. Understanding of data warehousing concepts and cloud architecture. Experience with ETL tools and frameworks: Airflow, Git, CI/CD pipelines. Data insights reporting experience. Competent with real-time data processing and streaming technologies. Proficiency in Tableau or other data visualisation tools is desirable. ** PLEASE …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
McGregor Boyall
… systems. Deploy scalable AI systems using AWS (Lambda, S3, SQS, EKS/ECS), CDK, and modern DevOps practices. Collaborate on infrastructure and data pipelines using SQLAlchemy, Boto3, Pandas, and Airflow. Contribute to real-time AI services, model versioning, and advanced fine-tuning (LoRA, QLoRA, etc.). AI Engineer requirements: Solid Python skills (3.9+) with deep knowledge of async …
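To make the AWS side concrete, here is a hedged sketch of a Boto3 consumer pulling inference jobs from SQS; the region, queue URL, and message shape are assumptions.

```python
import json

import boto3

# Region, queue URL, and message shape are assumptions.
sqs = boto3.client("sqs", region_name="eu-west-2")
QUEUE_URL = "https://sqs.eu-west-2.amazonaws.com/123456789012/inference-jobs"

resp = sqs.receive_message(
    QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
)
for msg in resp.get("Messages", []):
    job = json.loads(msg["Body"])
    # ... run inference against the object referenced by job["s3_key"] ...
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```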
… Background in high-volume, distributed processing systems. Familiarity with DevOps tools such as GitHub Actions or Jenkins. Solid grounding in modern engineering principles and full-stack development. Bonus Skills: Airflow, Kafka/Kafka Connect, Delta Lake, JSON/XML/Parquet/YAML, cloud-based data services. Why Apply? Work for a global payments innovator shaping the future of …
My client within venture capital is looking for a data engineer to join their team. The role will involve maintaining and modernising data pipelines.

Requirements:
- Python
- Azure, CI/CD
- Airflow, DBT
- Docker
- GitHub, Azure DevOps
- LLM experience desirable

Contract: 12 months
Rate: £500-600 via umbrella
Location: London - 3 days per week in the office

If this role is …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… Experience in ELT processes and best practices for cloud-based data warehousing. Knowledge of performance tuning techniques for optimising BigQuery queries and costs. Familiarity with cloud services (GCP, Terraform, Airflow, etc.) and their integration with BigQuery. HOW TO APPLY: Please register your interest by sending your CV via the apply link on this page.
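One concrete cost-tuning technique the ad alludes to is dry-running a query to estimate bytes scanned before paying for it. A minimal sketch with the google-cloud-bigquery client follows; the project, dataset, and query are illustrative.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are already configured

# Dry-run the query to see bytes scanned (and hence cost) without executing it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
sql = "SELECT user_id, SUM(amount) FROM `proj.ds.orders` GROUP BY user_id"
job = client.query(sql, job_config=job_config)
print(f"Estimated bytes processed: {job.total_bytes_processed:,}")
```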
Telford, Shropshire, West Midlands, United Kingdom
LA International Computer Consultants Ltd
… with cloud platforms (AWS, Azure) and container orchestration (Kubernetes, Docker). Familiarity with GitLab CI/CD, Infrastructure as Code (IaC), and automated deployment pipelines. Knowledge of scheduling tools such as Airflow. Strong documentation and communication skills. Ability to work collaboratively across multidisciplinary teams. Due to the nature and urgency of this post, candidates holding or who have held high-level …
… in Python, ensuring scalability and reliability. Extract data from multiple external sources via APIs and, where necessary, web scraping/browser automation (Playwright, Selenium, Puppeteer). Orchestrate pipelines using Airflow, and manage data quality workflows. Model and transform data in SQL and Snowflake to create clean, analytics-ready datasets. Ensure data quality, observability, and governance across workflows. Collaborate closely …

… who bring: Strong hands-on experience with Python for API ingestion, pipeline automation, and data transformation. Solid SQL skills with Snowflake (or similar cloud data warehouses). Experience with Airflow or other orchestration tools. Knowledge of data modelling, warehouse performance optimisation, and governance. Cloud experience (AWS preferred; Terraform/Docker a plus). Nice-to-have: browser automation/…
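A compressed sketch of the ingestion-to-warehouse loop described here: pull a page from an external API and land the rows in Snowflake. The endpoint, credentials, and table are placeholders, not details from the ad.

```python
import requests
import snowflake.connector

# Endpoint, credentials, and table are placeholders for this sketch.
resp = requests.get("https://api.example.com/v1/listings",
                    params={"page": 1}, timeout=30)
resp.raise_for_status()
rows = [(r["id"], r["price"], r["updated_at"]) for r in resp.json()["results"]]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="RAW", schema="LISTINGS",
)
with conn.cursor() as cur:
    cur.executemany(
        "INSERT INTO raw_listings (id, price, updated_at) VALUES (%s, %s, %s)",
        rows,
    )
conn.close()
```

At volume you would stage files and use COPY INTO rather than row inserts; the point here is only the shape of the loop.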
Data Engineer (Snowflake & Airflow)
London/Edinburgh
6 Month Contract - Via Umbrella

Our UK-leading banking client is looking for a Data Engineer to join their team on an initial 6-month contract.

Key Skills:
- Snowflake
- Agile methodology and working within small multiskilled feature teams
- Comfortable with Git and version control
- Knowledge of older coding standards and practices
- Airflow …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… methodology and a Medallion architecture (bronze, silver, gold layers). Develop and maintain DBT projects and configure incremental loads with built-in unit testing. Support data pipeline orchestration with Airflow and work with AWS cloud tools. Help deliver a production-ready Data Mart with star schema design to power business reporting and dashboards (Power BI experience a plus). Skills … Experience: Strong SQL expertise and hands-on experience with DBT. Familiarity with Kimball dimensional modelling concepts. Experience working with cloud data warehouses such as Redshift or Snowflake. Knowledge of Airflow for workflow management. Comfortable in AWS environments and data orchestration. Bonus: Python programming skills and familiarity with dashboarding tools.

Contract Details:
- Duration: 3 months
- Rate: £450/day, onsite …
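As a sketch of how the Airflow/DBT orchestration described above might be wired, the DAG below shells out to dbt for the run and test steps; the project path and schedule are assumptions.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Project path and schedule are assumptions for this sketch.
with DAG(
    dag_id="dbt_medallion_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test",
    )
    dbt_run >> dbt_test  # run models, then validate with tests
```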