London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
for improved performance, cost efficiency, and scalability.
* Troubleshoot and resolve data pipeline issues swiftly and effectively across the data platform.
* Work with orchestration tools such as Airflow, ADF, or Prefect to schedule and automate workflows.
* Keep abreast of industry trends and emerging technologies in data engineering, and continuously improve your skills and knowledge.

Profile
* Minimum 3 years' experience working as …
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
data workflows, optimize performance, and ensure reliability across cloud platforms, while guiding teams with strong technical leadership.

Key Responsibilities
* Pipeline Automation: Build and orchestrate data workflows using tools like Airflow, Prefect, or Dagster.
* Containerization: Package and deploy data applications using Docker and Kubernetes (including EKS and AKS).
* CI/CD for Data: Implement and maintain automated pipelines for data applications.
… Terraform and Ansible to provision and manage data infrastructure.
* Performance Optimization: Enhance data processing for speed, scalability, and reliability.

What We're Looking For
* Strong experience with orchestration tools (Airflow, Prefect, Dagster).
* Expertise in Docker and Kubernetes.
* Solid understanding of CI/CD principles and tooling.
* Familiarity with open-source data technologies (Spark, Kafka, PostgreSQL).
* Knowledge of Infrastructure as …
… application with you before presenting it to any potential employer. Please note we are on the client's supplier list for this position.

Keywords: Lead DataOps Engineer, DataOps, Data Pipeline Automation, Airflow, Prefect, Dagster, Docker, Kubernetes, EKS, AKS, CI/CD, Terraform, Ansible, Grafana, Prometheus, Spark, Kafka, PostgreSQL, Infrastructure as Code, Cloud Data Engineering, Hybrid Working, Security Clearance, Leadership, DevOps, Observability, Monitoring. …
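The orchestration tools named in this listing (Airflow, Prefect, Dagster) all share the same core idea: run a pipeline's tasks as a dependency graph (DAG), so each step only executes once everything it depends on has finished. A minimal stdlib-only sketch of that idea, using a toy extract/transform/load pipeline (this is illustrative only, not any of these tools' real APIs):

```python
from graphlib import TopologicalSorter

# Toy pipeline steps; in a real orchestrator each would be a task/op.
def extract():
    return {"rows": 3}

def transform():
    return "transformed"

def load():
    return "loaded"

# Dependency map: task name -> set of task names it depends on.
dag = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

def run_pipeline(dag, tasks):
    """Run tasks in topological (dependency) order, as a scheduler would."""
    order = list(TopologicalSorter(dag).static_order())
    results = {name: tasks[name]() for name in order}
    return order, results

order, results = run_pipeline(
    dag, {"extract": extract, "transform": transform, "load": load}
)
print(order)  # extract runs before transform, which runs before load
```

Real orchestrators add what this sketch omits: scheduling, retries, parallel execution, and observability, which is why the listing pairs them with monitoring tools like Grafana and Prometheus.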
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
needed).

The Role
* Build and optimise scalable data pipelines for a new Data Lake
* Work with Python, SQL, Spark, AWS, Docker, CI/CD
* Orchestrate workflows using Airflow, Prefect, or Dagster
* Support ETL, API integrations, and high-quality data validation
* Translate business needs into technical solutions with full ownership

What You Need
* STEM degree (Computer Science, Engineering, Maths, etc. … years in data engineering/analytics/software
* Strong Python + SQL
* 2+ years ETL, APIs, CI/CD, Docker, AWS
* Experience with Airflow/Prefect/Dagster ideal but not essential
* Background in Financial Services, FinTech, Insurance, PE/VC, or Banking

Package & Benefits
* High performance bonus
* 25 days holiday
* Pension & private healthcare
* Join at inception - shape the team
…
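This role asks for "high-quality data validation" within ETL. A common pattern is rule-based record checking before loading data downstream; a minimal stdlib-only sketch (the field names and rules below are hypothetical, not from the listing):

```python
def validate_row(row, rules):
    """Return a list of validation errors for one record; empty means valid."""
    errors = []
    for field, check, message in rules:
        if field not in row or not check(row[field]):
            errors.append(f"{field}: {message}")
    return errors

# Hypothetical rules for a financial-services feed.
RULES = [
    ("trade_id", lambda v: isinstance(v, str) and v != "",
     "must be a non-empty string"),
    ("amount", lambda v: isinstance(v, (int, float)) and v > 0,
     "must be a positive number"),
]

good = {"trade_id": "T-1", "amount": 100.0}
bad = {"trade_id": "", "amount": -5}
print(validate_row(good, RULES))  # []
print(validate_row(bad, RULES))   # one error per failed rule
```

In practice a pipeline would route failing rows to a quarantine table or dead-letter queue rather than silently dropping them, so data-quality issues stay visible.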