London, South East, England, United Kingdom (Hybrid / WFH Options)
Adecco
…streamline data workflows, optimize performance, and ensure reliability across cloud platforms, while guiding teams with strong technical leadership.

Key Responsibilities
* Pipeline Automation: Build and orchestrate data workflows using tools like Airflow, Prefect, or Dagster.
* Containerization: Package and deploy data applications using Docker and Kubernetes (including EKS and AKS).
* CI/CD for Data: Implement and maintain automated pipelines for …
* … Use Terraform and Ansible to provision and manage data infrastructure.
* Performance Optimization: Enhance data processing for speed, scalability, and reliability.

What We're Looking For
* Strong experience with orchestration tools (Airflow, Prefect, Dagster).
* Expertise in Docker and Kubernetes.
* Solid understanding of CI/CD principles and tooling.
* Familiarity with open-source data technologies (Spark, Kafka, PostgreSQL).
* Knowledge of …

…your application with you before presenting it to any potential employer. Please note we are on the client's supplier list for this position.

Keywords: Lead DataOps Engineer, DataOps, Data Pipeline Automation, Airflow, Prefect, Dagster, Docker, Kubernetes, EKS, AKS, CI/CD, Terraform, Ansible, Grafana, Prometheus, Spark, Kafka, PostgreSQL, Infrastructure as Code, Cloud Data Engineering, Hybrid Working, Security Clearance, Leadership, DevOps, Observability
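For context on the pipeline-automation responsibility above, here is a minimal sketch of the kind of Airflow DAG such a role builds and orchestrates. It assumes Airflow 2.4 or later; the DAG name, schedule, and the extract/load callables are hypothetical illustrations, not details from the listing.

```python
# Minimal illustrative Airflow DAG; all names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from an upstream source.
    print("extracting records")


def load(**context):
    # Placeholder: write processed records to the warehouse.
    print("loading records")


with DAG(
    dag_id="example_daily_load",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # the 'schedule' argument requires Airflow 2.4+
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract must finish before load starts
```

The same two-step flow could equally be expressed in Prefect or Dagster; the concepts the advert asks for (schedules, retries, task dependencies) carry over between orchestrators.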
City of London, London, United Kingdom (Hybrid / WFH Options)
Harnham
…run seamlessly within an AWS environment. Day to day, you'll:
* Oversee end-to-end data operations, ensuring all loads complete successfully and identifying and resolving workflow issues within Airflow
* Manage and triage incident tickets, investigating discrepancies and implementing fixes to maintain data accuracy and performance
* Drive automation initiatives, helping to reduce manual intervention and improve system reliability
* Collaborate … continuous improvement within a large-scale data transformation programme

Your Skills and Experience
* 5+ years' experience across Data Engineering, BI or Data Operations
* Strong hands-on experience with AWS, Airflow, and Power BI
* Solid understanding of data pipelines, ETL, and debugging within cloud environments
* Strong problem-solving and stakeholder management skills
* Experience managing or working with offshore teams
* A …
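To illustrate the day-to-day triage described above, here is a small sketch that lists recent failed runs of one DAG through the Airflow 2.x stable REST API so they can be investigated. The webserver address, credentials, and DAG name are assumptions for the example, not details from the listing.

```python
# Hypothetical triage helper: list recent failed runs of a DAG
# via the Airflow 2.x stable REST API.
import requests

AIRFLOW_URL = "http://localhost:8080"  # assumed webserver address
DAG_ID = "example_daily_load"          # hypothetical DAG name
AUTH = ("admin", "admin")              # assumed basic-auth credentials


def failed_runs(dag_id: str, limit: int = 10) -> list[dict]:
    """Return up to `limit` failed runs of `dag_id`."""
    resp = requests.get(
        f"{AIRFLOW_URL}/api/v1/dags/{dag_id}/dagRuns",
        params={"state": "failed", "limit": limit},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["dag_runs"]


if __name__ == "__main__":
    for run in failed_runs(DAG_ID):
        print(run["dag_run_id"], run["state"])
```

From there, a failed run would typically be cleared and re-run once the underlying discrepancy is fixed.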