Watford, Hertfordshire, East Anglia, United Kingdom
Addition+
… their data operations and business intelligence initiatives.
Key Responsibilities:
- Oversee the daily operations of BI data pipelines, reporting platforms, and cloud-based ETL/ELT workflows (AWS, Redshift, Glue, Airflow).
- Develop and maintain monitoring dashboards to track system health, availability, and operational performance.
- Manage and resolve Level 2 incidents.
- Conduct root cause analyses and implement corrective actions to … to coordinate system fixes, enhancements, and improvements.
Essential Skills & Experience:
- A minimum of 3 years in AWS-based BI/Data Engineering production support.
- AWS BI Stack: Redshift, Glue, Airflow, S3, Step Functions.
- Experience in data modeling, ETL pipelines, and reverse engineering.
- Proficiency in Power BI (preferred), Business Objects, or Qlik.
- Strong SQL scripting, optimisation, and troubleshooting capabilities.
- Previous …
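To make the stack in this listing concrete, below is a minimal sketch of the shape of pipeline such a support role would monitor: an Airflow DAG that chains an AWS Glue job into a Redshift load. It assumes Airflow 2.4+ with the amazon provider package installed, and every name in it (daily_sales_etl, sales_transform, the reporting.sales table, the etl-output bucket) is hypothetical, not taken from the listing.

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

# Hypothetical daily BI pipeline: run a Glue ETL job, then load its S3 output into Redshift.
with DAG(
    dag_id="daily_sales_etl",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = GlueJobOperator(
        task_id="run_glue_job",
        job_name="sales_transform",  # hypothetical Glue job
    )
    load = S3ToRedshiftOperator(
        task_id="load_to_redshift",
        schema="reporting",          # hypothetical target schema/table
        table="sales",
        s3_bucket="etl-output",      # hypothetical bucket/prefix
        s3_key="sales/",
        copy_options=["CSV"],
    )
    transform >> load                # the Glue job must succeed before the Redshift load runs

A failed run of either task is exactly the kind of Level 2 incident the responsibilities above describe: the monitoring dashboards surface the failure, and root cause analysis starts from the task logs.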
London, South East, England, United Kingdom (Hybrid / WFH options)
Adecco
… streamline data workflows, optimize performance, and ensure reliability across cloud platforms, while guiding teams with strong technical leadership.
Key Responsibilities:
* Pipeline Automation: Build and orchestrate data workflows using tools like Airflow, Prefect, or Dagster.
* Containerization: Package and deploy data applications using Docker and Kubernetes (including EKS and AKS).
* CI/CD for Data: Implement and maintain automated pipelines for …
* … Use Terraform and Ansible to provision and manage data infrastructure.
* Performance Optimization: Enhance data processing for speed, scalability, and reliability.
What We're Looking For:
* Strong experience with orchestration tools (Airflow, Prefect, Dagster).
* Expertise in Docker and Kubernetes.
* Solid understanding of CI/CD principles and tooling.
* Knowledge of …
… your application with you before presenting it to any potential employer. Please note we are on the client's supplier list for this position.
Keywords: Lead DataOps Engineer, DataOps, Data Pipeline Automation, Airflow, Prefect, Dagster, Docker, Kubernetes, EKS, AKS, CI/CD, Terraform, Ansible, Grafana, Prometheus, Spark, Kafka, PostgreSQL, Infrastructure as Code, Cloud Data Engineering, Hybrid Working, Security Clearance, Leadership, DevOps, Observability
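As a concrete illustration of the orchestration work this listing describes, here is a minimal Prefect flow with a retrying extract step. It is a sketch under the assumption of Prefect 2.x; every function name and value in it is hypothetical, and a real pipeline would pull from Kafka or S3 and write to PostgreSQL or a warehouse rather than printing.

from prefect import flow, task

@task(retries=3, retry_delay_seconds=60)
def extract() -> list[dict]:
    # Hypothetical extraction step; retries cover transient source outages.
    return [{"id": 1, "value": 42}, {"id": 2}]

@task
def transform(records: list[dict]) -> list[dict]:
    # Keep only well-formed records.
    return [r for r in records if "value" in r]

@task
def load(records: list[dict]) -> None:
    # Hypothetical load step standing in for a PostgreSQL/warehouse write.
    print(f"loaded {len(records)} records")

@flow(log_prints=True)
def daily_pipeline():
    load(transform(extract()))

if __name__ == "__main__":
    daily_pipeline()  # local run; in production this would be deployed on a schedule

The same three-step shape maps directly onto the CI/CD bullet above: the flow is version-controlled, tested in a pipeline, and deployed as a containerized unit.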
- Support A/B testing, funnel analysis, and data modelling to enhance performance
- Contribute to the evolution of the company's data warehouse and pipelines (experience with dbt or Airflow a plus)
- Collaborate with product, marketing, and commercial teams to translate data into actionable recommendations
- Communicate insights clearly across teams to influence business outcomes
Role Requirements:
- Strong technical skills … in SQL and dashboarding tools (Looker Studio/BigQuery)
- Experience with A/B testing, funnel analysis, and data modelling
- Familiarity with data warehouse concepts and pipeline development (dbt, Airflow experience advantageous)
- Ability to work collaboratively across multiple teams and communicate insights effectively
- Proactive, detail-oriented, and able to drive impact in a high-growth environment
The Company
You …
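Since this role leans on A/B testing, a minimal sketch of the underlying check may help: a pooled two-proportion z-test on conversion counts, using only the Python standard library. The traffic and conversion numbers below are invented for illustration.

import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the z statistic comparing two conversion rates (pooled)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up example: variant B converts 260/2000 vs. variant A's 220/2000.
z = two_proportion_z(220, 2000, 260, 2000)
# Two-sided p-value from the standard normal CDF via the error function.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"z = {z:.2f}, p = {p_value:.3f}")  # p below 0.05 would suggest a real difference

In practice the same comparison is usually run per funnel stage, which is where the funnel analysis and dashboarding skills in the requirements come in.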