their data operations and business intelligence initiatives.

Key Responsibilities:
* Oversee the daily operations of BI data pipelines, reporting platforms, and cloud-based ETL/ELT workflows (AWS, Redshift, Glue, Airflow).
* Develop and maintain monitoring dashboards to track system health, availability, and operational performance.
* Manage and resolve Level 2 incidents.
* Conduct root cause analyses and implement corrective actions to … to coordinate system fixes, enhancements, and improvements.

Essential Skills & Experience:
* A minimum of 3 years in AWS-based BI/Data Engineering production support.
* AWS BI Stack: Redshift, Glue, Airflow, S3, Step Functions.
* Experience in data modeling, ETL pipelines, and reverse engineering.
* Proficiency in Power BI (preferred), Business Objects, or Qlik.
* Strong SQL scripting, optimisation, and troubleshooting capabilities.
* Previous …
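Day-to-day pipeline oversight of this kind is typically scripted rather than done by hand. As a minimal sketch, assuming a boto3 Glue client and a hypothetical job name and region (none of which come from the listing), a failed-run check might look like:

```python
import boto3

# Placeholder region and job name -- neither comes from the listing
glue = boto3.client("glue", region_name="eu-west-1")

# Inspect the most recent runs of a Glue job and surface any failures
runs = glue.get_job_runs(JobName="daily_sales_load", MaxResults=5)
for run in runs["JobRuns"]:
    state = run["JobRunState"]
    if state in ("FAILED", "ERROR", "TIMEOUT"):
        print(run["Id"], state, run.get("ErrorMessage", "no error message"))
```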
Karlsruhe, Baden-Württemberg, Germany Hybrid / WFH Options
Cinemo GmbH
.000 € per year Requirements: Minimum 1 to 2 years of proven experience in ML-Ops, including end-to-end machine learning lifecycle management Familiarity with MLOps tools like MLFlow, Airflow, Kubeflow or custom implemented solutions. Experience designing and managing CI/CD pipelines for machine learning projects with experience in CI/CD tools (e.g., Github actions, Bitbucket Pipelines … workflows Automate repetitive and manual processes involved in machine learning operations to improve efficiency Implement and manage in-cloud ML-Ops solutions, leveraging Terraform for infrastructure as code Technologies: Airflow AWS BitBucket CI/CD Cloud Embedded GitHub Support Kubeflow Machine Learning Mobile Python Terraform C++ DevOps More: Cinemo is a global provider of highly innovative infotainment products that More ❯
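For context on the tooling named above, here is a minimal sketch of MLflow experiment tracking, one common piece of the end-to-end lifecycle management the role describes. The experiment name, toy model, and metric are illustrative placeholders, not taken from the listing:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Hypothetical experiment name -- not from the listing
mlflow.set_experiment("demo-experiment")

X, y = load_iris(return_X_y=True)

with mlflow.start_run():
    # Train a toy model, then record its parameters and metrics
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Persist the model artifact so it can be versioned and served later
    mlflow.sklearn.log_model(model, "model")
```

Runs recorded this way can be compared side by side in the MLflow UI, which is what makes the lifecycle reproducible and auditable.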
Job Title: SQL Developer
Location: Miami, FL - Onsite
Duration: 1 year + renewable

We are looking for a talented SQL Developer who is passionate about data in all its forms. In this role, you will be part of a talented …
London, South East, England, United Kingdom Hybrid / WFH Options
Adecco
streamline data workflows, optimize performance, and ensure reliability across cloud platforms, while guiding teams with strong technical leadership.

Key Responsibilities
* Pipeline Automation: Build and orchestrate data workflows using tools like Airflow, Prefect, or Dagster.
* Containerization: Package and deploy data applications using Docker and Kubernetes (including EKS and AKS).
* CI/CD for Data: Implement and maintain automated pipelines for … Use Terraform and Ansible to provision and manage data infrastructure.
* Performance Optimization: Enhance data processing for speed, scalability, and reliability.

What We're Looking For
* Strong experience with orchestration tools (Airflow, Prefect, Dagster).
* Expertise in Docker and Kubernetes.
* Solid understanding of CI/CD principles and tooling.
* Familiarity with open-source data technologies (Spark, Kafka, PostgreSQL).
* Knowledge of … your application with you before presenting it to any potential employer. Please note we are on the client's supplier list for this position.

Keywords: Lead DataOps Engineer, DataOps, Data Pipeline Automation, Airflow, Prefect, Dagster, Docker, Kubernetes, EKS, AKS, CI/CD, Terraform, Ansible, Grafana, Prometheus, Spark, Kafka, PostgreSQL, Infrastructure as Code, Cloud Data Engineering, Hybrid Working, Security Clearance, Leadership, DevOps, Observability …
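For readers unfamiliar with the orchestration tools named in the listing above, here is a minimal sketch of an Airflow 2.x DAG wiring two Python tasks together. The DAG id, schedule, and task logic are hypothetical, shown only to illustrate what "building and orchestrating data workflows" looks like in practice:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step -- a real task would read from a source system
    return [1, 2, 3]


def transform(**context):
    # Pull the upstream task's output via XCom and apply a trivial transform
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [r * 2 for r in rows]


with DAG(
    dag_id="example_daily_load",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run transform only after extract succeeds
```

Prefect and Dagster express the same dependency graph with decorated Python functions rather than operator objects, but the core idea of the role is the same: declare tasks, their ordering, and their retry behaviour as code.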
City of London, London, United Kingdom Hybrid / WFH Options
Harnham
run seamlessly within an AWS environment. Day to day, you'll:
* Oversee end-to-end data operations, ensuring all loads complete successfully and identifying and resolving workflow issues within Airflow
* Manage and triage incident tickets, investigating discrepancies and implementing fixes to maintain data accuracy and performance
* Drive automation initiatives, helping to reduce manual intervention and improve system reliability
* Collaborate … continuous improvement within a large-scale data transformation programme

Your Skills and Experience
* 5+ years' experience across Data Engineering, BI or Data Operations
* Strong hands-on experience with AWS, Airflow, and Power BI
* Solid understanding of data pipelines, ETL, and debugging within cloud environments
* Strong problem-solving and stakeholder management skills
* Experience managing or working with offshore teams
* A …
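To make the Airflow-facing duties above concrete, the sketch below lists recent failed DAG runs, a typical first step when triaging load failures. It is a minimal sketch, assuming an Airflow 2.x deployment with the stable REST API and basic auth enabled; the webserver URL and credentials are placeholders:

```python
import requests

AIRFLOW_URL = "http://localhost:8080"  # placeholder webserver address
AUTH = ("admin", "admin")              # placeholder basic-auth credentials

# List recent failed DAG runs; "~" is Airflow's wildcard for all DAGs
resp = requests.get(
    f"{AIRFLOW_URL}/api/v1/dags/~/dagRuns",
    params={"state": "failed", "limit": 20},
    auth=AUTH,
)
resp.raise_for_status()

for run in resp.json()["dag_runs"]:
    print(run["dag_id"], run["dag_run_id"], run["state"])
```

A sweep like this suits the "ensure all loads complete successfully" duty: it surfaces failures across every pipeline in one call, after which individual runs can be inspected and cleared for retry.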