Engineer, Data Manager, or similar leadership role. Strong proficiency in Python (or Scala/Java) and SQL. Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of …
and able to contribute quickly in a fast-moving environment. Nice to have: Experience with Power BI or other data visualisation tools. Familiarity with orchestration tools such as Airflow, Prefect, or Dagster. Understanding of CI/CD practices in data and analytics engineering. Knowledge of data governance, observability, and security best practices in cloud environments. …
performance in production. Automation Architecture: deep knowledge of automation tools including GitHub Actions, Terraform, and Ansible; experience with business process automation (RPA) tools like Appian; workflow orchestration experience (Airflow, Prefect); ability to build custom automation frameworks using Python or similar languages. Full-Stack Development: solid software engineering background with proficiency in Python, JavaScript/TypeScript, Java, or Go; experience with …
data warehousing and transformation. Strong SQL skills and understanding of modern data architecture principles. Hands-on experience with Tableau for enterprise-grade dashboard development. Familiarity with orchestration tools (e.g., Prefect, Airflow), data quality frameworks, and metadata tools. Proficiency in Git, CI/CD, and scripting with Python. Excellent communication skills and ability to work collaboratively across technical and business teams.
years' experience as a Data Engineer. Strong SQL and Python skills. Proven experience building modern ETL/ELT pipelines (dbt experience ideal). Experience with data orchestration tools (Prefect preferred). Understanding of data modelling, especially event-driven architectures. Knowledge of modern data engineering development practices. Nice to have: Background in InsurTech/FinTech or regulated industries. Experience with Docker, containerisation, and …
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
data workflows, optimize performance, and ensure reliability across cloud platforms, while guiding teams with strong technical leadership.

Key Responsibilities
* Pipeline Automation: Build and orchestrate data workflows using tools like Airflow, Prefect, or Dagster.
* Containerization: Package and deploy data applications using Docker and Kubernetes (including EKS and AKS).
* CI/CD for Data: Implement and maintain automated pipelines for data applications.
* … Terraform and Ansible to provision and manage data infrastructure.
* Performance Optimization: Enhance data processing for speed, scalability, and reliability.

What We're Looking For
* Strong experience with orchestration tools (Airflow, Prefect, Dagster).
* Expertise in Docker and Kubernetes.
* Solid understanding of CI/CD principles and tooling.
* Familiarity with open-source data technologies (Spark, Kafka, PostgreSQL).
* Knowledge of Infrastructure as …

… application with you before presenting it to any potential employer. Please note we are on the client's supplier list for this position.

Keywords: Lead DataOps Engineer, DataOps, Data Pipeline Automation, Airflow, Prefect, Dagster, Docker, Kubernetes, EKS, AKS, CI/CD, Terraform, Ansible, Grafana, Prometheus, Spark, Kafka, PostgreSQL, Infrastructure as Code, Cloud Data Engineering, Hybrid Working, Security Clearance, Leadership, DevOps, Observability, Monitoring.
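The workflow-orchestration responsibility above can be sketched in plain Python, with no particular library assumed: tasks declare dependencies and a scheduler runs them in topological order. Real roles would use Airflow, Prefect, or Dagster for this; the task names and data here are hypothetical placeholders.

```python
# Minimal, library-free sketch of DAG-style workflow orchestration.
# Tools like Airflow/Prefect/Dagster provide this plus retries, scheduling,
# and observability; this only illustrates dependency-ordered execution.
from graphlib import TopologicalSorter

results = {}

def extract():
    # Placeholder for pulling raw records from a source system
    results["extract"] = [1, 2, 3]

def transform(multiplier=10):
    # Placeholder transformation step over the extracted records
    results["transform"] = [r * multiplier for r in results["extract"]]

def load():
    # Placeholder for writing to a warehouse; records the row count
    results["load"] = len(results["transform"])

# DAG: each task name maps to the set of tasks it depends on
dag = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

# Run every task after all of its dependencies have completed
for name in TopologicalSorter(dag).static_order():
    tasks[name]()

print(results["load"])  # 3 rows "loaded"
```

In a real orchestrator the DAG dictionary becomes decorated task definitions, and the scheduler, not a for-loop, decides when each node runs.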
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
needed).

The Role
Build and optimise scalable data pipelines for a new Data Lake
Work with Python, SQL, Spark, AWS, Docker, CI/CD
Orchestrate workflows using Airflow, Prefect, or Dagster
Support ETL, API integrations, and high-quality data validation
Translate business needs into technical solutions with full ownership

What You Need
STEM degree (Computer Science, Engineering, Maths, etc. … years in data engineering/analytics/software
Strong Python + SQL
2+ years ETL, APIs, CI/CD, Docker, AWS
Ideally, but not essential: experience with Airflow/Prefect/Dagster
Background in Financial Services, FinTech, Insurance, PE/VC, or Banking

Package & Benefits
High performance bonus
25 days holiday
Pension & private healthcare
Join at inception - shape the team