including data pipelines, orchestration and modelling. Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, DBT and Looker. Ensure the team follows agile methodologies to improve delivery cadence and responsiveness. Contribute to hands-on coding, particularly in areas … foster team growth and development Strong understanding of the data engineering lifecycle, from ingestion to consumption Hands-on experience with our data stack (Redshift, Airflow, Python, DBT, MongoDB, AWS, Looker, Docker) Understanding of data modelling, transformation, and orchestration best practices Experience delivering both internal analytics platforms and external data …
Our Commodities and Global Markets data analytics team within technology is building a leading data and analytics platform. We partner with traders, quants, desk analysts, and central data teams across the business to build batch and real-time analytics solutions.
We're looking for a Senior Data Engineer to join Pleo and help us on our journey in our Business Analytics team. This team is responsible for delivering and enhancing high-quality, robust data solutions that drive commercial performance, revenue …
Technical requirements: Highly proficient in Python. Experience working with data lakes; experience with Spark, Databricks. Understanding of common data transformation and storage formats, e.g. Apache Parquet. Good understanding of cloud environments (ideally Azure), and workflow management systems (e.g. Dagster, Airflow, Prefect). Follow best practices like code review …
one or multiple systems. You know how to create repeatable and reusable products. Experience with workflow management tools such as Nextflow, WDL/Cromwell, Airflow, Prefect and Dagster Good understanding of cloud environments (ideally Azure), distributed computing and scaling workflows and pipelines Understanding of common data transformation and storage … formats, e.g. Apache Parquet Awareness of data standards such as GA4GH and FAIR. Exposure to genotyping and imputation is highly advantageous Benefits: Competitive base salary Generous Pension Scheme - We invest in your future with employer contributions of up to 12%. 30 Days Holiday + Bank Holidays - Enjoy …
including CI/CD, observability, version control, and testing Architect and evolve our data platform, including data warehousing (Redshift), lakehouse (Databricks), and orchestration tools (Airflow, Step Functions) Lead data governance, cataloging, compliance, and security efforts to ensure trustworthy and well-managed data Essential skills & experience: Proven leadership or management … and implementing scalable data platforms and ETL/ELT pipelines Knowledge of data warehousing, data lake architectures, and orchestration tools like Step Functions and Airflow Experience with infrastructure as code (e.g., Terraform) Understanding of data governance and quality practices Strong communication skills to explain technical concepts and influence stakeholders …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
ADLIB Recruitment
tools and tech. The team are using Databricks and AWS and they're keen for someone who's worked across data warehouse architecture, orchestration tools like Airflow, and configuration-driven development. You'll also work closely with analysts, scientists and other business teams, so you'll need to be able to explain complex … modelling, and ETL/ELT pipelines Experience using tools like Databricks, Redshift, Snowflake, or similar Comfortable working with APIs, CLIs, and orchestration tools like Airflow Confident using Git and familiar with CI/CD processes (Azure DevOps or similar) Experience working in an Agile environment A proactive mindset: you …
Bristol, South West England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
tech. The team are using Databricks and AWS and they're keen for someone who's worked across data warehouse architecture, orchestration tools like Airflow, and configuration-driven development. You'll also work closely with analysts, scientists and other business teams, so you'll need to be able to … modelling, and ETL/ELT pipelines Experience using tools like Databricks, Redshift, Snowflake, or similar Comfortable working with APIs, CLIs, and orchestration tools like Airflow Confident using Git and familiar with CI/CD processes (Azure DevOps or similar) Experience working in an Agile environment A proactive mindset — you …
Akkodis is a global leader in engineering, technology, and R&D, harnessing the power of connected data to drive digital transformation and innovation for a smarter, more sustainable future. As part of the Adecco Group, Akkodis combines the expertise of …
Senior Data Engineer Hybrid working (3 days per week onsite in London) 6 Month contract initially, with good scope for extension Market rates (Umbrella-PAYE) One of our blue chip clients is looking for a Senior Data Engineer to join …
On-site, Hybrid Manchester, Manchester, United Kingdom QA We are Xebia - a place where experts grow. For nearly two decades now, we've been developing digital solutions for clients from many industries and places across the globe. Among the brands …
City of London, London, United Kingdom Hybrid / WFH Options
Nigel Frank International
Lead Data Engineer - Snowflake, DBT, Airflow - London - Up to £100k I'm working with a key client of ours here at TRG who are looking to grow out their Data & Analytics function. My client are globally renowned for being a leader in their field. Whilst they are a … So they offer very flexible working arrangements through both WFH options and flexi working hours. Experience required... Expert in Snowflake Strong DBT experience Strong Airflow experience Expert knowledge and understanding of Data Warehousing Strong AWS experience This is a great opportunity to join an outstanding organisation who pride themselves on … touch ASAP. Send across your CV to (url removed) or alternatively, give me a call on (phone number removed). Keywords: Snowflake, DBT, SQL, Airflow, AWS, Engineer, DWH, Data Warehouse, Data Warehousing, Architecture, London
Experience developing & maintaining production ML services Experience with ad-hoc analytics, data visualisation, and BI tools (Superset, Redash, Metabase) Experience with workflow orchestration tools (Airflow, Prefect) Experience writing data processing pipelines & ETL (Python, Apache Spark) Excellent communication skills and ability to work collaboratively in a team environment Experience …
robust version control Setting up monitoring and alerting frameworks to track model drift, data quality, and inference health Leveraging orchestration tools such as Dagster, Airflow, or Prefect to manage and scale ML workflows Supporting ongoing infrastructure migration or optimisation initiatives (e.g. improving cost efficiency, latency, or reliability) Partnering with … experience deploying ML models into production environments, including both batch and real-time/streaming contexts Proficiency working with distributed computing frameworks such as Apache Spark, Dask, or similar Experience with cloud-native ML deployment, particularly on AWS, using services like ECS, EKS, Fargate, Lambda, S3, and more Familiarity with orchestration and workflow scheduling tools such as Dagster, Airflow, or Prefect Knowledge of CI/CD best practices and tools (e.g. GitHub Actions, Jenkins, CodePipeline) Exposure to monitoring and observability tools for ML systems (e.g. Prometheus, Grafana, DataDog, WhyLabs, Evidently, etc.) Experience in building parallelised or distributed model …
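The "monitoring and alerting frameworks to track model drift" responsibility above can be made concrete with one common drift signal: the Population Stability Index (PSI) between a reference feature distribution and a live one. The sketch below is not from the listing; the data, bin count, and alert thresholds are illustrative assumptions, using only the standard library.

```python
# Minimal sketch of a model-drift check via the Population Stability Index.
# Thresholds and sample data are assumptions for illustration only.
import math


def psi(reference, live, bins=10):
    """Population Stability Index between two 1-D numeric samples."""
    lo = min(min(reference), min(live))
    hi = max(max(reference), max(live))
    width = (hi - lo) / bins or 1.0  # guard against zero range

    def proportions(sample):
        counts = [0] * bins
        for x in sample:
            idx = min(int((x - lo) / width), bins - 1)
            counts[idx] += 1
        # Tiny epsilon keeps log() defined for empty bins.
        return [(c + 1e-6) / (len(sample) + 1e-6 * bins) for c in counts]

    ref_p, live_p = proportions(reference), proportions(live)
    return sum((r - l) * math.log(r / l) for r, l in zip(ref_p, live_p))


if __name__ == "__main__":
    # Rule of thumb: PSI < 0.1 stable; 0.1-0.25 moderate shift; > 0.25 alert.
    ref = [x / 100 for x in range(100)]            # roughly uniform on [0, 1)
    same = [x / 100 for x in range(100)]
    shifted = [0.5 + x / 200 for x in range(100)]  # mass moved to the right
    print(psi(ref, same) < 0.1, psi(ref, shifted) > 0.25)
```

In a real deployment this check would run on a schedule (the Dagster/Airflow/Prefect tools the listing names) and push the PSI value to a metrics backend such as Prometheus for alerting.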
technology infrastructure driving the ecosystem that enables the business to make smarter decisions with data. Our primary technology is Google Cloud Platform supported by Airflow and dbt. As technical lead on the team you'll be working closely with architects, product managers and stakeholders, helping to provide technical knowledge … ecosystem, including maintaining a lakehouse on Google Cloud Platform with BigQuery at its core Deliver operations and orchestration for all things data, specifically on Airflow, Cloud Composer Support integrations with key platforms such as Fivetran, Tableau, and Alation Enhance our architecture, align it with our infosec, multi-cloud, and … BigQuery preferred. Design and deliver solutions on the technologies that drive data at the Guardian. This looks like: Google Cloud Platform: BigQuery, Kubernetes Python, Airflow, & DBT CI/CD & Terraform Nice-to-have: Spark & Scala on DataProc Demonstrated knowledge and experience of key operational concerns, including: Security & Networks Site …
the setup; Sharing, enhancing and upskilling team members on available tools and best practices. What you'll need Strong aptitude with SQL, Python and Airflow; Experience in Kubernetes, Docker, Django, Spark and related monitoring tools for DevOps a big plus (e.g. Grafana, Prometheus); Experience with dbt for pipeline modeling … based pipelines built with dbt on Databricks Analysis via Python Jupyter notebooks PySpark in Databricks workflows for heavy lifting Streamlit and Python for dashboarding Airflow DAGs with Python for ETL running on Kubernetes and Docker Django for custom app/database development Kubernetes for container management, with Grafana/…
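The "Airflow DAGs with Python for ETL" pattern described above boils down to small Python callables chained in dependency order. A hedged sketch, using only plain functions so it stands alone: the names (`meter_id`, `kwh`) and data are invented for illustration, and in a real Airflow deployment each function would be wrapped in a task (e.g. a `PythonOperator` or `@task`-decorated function) inside a DAG file.

```python
# Sketch of the Python callables an Airflow DAG would schedule for a simple
# extract -> transform -> load pipeline. Field names are hypothetical.

def extract() -> list[dict]:
    # Stand-in for reading raw rows from an upstream source (API, S3, DB).
    return [
        {"meter_id": "m1", "kwh": "12.5"},
        {"meter_id": "m2", "kwh": "7.0"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Cast string readings to floats; in a dbt-on-Databricks setup this
    # layer would typically live in dbt models instead.
    return [{"meter_id": r["meter_id"], "kwh": float(r["kwh"])} for r in rows]

def load(rows: list[dict]) -> int:
    # Stand-in for writing to the warehouse; returns the row count loaded.
    return len(rows)

def run_pipeline() -> int:
    # Airflow would express this ordering as: extract >> transform >> load.
    return load(transform(extract()))

if __name__ == "__main__":
    print(run_pipeline())  # number of rows loaded
```

Keeping the business logic in plain functions like this also makes it unit-testable outside the scheduler, which matters when the DAGs run on Kubernetes and Docker as described.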
/Tools we use: Python, Azure (Virtual Machines, Azure Web Apps, Cloud Storage, Azure ML), Anaconda packages, Git, GitHub, GitHub Actions, Terraform, SQL, Artifactory, Airflow, Docker, Kubernetes, Linux/Windows VMs. About You: Hands-on industry experience in some combination of Software Engineering, ML Engineering, Data Science, DevOps, and … necessary. Hands-on industry experience in some combination of the following technologies: Python ecosystem, Azure (VMs, Web Apps, Managed Databases), GitHub Actions, Terraform, Packer, Airflow, Docker, Kubernetes, Linux/Windows VM administration, Shell scripting (primarily Bash, but PowerShell as well). A solid understanding of modern security and networking …
that support core trading, forecasting, risk, and PPA processes across all Octopus international regions. The role leverages a modern tech stack including SQL, Python, Airflow, Kubernetes, and various other cutting-edge technologies. You'll work with tools like dbt on Databricks, PySpark, Streamlit, and Django, ensuring robust data infrastructure … code standardization Take ownership of data platform improvements Share knowledge and upskill team members Requirements for Data Engineer: Strong aptitude with SQL, Python and Airflow Experience in Kubernetes, Docker, Django, Spark and related monitoring tools Experience with dbt for pipeline modelling Ability to shape needs into requirements and design …
World Wide Technology (WWT), a global technology integrator and IT solutions provider. World Wide Technology, established in 1990 in St. Louis, Missouri, collaborates with OEMs like Cisco and Dell EMC to offer infrastructure security and custom app development services to …
engineers, providing mentorship and technical guidance. Taking ownership of the health and stability of a widely used data platform. Managing and improving a heavy Airflow estate within a GCP-based environment. Ensuring robust engineering practices and continuous platform improvement. Tech Stack: GCP and Airflow (essential), dbt, Terraform, Kubernetes …
Revolutionising Healthcare Through Machine Learning. The field of machine learning has advanced far beyond the futuristic ideas portrayed in Sci-Fi films like Robocop and Transformers. Today, machine learning represents a groundbreaking tool in healthcare and pharmaceuticals, offering solutions to …
Machine Learning Engineer Machine learning has come a long way since Sci-Fi films like Terminator and I, Robot got us thinking about robots taking over the world. But in the 2020s, hearing AI and machines in the same sentence means a world …