… with a track record of leading effective agile and lean software teams. You have a strong background in DevOps, deploying, managing and maintaining services using Airflow, Docker, Terraform and the AWS CLI to achieve infrastructure-as-code and automated deployments. You have excellent knowledge of AWS services (ECS, IAM, EC2, S3 … DynamoDB, MSK). Our Technology Stack: Python and Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; DBT; AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake; Elasticsearch and DynamoDB. More information: Enjoy fantastic perks …
… automation. Working alongside experienced senior engineers, you'll bring a data engineering mindset to the team, building sophisticated systems that parallel orchestration tools like Airflow or Temporal. Rather than creating individual pipelines, you'll develop the frameworks and tools that allow users to create their own pipelines efficiently, while … the opportunity to work on cloud infrastructure, whether it be AWS, Azure or GCP. You've got experience with orchestration frameworks such as Temporal, Airflow or Dagster. You've had the opportunity to be, and enjoyed being, part of a fast-paced and growing software engineering company. You're not …
Chantilly, Virginia, United States Hybrid / WFH Options
Aerospace Corporation
… any domain. Experience developing and deploying ML applications/solutions. Experience with at least two MLOps and data engineering tools (e.g., MLflow, Kubeflow, Neptune, Airflow, DVC). Software development skills in at least two different programming languages (e.g. Python, R, C/C++), including familiarity with machine learning and deep … with big data frameworks (Hadoop, Spark, Flink, etc.). Experience with ML lifecycle management tools (MLflow, Kubeflow, etc.). Familiarity with data pipelining and streaming technologies (Apache Kafka, Apache NiFi, etc.). Demonstrated contributions to open-source software repositories (GitHub, Kaggle, etc.). Experience deploying ML models on cloud platforms (AWS, Azure …
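By way of illustration of the ML lifecycle tooling named above, here is a minimal experiment-tracking sketch using the MLflow Python API. The experiment name, model and parameters are hypothetical stand-ins, not anything from the listing.

```python
# Minimal MLflow experiment-tracking sketch (hypothetical experiment/model names).
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-classifier")  # hypothetical experiment name

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)
    mlflow.log_params(params)  # record hyperparameters against this run
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # persist the trained model artifact
```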
London, South East England, United Kingdom Hybrid / WFH Options
Aventis Solutions
… services experience is desired but not essential. API development (FastAPI, Flask). Tech stack: Azure, Python, Databricks, Azure DevOps, ChatGPT, Groq, Cursor AI, JavaScript, SQL, Apache Spark, Kafka, Airflow, Azure ML, Docker, Kubernetes and many more. Role Overview: We are looking for someone who is as comfortable developing AI …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
What will you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement Infrastructure … Integrate cloud-based data services with data lakes and warehouses. Build and automate CI/CD pipelines with Jenkins, GitLab CI/CD, or Apache Airflow. Develop automated test suites for data pipelines, ensuring data quality and transformation integrity. Monitoring & Performance Optimization: Monitor data pipelines with tools like Prometheus …
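For a sense of the real-time pipelines this role describes, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and lands it as Parquet. It assumes the spark-sql-kafka connector is on the classpath, and the broker address, topic name and paths are placeholders.

```python
# Minimal Kafka -> Spark Structured Streaming sketch (placeholder broker/topic/paths).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-events-to-parquet").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://data-lake/raw/events/")         # placeholder sink
    .option("checkpointLocation", "s3a://data-lake/chk/")  # required for recovery
    .start()
)
query.awaitTermination()
```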
Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
… data. Libraries & Tools: Terraform, Flask, Pandas, FastAPI, Dagster, GraphQL, SQLAlchemy, GitLab, Athena. Your Trusted Companions: Docker, Snowflake, MongoDB, Relational Databases (e.g. MySQL, PostgreSQL), Dagster, Airflow/Luigi, Spark, Kubernetes. Your AWS Kingdom: Lambda, Redshift, EC2, ELB, IAM, RDS, Route53, S3, the building blocks of cloud mastery. Your Philosophy: Continuous … or machine learning libraries like NumPy, matplotlib, seaborn, scikit-learn, etc. Experience in building ETL/ELT processes and data pipelines with platforms like Airflow, Dagster, or Luigi. What's important for us: Academically Grounded: Bachelor's or Master's degree in Computer Science, Data Engineering, or related field. …
… companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship, alone or in collaboration with a Project Manager. … Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You're proficient in SQL and Python, using them to transform and optimize data like a pro; You know … at DATAPAO, meaning that you'll get access to Databricks' public and internal courses to learn all the tricks of Distributed Data Processing, MLOps, Apache Spark, Databricks, and Cloud Migration from the best. Additionally, we'll pay for various data & cloud certifications, and you'll get dedicated time for learning …
… Engineering languages such as Python, Scala, Java, SQL, and a proven ability to learn new programming languages. Experience with workflow orchestration tools such as Airflow. Deep understanding of end-to-end pipeline design and implementation. Attention to detail and quality, with excellent problem-solving and interpersonal skills. Preferred Qualifications: … audience. Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc. Innate curiosity about consumer behavior and technology. Experience with event messaging frameworks like Apache Kafka. A fan of movies and television is a strong plus. Required Education: Bachelor's degree in Computer Science, Information Systems, Software, Electrical or …
… of 5). Salary: £130,000 – £150,000 + benefits. Location: West London - Hybrid (3 days p/w in-office). Tech: AWS, Snowflake, Airflow, DBT, Python. The Company: Immersum have engaged with a leading PropTech company on a mission to revolutionise how the property sector understands people, places … teams. Leading a small team of 5 data engineers. What you'll bring: Strong leadership experience in data engineering. Deep expertise with AWS, Snowflake, Airflow, and DBT. A pragmatic, product-first approach to building data systems. Excellent communication and stakeholder management skills. Solid understanding of agile data development lifecycles. …
Northern Ireland, United Kingdom Hybrid / WFH Options
Ocho
… data integration pipelines • Collaborate with cross-functional teams on architecture and system design • Interface with frontend engineers (Vue.js stack) and data platform engineers (Snowflake, Airflow) • Help define and implement CI/CD pipelines and observability practices • Support a feedback-driven culture focused on iterative delivery and continuous improvement • Communicate … environments • Excellent communication skills and ability to work across regions. Desirable Experience: • Experience with time-series data and cost intelligence platforms • Familiarity with Snowflake, Airflow, DBT, or other data platform technologies • Background in data-centric product development or FinOps tooling • Exposure to ML-powered querying, analytics, or optimization • Experience …
We're looking for a Senior Data Engineer to join Pleo and help us on our journey in our Business Analytics team. This team is responsible for delivering and enhancing high-quality, robust data solutions that drive commercial performance, revenue …
Technical requirements: Highly proficient in Python. Experience working with data lakes; experience with Spark, Databricks. Understanding of common data transformation and storage formats, e.g. Apache Parquet. Good understanding of cloud environments (ideally Azure) and workflow management systems (e.g. Dagster, Airflow, Prefect). Follow best practices like code review …
… one or multiple systems. You know how to create repeatable and reusable products. Experience with workflow management tools such as Nextflow, WDL/Cromwell, Airflow, Prefect and Dagster. Good understanding of cloud environments (ideally Azure), distributed computing, and scaling workflows and pipelines. Understanding of common data transformation and storage … formats, e.g. Apache Parquet. Awareness of data standards such as GA4GH and FAIR. Exposure to genotyping and imputation is highly advantageous. Benefits: Competitive base salary. Generous Pension Scheme - we invest in your future with employer contributions of up to 12%. 30 Days Holiday + Bank Holidays - enjoy …
… native tools and platforms. What You'll Be Doing: Build and optimise ETL/ELT pipelines across structured and unstructured data sources using Airbyte, Airflow, DBT Core, and AWS Glue. Design and maintain dimensional models in Snowflake, including SCDs and best practices for indexing, clustering, and performance. Collaborate cross … serve analytics. Implement best practices in data governance, including data quality checks, lineage tracking, and anomaly detection. Automate data orchestration using tools such as Airflow, Lambda, or Step Functions. Support financial and operational reporting through snapshot tables and audit-friendly data structures. What We're Looking For: Strong understanding … transformation and query tuning. Deep knowledge of Snowflake including optimisation, cost management, and architecture. Experience with modern data stacks, especially DBT Core, Airbyte, and Airflow. Familiarity with AWS data services (e.g., S3, Lambda, Step Functions). Proven ability to support scalable reporting frameworks and drive data reliability. Bonus Points For: …
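To ground the orchestration responsibilities above, here is a minimal Airflow 2.x DAG sketch in which a daily extract task is followed by a dbt build. The schedule, project path and task bodies are hypothetical, not taken from the listing.

```python
# Minimal Airflow DAG sketch: daily extract step followed by a dbt build
# (schedule, project path and task names are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_to_stage():
    # Placeholder: pull from a source system and land files in the staging area.
    print("extracting source data to stage")


with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_stage)
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/project && dbt build",  # hypothetical project path
    )
    extract >> dbt_build  # run the dbt build only after the extract succeeds
```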
Greater Bristol Area, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
… tech. The team are using Databricks and AWS and they're keen for someone who's worked across data warehouse architecture, orchestration tools like Airflow, and configuration-driven development. You'll also work closely with analysts, scientists and other business teams, so you'll need to be able to … modelling, and ETL/ELT pipelines. Experience using tools like Databricks, Redshift, Snowflake, or similar. Comfortable working with APIs, CLIs, and orchestration tools like Airflow. Confident using Git and familiar with CI/CD processes (Azure DevOps or similar). Experience working in an Agile environment. A proactive mindset: you …
Senior Data Engineer. Hybrid working (3 days per week onsite in London). 6-month contract initially, with good scope for extension. Market rates (Umbrella-PAYE). One of our blue-chip clients is looking for a Senior Data Engineer to join …
… quality of the work you deliver. Furthermore, you have experience in: working with AWS; developing applications in a Kubernetes environment; developing batch jobs in Apache Spark (PySpark or Scala) and scheduling them in an Airflow environment; developing streaming applications for Apache Kafka in Python or Scala; working …
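To make the Spark-plus-Airflow requirement concrete, here is a minimal PySpark batch job sketch of the kind an Airflow task might submit via spark-submit. The paths and column names (created_at, country, amount) are placeholders invented for the example.

```python
# Minimal PySpark batch job sketch (placeholder input/output paths and columns),
# the kind of job an Airflow task might submit via spark-submit.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders-rollup").getOrCreate()

orders = spark.read.parquet("s3a://data-lake/raw/orders/")  # placeholder input

daily_totals = (
    orders
    .withColumn("order_date", F.to_date("created_at"))  # hypothetical timestamp column
    .groupBy("order_date", "country")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

daily_totals.write.mode("overwrite").parquet("s3a://data-lake/curated/daily_orders/")
spark.stop()
```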
City of London, London, United Kingdom Hybrid / WFH Options
Nigel Frank International
Lead Data Engineer - Snowflake, DBT, Airflow - London - Up to £100k. I'm working with a key client of ours here at TRG who are looking to grow out their Data & Analytics function. My client is globally renowned for being a leader within their respective field. Whilst they are a … So they offer very flexible working arrangements through both WFH options and flexi working hours. Experience required: Expert in Snowflake. Strong DBT experience. Strong Airflow experience. Expert knowledge and understanding of Data Warehousing. Strong AWS experience. This is a great opportunity to join an outstanding organisation who pride themselves on … touch ASAP. Send your CV across to (url removed) or, alternatively, give me a call on (phone number removed). Keywords: Snowflake, DBT, SQL, Airflow, AWS, Engineer, DWH, Data Warehouse, Data Warehousing, Architecture, London.
… flows through the pipeline. Collaborate with research to define data quality benchmarks. Optimize end-to-end performance across distributed data processing frameworks (e.g., Apache Spark, Ray, Airflow). Work with infrastructure teams to scale pipelines across thousands of GPUs. Work directly with the leadership on the … optimizing classifiers. Experience managing large-scale datasets and pipelines in production. Experience in managing and leading small teams of engineers. Expertise in Python, Spark, Airflow, or similar data frameworks. Understanding of modern infrastructure: Kubernetes, Terraform, object stores (e.g. S3, GCS), and distributed computing environments. Strong communication and leadership skills. …
… robust version control. Setting up monitoring and alerting frameworks to track model drift, data quality, and inference health. Leveraging orchestration tools such as Dagster, Airflow, or Prefect to manage and scale ML workflows. Supporting ongoing infrastructure migration or optimisation initiatives (e.g. improving cost efficiency, latency, or reliability). Partnering with … experience deploying ML models into production environments, including both batch and real-time/streaming contexts. Proficiency working with distributed computing frameworks such as Apache Spark, Dask, or similar. Experience with cloud-native ML deployment, particularly on AWS, using services like ECS, EKS, Fargate, Lambda, S3, and more. Familiarity … with orchestration and workflow scheduling tools such as Dagster, Airflow, or Prefect. Knowledge of CI/CD best practices and tools (e.g. GitHub Actions, Jenkins, CodePipeline). Exposure to monitoring and observability tools for ML systems (e.g. Prometheus, Grafana, DataDog, WhyLabs, Evidently, etc.). Experience in building parallelised or distributed model …
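As one flavour of the model-drift monitoring mentioned above, here is a minimal sketch that compares a reference feature distribution against recent inference inputs with a two-sample Kolmogorov-Smirnov test. The data and alert threshold are illustrative, not a production design.

```python
# Minimal feature-drift check sketch: compare training-time ("reference") values
# against recent inference inputs with a two-sample KS test.
# The data and alert threshold are illustrative.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(seed=7)
reference = rng.normal(loc=0.0, scale=1.0, size=5_000)  # stand-in for training data
recent = rng.normal(loc=0.4, scale=1.0, size=1_000)     # stand-in for live traffic

statistic, p_value = ks_2samp(reference, recent)

ALERT_P_VALUE = 0.01  # illustrative alerting threshold
if p_value < ALERT_P_VALUE:
    print(f"drift suspected: KS={statistic:.3f}, p={p_value:.2e}")
else:
    print(f"no drift detected: KS={statistic:.3f}, p={p_value:.2e}")
```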
London, South East England, United Kingdom Hybrid / WFH Options
Immersum
Title: Data Scientist. Salary: £60,000-£67,500 + benefits. Location: West London - Hybrid (3 days p/w in-office). Tech: AWS, Snowflake, Airflow, DBT. The Company: Immersum are supporting the growth of a leading PropTech company on a mission to revolutionise how the property sector understands people, places, and … ML models. Strong analytical background with applied stats, EDA, and model validation techniques. Confidence working with structured data pipelines and modern tooling (AWS, Snowflake, Airflow, DBT). Curiosity for emerging techniques and an eagerness to learn and innovate. Excellent communication skills, especially when simplifying complex findings for non-technical teams. …