e.g. Databricks, Snowflake)
Collaborating with multidisciplinary teams to deliver real business value
What we're looking for:
- Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow
- Proven background in data modelling, warehousing, and performance optimisation
- Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.)
- A consultancy mindset – adaptable, collaborative, and delivery-focused
excellence across your team. Lead by example — fostering collaboration, accountability, and agile delivery in every sprint.
🧠 What You Bring
- Expertise in AWS: Hands-on experience with Python, Glue, S3, Airflow, dbt, Redshift, and RDS.
- Proven success in end-to-end data engineering — from ingestion to insight.
- Strong leadership and communication skills, with a collaborative, solution-driven mindset.
- Experience working …
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting.
- Direct experience with Google Cloud Platform, BigQuery, and associated tooling.
- Experience with workflow tools like Airflow or Kubeflow.
- Familiarity with dbt (Data Build Tool).
Please send your CV for more information on these roles.
Reasonable Adjustments: Respect and equality are core values to us. …
Manchester Area, United Kingdom Hybrid / WFH Options
Morson Edge
engineers, analysts and client teams to deliver value-focused data solutions
What We're Looking For:
- Strong experience with Python, SQL, Spark and pipeline tools such as dbt or Airflow
- Hands-on experience with at least one major cloud platform (AWS, Azure, or GCP)
- A solid grasp of data modelling, data warehousing and performance optimisation
- Passion for data quality …
Liverpool, England, United Kingdom Hybrid / WFH Options
Altech Group Ltd
data landscape. Explore new data sources and technologies to enhance the team's capabilities.
Why This Role Stands Out
- Work with a cutting-edge modern data stack: Snowflake, dbt, Airflow, and Azure.
- Be part of a small team where, due to its size, you will have huge ownership.
- Opportunity to work on varied datasets and apply your web scraping …
GCP) and data lake technologies (e.g., S3, ADLS, HDFS).
- Expertise in containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of MLOps frameworks and tools (e.g., MLflow, Kubeflow, Airflow).
- Experience with real-time streaming architectures (e.g., Kafka, Kinesis).
- Familiarity with Lambda and Kappa architectures for data processing.
- Enable integration capabilities for external tools to perform ingestion …
Solid understanding of data modeling concepts (star/snowflake schemas, normalization).
- Experience with version control systems (e.g., Git) and CI/CD practices.
- Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus.
About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable …
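The star/snowflake schemas mentioned above organise a warehouse around a central fact table of measurements joined to descriptive dimension tables. A minimal sketch using SQLite, where all table and column names are hypothetical and purely illustrative:

```python
import sqlite3

# Minimal star schema: one fact table referencing one dimension table.
# Table and column names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (10, 1, 10.0), (11, 1, 4.5), (12, 2, 30.0);
""")

# Typical star-schema query: aggregate the facts, grouped by a
# dimension attribute reached through a single join.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)  # [('books', 14.5), ('games', 30.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalised into sub-dimension tables.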
inclusive and collaborative culture, encouraging peer-to-peer feedback and evolving healthy, curious and humble teams.
Tech Stack: Python, JavaScript/TypeScript, React/React Native, AWS, GraphQL, Snowflake, Airflow, DDD.
This is an incredible opportunity for a Senior Software Engineer to join a unique company as they embark on a period of significant growth to take their fantastic …
Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a …
stacks and cloud technologies (AWS, Azure, or GCP). Expert knowledge of Python and SQL. Hands-on experience with Data Architecture, including:
- Cloud platforms and orchestration tools (e.g. Dagster, Airflow)
- AI/MLOps: model deployment, monitoring, lifecycle management
- Big Data Processing: Spark, Databricks, or similar
- Bonus: Knowledge Graph engineering, graph databases, ontologies
Located in London. And ideally you... Are …
City of London, London, United Kingdom Hybrid / WFH Options
Higher - AI recruitment
technology stack
- Python and associated ML/DS libraries (scikit-learn, NumPy, LightGBM, Pandas, TensorFlow, etc.)
- PySpark
- AWS cloud infrastructure: EMR, ECS, Athena, etc.
- MLOps/DevOps: Terraform, Docker, Airflow, MLflow, New Relic
The interview process
- Recruiter Call (30 minutes)
- Meeting a Machine Learning Manager (30 minutes)
- Technical Interview with 2 x Engineers (90 mins)
- Final Interview with the Head …
designing and implementing robust, scalable, and efficient data systems that power analytics, machine learning models, and business insights. The ideal candidate will have expertise in data pipeline orchestration (e.g., Airflow), data lake and warehouse architecture and development, infrastructure as code (IaC) using Terraform, and data extraction from both structured and unstructured data sources (e.g. websites). Knowledge using the … and machine learning workflows.
- Design and implement scalable, secure, and high-performance data lake and data warehouse solutions.
- Pipeline Orchestration: Develop, monitor, and optimize ETL/ELT workflows using Apache Airflow. Ensure data pipelines are robust, error-tolerant, and scalable for real-time and batch processing.
- Data Scraping & Unstructured Data Processing: Develop and maintain scalable web scraping solutions to …
- … or a related field; or equivalent professional experience.
- Experience: 5+ years of experience in data engineering or a related field. Strong expertise in data pipeline orchestration tools such as Apache Airflow. Proven track record of designing and implementing data lakes and warehouses (experience with Azure is a plus). Demonstrated experience with Terraform for infrastructure provisioning and …
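Orchestration tools like Airflow model a pipeline as a directed acyclic graph (DAG) of tasks executed in dependency order. The toy sketch below illustrates that idea in plain Python using the standard library's graphlib; it is not Airflow's API, and the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Toy ETL "DAG": each task maps to the set of tasks it depends on.
# Airflow expresses the same shape with operators and the >> syntax;
# these task names are made up for illustration.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

def run(dag):
    """Execute tasks in dependency order, like a single-threaded scheduler."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # real schedulers would dispatch work here
    return order

order = run(dag)
```

A real scheduler adds what the toy omits: retries, backfills, parallel execution of independent branches, and failure isolation so one bad task does not take down the whole pipeline.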
Hybrid) (3 days onsite & 2 days remote)
Role Type: 6 Months Contract with possibility of extensions
Mandatory Skills: Python, Postgres SQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience
Job Description: Data Engineer (Python enterprise developer): 6+ years of experience …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
data storytelling and operational insights. Optimise data workflows across cloud and on-prem environments, ensuring performance and reliability.
Skills & Experience:
- Strong experience in ETL pipeline development using tools like Apache Airflow, Informatica, or similar.
- Advanced SQL skills and experience with large-scale relational and cloud-based databases.
- Hands-on experience with Tableau for data visualisation and dashboarding.
- Exposure …
experience in a commercial environment, working on AI/ML applications
- Multi-cloud exposure (Azure/AWS/GCP)
- Some of the following: PyTorch, GPT/BERT, RAG, Apache Airflow, Power Automate, Azure Logic Apps, RPA/Zapier, Hugging Face, LangChain...
- Background in Data Science or Software Engineering
The values and ethos of this business: Innovation with real …
United Kingdom, Wolstanton, Staffordshire Hybrid / WFH Options
Uniting Ambition
Senior Data Engineer - FinTech Unicorn - Python, SQL, dbt, Airflow, AWS/GCP
OB have partnered with a UK FinTech Unicorn whose Data function is undergoing rapid growth, and who in turn are looking to grow their Data team with 2 highly skilled Data Engineers. You'll be working on shaping the company's Data function, driving Data best practices, and …
than reactive.
Tech Stack
- Cloud: AWS
- Infrastructure/Access Management: Terraform
- Data Platform: Snowflake
- Data Integration Tool: Fivetran
- Data Transformation Tool: dbt Core
- Data Orchestration Tool: MWAA (Airflow managed by AWS)
- CI/CD Pipelines: GitHub Actions
- Programming Languages: SQL, Python and Terraform
For more information on how we process your personal data, please refer to HCLTech …
in a modern, cloud-first environment with real autonomy — designing and building data pipelines, improving infrastructure, and driving innovation across the data platform.
Tech Stack:
🔹 Snowflake
🔹 Python (FastAPI, Pydantic)
🔹 Airflow/Dagster
🔹 AWS + Terraform
🔹 CI/CD (GitHub Actions)
(Bonus: Kafka, dbt/SQLMesh, data quality & governance tools)
What's on Offer:
💰 Up to £110,000 + annual …