variety of tool sets and data sources. Data Architecture experience with and understanding of data lakes, warehouses, and/or streaming platforms. Data Engineering experience with tooling, such as Apache Spark and Kafka, and orchestration tools like Apache Airflow or equivalent. Continuous Integration/Continuous Deployment experience with CI/CD tools like Jenkins or GitLab tailored More ❯
streaming architectures, to support advanced analytics, AI, and business intelligence use cases. Proven experience in designing architectures for structured, semi-structured, and unstructured data, leveraging technologies like Databricks, Snowflake, Apache Kafka, and Delta Lake to enable seamless data processing and analytics. Hands-on experience in data integration, including designing and optimising data pipelines (batch and streaming) and integrating cloud-based platforms (e.g., Azure Synapse, AWS Redshift, Google BigQuery) with legacy systems, ensuring performance and scalability. Deep knowledge of ETL/ELT processes, leveraging tools like Apache Airflow, dbt, or Informatica, with a focus on ensuring data quality, lineage, and integrity across the data lifecycle. Practical expertise in data and AI governance, including implementing frameworks for data privacy More ❯
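The ETL/ELT and data-quality skills this listing asks for can be made concrete with a minimal batch pipeline. This is a hedged sketch in plain Python, with sqlite3 standing in for a warehouse; tools like Airflow or dbt orchestrate the same extract/transform/load steps at scale, and all table names, fields, and sample records below are invented for illustration.

```python
import sqlite3

# Extract: rows as they might arrive from a source system (hypothetical data).
raw_rows = [
    {"order_id": 1, "amount": "19.99", "country": "GB"},
    {"order_id": 2, "amount": "5.00",  "country": "gb"},
    {"order_id": 3, "amount": None,    "country": "FR"},  # bad record
]

def transform(rows):
    """Clean types, normalise values, and enforce a simple quality rule."""
    clean, rejected = [], []
    for r in rows:
        if r["amount"] is None:  # quality check: amount must be present
            rejected.append(r)   # kept aside for a lineage/quality log
            continue
        clean.append((r["order_id"], float(r["amount"]), r["country"].upper()))
    return clean, rejected

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
clean, rejected = transform(raw_rows)
load(conn, clean)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total, "rejected:", len(rejected))
```

The same three stages map directly onto the batch half of the batch-and-streaming pipelines the role describes; in an ELT variant the transform step would run inside the warehouse instead.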
Greater London, England, United Kingdom Hybrid / WFH Options
FENESTRA
Lead code reviews and technical design discussions, providing expert guidance Optimize systems for speed, scalability, and reliability through sophisticated engineering approaches Oversee the evolution of our data infrastructure including Airflow/Google Cloud Composer environment Apply rigorous system design thinking to create elegant, efficient technical solutions Commercial & Client Engagement Represent Fenestra's technical capabilities to clients, partners, and at … production-quality code Very deep proficiency in Python and modern software engineering practices Advanced SQL knowledge with experience in data modelling and optimization Extensive experience with ETL frameworks, particularly Apache Airflow (3+ years) Strong understanding of CI/CD, Infrastructure as Code (Terraform), and containerization Experience designing and optimizing big data processing systems particularly using BigQuery/GCP More ❯
Candidates must hold active SC Clearance. £75,000 - £85,000 Remote - with occasional client visits Skills: Python and SQL. Apache Airflow for DAG orchestration and monitoring. Docker for containerisation. AWS data services: Redshift, OpenSearch, Lambda, Glue, Step Functions, Batch. CI/CD pipelines and YAML-based configuration More ❯
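Since this role centres on Airflow DAG orchestration, it may help to recall what a DAG run fundamentally does: execute tasks in dependency order. The sketch below uses only the standard-library `graphlib` (Python 3.9+) to show that ordering; in Airflow the same dependencies would be declared with operators and the `>>` syntax, and the task names here are made up.

```python
from graphlib import TopologicalSorter

# Task dependencies, as an Airflow DAG would declare them:
# extract >> transform >> quality_check >> load.
dag = {
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load']
```

Airflow layers scheduling, retries, and monitoring on top of exactly this kind of topological ordering, which is why dependency declarations, not execution order, are what a DAG author writes.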
databases, data warehousing, and big data (Hadoop, Spark). Proficient in Python, Java, or Scala with solid OOP and design pattern understanding. Expertise in ETL tools and orchestration frameworks (Airflow, Apache NiFi). Hands-on experience with cloud platforms (AWS, Azure, or GCP) and their data services. More ❯
SQL scripting. Proficiency in AWS services such as S3, Lambda, Glue, Redshift, and CloudWatch. Strong programming skills in Python or Scala for data processing. Experience with orchestration tools like Apache Airflow or AWS Step Functions. Familiarity with version control systems (e.g., Git) and CI/CD pipelines. We are an equal opportunity employer. All aspects of employment including More ❯
in Java/Scala Cloud (AWS preferred) + Infrastructure-as-Code familiarity Solid understanding of ETL/ELT and data modeling Nice to have: Iceberg/Delta/Hudi, Airflow/Dagster/Prefect, Python/TypeScript, data governance. More ❯
/frameworks (pandas, numpy, sklearn, TensorFlow, PyTorch) Strong understanding and experience in implementing end-to-end ML pipelines (data, training, validation, serving) Experience with ML workflow orchestration tools (e.g., Airflow, Prefect, Kubeflow) and ML feature or data platforms (e.g., Tecton, Databricks, etc.) Experience with cloud platforms (AWS, GCP/Vertex, Azure), Docker, and Kubernetes Solid coding practices (Git, automated More ❯
e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we’re looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused More ❯
excellence across your team. Lead by example — fostering collaboration, accountability, and agile delivery in every sprint. 🧠 What You Bring Expertise in AWS: Hands-on experience with Python, Glue, S3, Airflow, DBT, Redshift, and RDS. Proven success in end-to-end data engineering — from ingestion to insight. Strong leadership and communication skills, with a collaborative, solution-driven mindset. Experience working More ❯
Manchester Area, United Kingdom Hybrid / WFH Options
Morson Edge
engineers, analysts and client teams to deliver value-focused data solutions What We’re Looking For: Strong experience with Python, SQL, Spark and pipeline tools such as dbt or Airflow Hands-on experience with at least one major cloud platform (AWS, Azure, or GCP) A solid grasp of data modelling, data warehousing and performance optimisation Passion for data quality More ❯
Liverpool, England, United Kingdom Hybrid / WFH Options
Altech Group Ltd
data landscape. Explore new data sources and technologies to enhance the team’s capabilities. Why This Role Stands Out Work with a cutting-edge modern data stack: Snowflake, dbt, Airflow, and Azure. Be part of a small team, where due to the size you will have huge ownership Opportunity to work on varied datasets and apply your web scraping More ❯
GCP) and data lake technologies (e.g., S3, ADLS, HDFS). Expertise in containerization and orchestration tools like Docker and Kubernetes. Knowledge of MLOps frameworks and tools (e.g., MLflow, Kubeflow, Airflow). Experience with real-time streaming architectures (e.g., Kafka, Kinesis). Familiarity with Lambda and Kappa architectures for data processing. Enable integration capabilities for external tools to perform ingestion More ❯
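The real-time streaming architectures this listing names (Kafka, Kinesis) usually feed windowed aggregations downstream. A minimal, broker-free sketch of a tumbling-window event count follows; the event times, window size, and user IDs are all invented, and a real consumer would receive these records from a Kafka or Kinesis client rather than a list.

```python
from collections import defaultdict

WINDOW_SECONDS = 60

# Simulated stream of (event_time_epoch_seconds, user_id) records,
# as a Kafka/Kinesis consumer might yield them.
events = [(0, "a"), (10, "b"), (59, "a"), (60, "a"), (125, "c")]

counts = defaultdict(int)
for ts, user in events:
    # Tumbling window: each event falls in exactly one fixed bucket.
    window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
    counts[window_start] += 1

print(dict(counts))  # {0: 3, 60: 1, 120: 1}
```

Lambda and Kappa architectures, also mentioned above, differ mainly in whether this streaming path runs alongside a separate batch path or serves as the single processing path.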
Solid understanding of data modeling concepts (star/snowflake schemas, normalization). Experience with version control systems (e.g., Git) and CI/CD practices. Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable More ❯
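The star schema named in the modelling requirements above is concrete enough to sketch: one central fact table keyed to surrounding dimension tables. The example below uses sqlite3 with invented table and column names; a warehouse like Snowflake, Redshift, or BigQuery would host the same shape.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables: the descriptive "points" of the star.
cur.execute("CREATE TABLE dim_date (date_key INT PRIMARY KEY, year INT, month INT)")
cur.execute("CREATE TABLE dim_product (product_key INT PRIMARY KEY, name TEXT)")

# Fact table: measures, referencing each dimension by surrogate key.
cur.execute("""CREATE TABLE fact_sales (
    date_key INT REFERENCES dim_date(date_key),
    product_key INT REFERENCES dim_product(product_key),
    amount REAL)""")

cur.execute("INSERT INTO dim_date VALUES (20240101, 2024, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'widget')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240101, 1, 9.5), (20240101, 1, 0.5)])

# The typical star-schema query: aggregate the fact, group by dimension attributes.
row = cur.execute("""
    SELECT d.year, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d USING (date_key)
    JOIN dim_product p USING (product_key)
    GROUP BY d.year, p.name
""").fetchone()
print(row)  # (2024, 'widget', 10.0)
```

A snowflake schema, by contrast, would further normalise the dimensions (e.g. splitting product category out of `dim_product` into its own table).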
inclusive and collaborative culture, encouraging peer to peer feedback and evolving healthy, curious and humble teams. Tech Stack: Python, JavaScript/TypeScript, React/React Native, AWS, GraphQL, Snowflake, Airflow, DDD. This is an incredible opportunity for a Senior Software Engineer to join a unique company as they embark on a period of significant growth to take their fantastic More ❯
Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Higher - AI recruitment
stacks and cloud technologies (AWS, Azure, or GCP). Expert knowledge of Python and SQL Hands-on experiences with Data Architecture, including: Cloud platforms and orchestration tools (e.g. Dagster, Airflow) AI/MLOps: Model deployment, monitoring, lifecycle management. Big Data Processing: Spark, Databricks, or similar. Bonus: Knowledge Graph engineering, graph databases, ontologies. Located in London And ideally you... Are More ❯
technology stack Python and associated ML/DS libraries (scikit-learn, NumPy, LightGBM, Pandas, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, Athena, etc. MLOps/DevOps: Terraform, Docker, Airflow, MLFlow, NewRelic The interview process Recruiter Call (30 minutes) Meeting a Machine Learning Manager (30 minutes) Technical Interview with 2 x Engineers (90 mins) Final Interview with the Head More ❯