London (City of London), South East England, United Kingdom
Capgemini
Solid understanding of data modeling concepts (star/snowflake schemas, normalization). Experience with version control systems (e.g., Git) and CI/CD practices. Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus. About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable …
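For context on the orchestration tooling this listing refers to, here is a minimal sketch of an Airflow pipeline. The DAG and task names are hypothetical, and it assumes Airflow 2.x where the `schedule` argument is available:

```python
# Minimal sketch only: a daily pipeline with a single extract task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for the actual extraction logic (e.g. pulling from an API or database).
    print("extracting data")


with DAG(
    dag_id="example_daily_load",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract)
```

In practice such a DAG would chain several tasks (extract, transform, load) and be deployed alongside the dbt or CI/CD tooling the listing mentions.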
London, South East England, United Kingdom Hybrid / WFH Options
8Bit - Games Industry Recruitment
… /AI, with strong proficiency in Python, SQL, and distributed data processing (e.g., PySpark). Hands-on experience with cloud data platforms (GCP, AWS, or Azure), orchestration frameworks (e.g., Airflow), and ELT/ETL tools. Familiarity with 2D and 3D data formats (e.g., OBJ, FBX, glTF), point clouds, and large-scale computer vision datasets, or the ability to quickly …
… of the below, but we would love for you to have experience in some of the following areas: proficiency in modern data engineering practices and technologies, including Prefect/Airflow, Python, dbt, Kubernetes, Kafka or similar; experience with Infrastructure as Code (IaC) and cloud-based services, e.g. deploying infrastructure on AWS using Terraform; a deep understanding of data pipelines …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
… development efficiency and deployment effectiveness, including Azure DevOps or GitHub; considerable experience designing and building operationally efficient pipelines, utilising core Cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer, PySpark etc.; proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use; strong …
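To illustrate the medallion-style modelling referenced in the listing above, here is a rough PySpark sketch of promoting cleaned silver-layer data into a curated gold-layer dimension. All table and column names are hypothetical, and a configured Spark catalog is assumed:

```python
# Rough sketch, not a production pipeline: build a gold-layer dimension from a silver table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Silver layer: cleaned, conformed data (hypothetical table name).
silver_orders = spark.read.table("silver.orders")

# Gold layer: curated dimensional model for analytical use.
dim_customer = (
    silver_orders
    .select("customer_id", "customer_name", "country")
    .dropDuplicates(["customer_id"])
)

dim_customer.write.mode("overwrite").saveAsTable("gold.dim_customer")
```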
… communicate technical solutions clearly to non-technical stakeholders. Technical skills (a big plus): knowledge of deep learning frameworks (PyTorch, TensorFlow), transformers, or LLMs; familiarity with MLOps tools (MLflow, SageMaker, Airflow, etc.); experience with streaming data (Kafka, Kinesis) and distributed computing (Spark, Dask); skills in data visualization apps (Streamlit, Dash) and dashboarding (Tableau, Looker); domain experience in forecasting, optimisation, or …
… VMware. General/Usage: Technical Leadership & Design; DevSecOps tooling and practices; Application Security Testing; SAFe (Scaled Agile) processes. Data Integration Focused: Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Spark, NiFi, Airbyte and Singer; Message Brokers and streaming data processors, such as Apache Kafka; Object Storage, such as S3, MinIO, LakeFS; CI/CD Pipeline, Integration …
… that this Data Engineer position will require 2 days per week in Leeds city centre. The key skills required for this Data Engineer position are: Snowflake, Python, AWS, dbt, Airflow. If you do have the relevant experience for this Data Engineer position, please do apply.
… in data engineering – Python, SQL, and cloud platforms (GCP or Azure). Experience designing or managing data pipelines and ETL processes. Exposure to orchestration tools such as dbt, Airflow, or Azure Data Factory. Good understanding of data architecture and automation. AdTech or MarTech experience would be a real plus. Confident mentoring and developing junior engineers/…
Coventry, West Midlands, England, United Kingdom Hybrid / WFH Options
Lorien
… executing tests. Requirements: strong experience as a Data Engineer (migrating legacy systems onto AWS, building data pipelines); strong Python experience; tech stack experience required: AWS Glue, Redshift, Lambda, PySpark, Airflow; SSIS or SAS experience (desirable). Benefits: salary up to £57,500 + up to 20% bonus; hybrid working: 1 to 2 days a week in the office; 28 days …
London (City of London), South East England, United Kingdom
Radley James
… background at a leading finance or big tech firm; strong hands-on expertise in Python and modern ETL frameworks; experience designing and maintaining cloud-based data pipelines (e.g. AWS, Airflow, Snowflake); deep understanding of data modelling, validation, and pipeline resilience; familiarity with financial or alternative datasets preferred.
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… skills for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus. A solid understanding of data best practices, version control (Git), and CI/CD. Company: rapidly growing, cutting-edge AI organisation; remote working - offices in …
… checks; understanding of agile software delivery and collaborative development. Nice to have: experience with bioinformatics or large-scale biological data (e.g., genomics, proteomics); familiarity with orchestration tools such as Airflow or Google Workflows; experience with containerisation (Docker); exposure to NLP, unstructured data processing, or vector databases; knowledge of ML and AI-powered data products. What you'll bring: strong …
Southampton, Hampshire, South East, United Kingdom
Spectrum It Recruitment Limited
… years in data science or analytics, including ETL experience from GA4/Google Ads and CRM platforms. Advanced SQL & Python skills. Experience with dbt (or equivalent) and orchestration tools (Airflow/Cloud Composer). Proven experience using LLMs (OpenAI, Gemini, Claude, etc.) for analytics and automation. Strong background in data visualisation (Power BI/Looker/Looker Studio).
… as pytest. Preferred Qualifications: experience working with biological or scientific datasets (e.g. genomics, proteomics, or pharmaceutical data); knowledge of bioinformatics or large-scale research data; familiarity with Nextflow, Airflow, or Google Workflows; understanding of NLP techniques and processing unstructured data; experience with AI/ML-powered applications and containerised development (Docker). Contract Details: Day Rate £750 (Inside …
Strong Python skills and comfort working across complex ingestion workflows; experience managing NoSQL and vector databases at scale (MongoDB, Weaviate, Pinecone, etc.); solid understanding of modern data pipeline tools (Airflow, Prefect, Dagster); practical experience with LLM development, embeddings, and RAG architectures; familiarity with distributed systems and cloud platforms (AWS, GCP, or Azure); self-motivated and capable of independently delivering …
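As a toy illustration of the retrieval step in the RAG architectures mentioned in the listing above, the sketch below ranks documents by cosine similarity to a query vector. The documents and vectors are made up for the example, and embeddings are assumed to come from an external model rather than being computed here:

```python
# Toy sketch of vector retrieval for a RAG pipeline; random vectors stand in for real embeddings.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


documents = ["notes on pipelines", "notes on embeddings", "notes on dashboards"]
doc_vectors = [np.random.rand(8) for _ in documents]  # placeholders for stored embeddings
query_vector = np.random.rand(8)                      # placeholder for the embedded user query

# Rank documents by similarity to the query; the top hit would be passed to the LLM as context.
ranked = sorted(zip(documents, doc_vectors),
                key=lambda pair: cosine_similarity(query_vector, pair[1]),
                reverse=True)
print(ranked[0][0])
```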
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Noir
Data Engineer - FinTech - Newcastle (Tech stack: Data Engineer, SQL, Python, AWS, Git, Airflow, Data Pipelines, Data Platforms, Programmer, Developer, Architect, Data Engineer). Our client is a trailblazer in the FinTech space, known for delivering innovative technology solutions to global financial markets. They are expanding their engineering capability in Newcastle and are looking for a talented Data Engineer to join …
London, South East England, United Kingdom Hybrid / WFH Options
Montash
… S3 and Redshift, particularly around storage, computation and security. Familiarity with modern BI tools such as Power BI or AWS QuickSight. Experience with open-source data stack tools like Airflow, dbt, Airbyte or similar. Strong grasp of software development best practices and CI/CD processes. Skilled in performance tuning, testing, and automation within data engineering environments. Excellent communication …