IR35. Immediate start. 12 month contract. Essential: Been to school in the UK; data ingestion of APIs; GCP based (Google Cloud Platform); Snowflake; BigQuery; DBT; semantic layer (Cube/Looker). Desirable: Airflow (Apache Airflow)
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd,
Hybrid (3 days onsite & 2 days remote). Role Type: 6 Months Contract with possibility of extensions. Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience. Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python. Exposed to Python …
Databricks Engineer. London - hybrid - 3 days per week on-site. 6 Months+. UMBRELLA only - Inside IR35. Key Responsibilities: Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in DBT running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with … sharing. Required Skills & Qualifications: 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: DBT (dbt-core, dbt-databricks adapter, testing & documentation); Apache Airflow (DAG design, operators, scheduling, dependencies); Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modelling (Kimball, Data Vault) …
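Several of the listings above ask for Airflow DAG design. The core idea — tasks run only after all of their upstream dependencies complete — can be illustrated with a small pure-Python sketch using the standard library's topological sorter; the pipeline and task names here are hypothetical, and this is not the Airflow API itself:

```python
# Toy illustration of DAG-based scheduling (the concept behind Airflow):
# each task declares its upstream dependencies, and execution follows a
# topological order. Task names are made up for the example.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
pipeline = {
    "extract": set(),
    "transform": {"extract"},   # e.g. a dbt-style transformation step
    "load": {"transform"},
    "notify": {"load"},
}

def run(task_name: str) -> str:
    # In real Airflow this would be an operator execution; here we just log.
    return f"ran {task_name}"

order = list(TopologicalSorter(pipeline).static_order())
results = [run(t) for t in order]
print(order)  # extract before transform before load before notify
```

With this linear chain the order is fully determined; in a real DAG, independent branches could run in parallel, which is what Airflow's scheduler exploits.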
data engineering. Strong Python & SQL skills. Experience with Google Cloud or similar. Familiarity with unit testing, Git, and Agile. Desirable: experience with bioinformatics or scientific datasets; knowledge of Nextflow, Airflow, NLP, and Docker; exposure to AI/ML applications. If the role aligns with your skills and experience, please apply with your updated CV.
computer vision. Big Data Tools: Experience with big data platforms like Spark (PySpark) for handling large-scale datasets. MLOps: Familiarity with MLOps tools and concepts (e.g., Docker, Kubernetes, MLflow, Airflow) for model deployment and lifecycle management. Financial Domain Knowledge: Direct experience with at least two of the following domains: Credit Risk Modeling, Fraud Detection, Anti-Money Laundering (AML), Know …
that this Data Engineer position will require 2 days per week in Leeds city centre. The key skills required for this Data Engineer position are: Snowflake Python AWS DBT Airflow If you do have the relevant experience for this Data Engineer position, please do apply.
Knutsford, Cheshire, United Kingdom Hybrid / WFH Options
Experis
monitoring in cloud environments (AWS). Understanding of machine learning lifecycle and data pipelines. Proficiency with Python, PySpark, big-data ecosystems. Hands-on experience with MLOps tools (e.g., MLflow, Airflow, Docker, Kubernetes). Secondary Skills: Experience with RESTful APIs and integrating backend services. All profiles will be reviewed against the required skills and experience. Due to the high number of …
Advanced SQL and dimensional data modelling skills (fact/dimension design, hierarchies, SCDs). Proven experience building ETL/ELT pipelines using tools such as SSIS, dbt, or Airflow. Solid understanding of database administration, tuning, and performance optimisation across MSSQL and PostgreSQL. Background in financial services or trading environments, with exposure to complex, high-volume datasets …
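The listing above mentions slowly changing dimensions (SCDs). A Type 2 update preserves history by closing the current row and inserting a new current one. A minimal pure-Python sketch of that pattern — the dimension name, columns, and values are all hypothetical:

```python
# SCD Type 2 sketch: on an attribute change, expire the current row and
# append a new current row, keeping full history. Data is made up.
from datetime import date

dim_customer = [
    {"customer_id": 1, "city": "Leeds", "valid_from": date(2020, 1, 1),
     "valid_to": None, "is_current": True},
]

def scd2_update(dim, customer_id, new_city, change_date):
    """Close the current row and append a new current row (Type 2)."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # no change: nothing to do
            row["valid_to"] = change_date
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": change_date, "valid_to": None,
                "is_current": True})

scd2_update(dim_customer, 1, "London", date(2024, 6, 1))
# dim_customer now holds the closed Leeds row and a current London row
```

In a warehouse this would typically be a MERGE statement rather than Python, but the row-versioning logic is the same.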
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
skills for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus. A solid understanding of data best practices, version control (Git), and CI/CD. Company: rapidly growing, cutting-edge AI organisation. Remote working - offices in …
checks. Understanding of agile software delivery and collaborative development. Nice to Have: Experience with bioinformatics or large-scale biological data (e.g., genomics, proteomics). Familiarity with orchestration tools such as Airflow or Google Workflows. Experience with containerisation (Docker). Exposure to NLP, unstructured data processing, or vector databases. Knowledge of ML and AI-powered data products. What You'll Bring: Strong …
as pytest. Preferred Qualifications: Experience working with biological or scientific datasets (e.g. genomics, proteomics, or pharmaceutical data). Knowledge of bioinformatics or large-scale research data. Familiarity with Nextflow, Airflow, or Google Workflows. Understanding of NLP techniques and processing unstructured data. Experience with AI/ML-powered applications and containerised development (Docker). Contract Details: Day Rate: £750 (Inside IR35) …
optimized. YOUR BACKGROUND AND EXPERIENCE: 5 years of commercial experience working as a Data Engineer. 3 years exposure to the Azure stack - Databricks, Synapse, ADF. Python and PySpark. Airflow for orchestration. Test-Driven Development and automated testing. ETL development.
Knutsford, Cheshire East, Cheshire, United Kingdom
Synapri
AWS Data/ML Engineering & MLOps (ECS, SageMaker). CI/CD pipelines (GitLab, Jenkins). Python, PySpark & big data ecosystems. AI/ML lifecycle, deployment & monitoring. MLOps tooling (MLflow, Airflow, Docker, Kubernetes). Front-end exposure (HTML, Flask, Streamlit). RESTful APIs & backend integration. If this ML Engineer role is of interest, please apply now for immediate consideration.
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
on development/engineering background. Machine Learning or Data background. Technical Experience: PySpark, Python, SQL, Jupyter. Cloud: AWS, Azure (cloud environment) - moving towards Azure. Nice to Have: Astro/Airflow, Notebook. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
Ability to collaborate effectively with senior engineers, data scientists, and architects. Proactive, detail-oriented, and eager to contribute within a greenfield project environment. Nice to Have: Experience with Airflow/Astro. Prior work with notebook-based development environments. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community …
A/B testing. Experiment design and hypothesis testing. MLOps & Engineering: scalable ML systems (batch and real-time); ML pipelines, CI/CD, monitoring, deployment; familiarity with tools like MLflow, Kubeflow, Airflow, Docker, Kubernetes. Strategic skills: align ML initiatives with business goals; prioritize projects based on ROI, feasibility, and risk; understand market trends and competitive ML strategies; communicate ML impact to …
troubleshoot data workflows and performance issues. Essential Skills & Experience: Proficiency in SQL, Python, or Scala. Experience with cloud platforms such as AWS, Azure, or GCP. Familiarity with tools like Apache Spark, Kafka, and Airflow. Strong understanding of data modelling and architecture. Knowledge of CI/CD pipelines and version control systems. Additional Information: This role requires active SC clearance.
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd
SC cleared Software developers (Python & AWS) to join a contract till April 2026. Inside IR35. SC cleared. Weekly travel to Newcastle. Around £400 per day. Contract till April 2026. Skills: - Python - AWS Services - Terraform - Apache Spark - Airflow - Docker
and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include: Designing and developing DBT models and Airflow pipelines within a modern data stack. Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs. Implementing automated testing and CI/CD pipelines … on forecasting and predictive analytics initiatives. Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function. Tech Stack & Skills Core skills: Strong experience with DBT, Airflow, Snowflake, and Python. Proven background in automated testing, CI/CD, and test-driven development. Experience building and maintaining data pipelines and APIs in production environments. Nice to have …
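The role above pairs DBT models with automated testing. dbt's built-in `not_null` and `unique` tests boil down to simple assertions over a model's columns; a rough pure-Python equivalent of those two checks is sketched below (the table and column names are made up, and this is not dbt's own implementation):

```python
# dbt-style data quality checks, sketched in plain Python.
# A test "fails" by returning the offending rows/values, like dbt's
# convention that a test query returning rows means failure.

def not_null(rows, column):
    """not_null test: return rows where `column` is NULL."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """unique test: return values of `column` that appear more than once."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(v)
        seen.add(v)
    return dupes

orders = [  # hypothetical staging model output
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": 2, "amount": 5.0},
]
print(not_null(orders, "amount"))  # one failing row
print(unique(orders, "order_id"))  # order_id 2 is duplicated
```

In dbt these checks are declared in a model's YAML and compiled to SQL; wiring them into a CI/CD pipeline is what gives the "automated testing" the listing asks for.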
Role: JAVA Full Stack Developer. Job Type - Glasgow, Hybrid (3-5 days onsite). The Role: As a JAVA Full Stack Lead Software Engineer with our client, you will play a crucial role in an agile team, focusing on the enhancement