IR35. Immediate start. 12 month contract. Essential: Been to school in the UK; Data Ingestion of APIs; GCP based (Google Cloud Platform); Snowflake; BigQuery; DBT; Semantic layer (Cube/Looker). Desirable: Airflow (Apache Airflow …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd
Hybrid (3 days onsite & 2 days remote) Role Type: 6 Months Contract with possibility of extensions. Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience. Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python language. Exposed to Python …
frameworks, and clear documentation within your pipelines. Experience in the following areas is not essential but would be beneficial: Data Orchestration Tools: Familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster. Modern Data Transformation: Experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse. BI Tool Familiarity: An understanding of …
research and technology teams. Exposure to low-latency or real-time systems. Experience with cloud infrastructure (AWS, GCP, or Azure). Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask. Knowledge of equities, futures, or FX markets. Company: Rapidly growing hedge fund with offices globally, including London. Salary & Benefits: The salary range/rates of pay is …
architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we're looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused …
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we’re looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused …
Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience with workflow tools like Airflow or Kubeflow. Familiarity with dbt (Data Build Tool). Please send your CV for more information on these roles. Reasonable Adjustments: Respect and equality are core values to us.
CD. Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
practices (testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, DataHub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS …
Databricks Engineer. London - hybrid - 3 days per week on-site. 6 months+. UMBRELLA only - Inside IR35. Key Responsibilities: Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in DBT running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with … sharing. Required Skills & Qualifications: 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: DBT (dbt-core, dbt-databricks adapter, testing & documentation); Apache Airflow (DAG design, operators, scheduling, dependencies); Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modeling (Kimball, Data Vault …
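The "DAG design, operators, scheduling, dependencies" skill above boils down to declaring tasks and their upstream dependencies, then letting the scheduler resolve an execution order. As a minimal pure-Python sketch (using the standard-library `graphlib` rather than the Airflow API itself; the task names are hypothetical stand-ins for an ELT pipeline):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical ELT tasks mapped to their upstream dependencies.
# In Airflow these would be operators wired together with >> inside a DAG.
deps = {
    "extract_api": set(),                # no upstream tasks
    "load_raw": {"extract_api"},         # load only after extraction
    "dbt_transform": {"load_raw"},       # dbt models build on raw tables
    "publish_marts": {"dbt_transform"},  # final reporting layer last
}

# Resolve the dependency graph to a valid execution order.
order = list(TopologicalSorter(deps).static_order())
print(order)  # → ['extract_api', 'load_raw', 'dbt_transform', 'publish_marts']
```

An orchestrator like Airflow adds scheduling, retries, and parallelism on top of exactly this kind of dependency resolution: tasks with no unmet upstream dependencies become runnable first.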
note that this Data Engineer position will require 2 days per week in Leeds city centre. The key skills required for this Data Engineer position are: Python AWS DBT Airflow If you do have the relevant experience for this Data Engineer position, please do apply.
computer vision. Big Data Tools: Experience with big data platforms like Spark (PySpark) for handling large-scale datasets. MLOps: Familiarity with MLOps tools and concepts (e.g., Docker, Kubernetes, MLflow, Airflow) for model deployment and lifecycle management. Financial Domain Knowledge: Direct experience with at least two of the following domains: Credit Risk Modeling, Fraud Detection, Anti-Money Laundering (AML), Know …
that this Data Engineer position will require 2 days per week in Leeds city centre. The key skills required for this Data Engineer position are: Snowflake Python AWS DBT Airflow If you do have the relevant experience for this Data Engineer position, please do apply.
Knutsford, Cheshire, United Kingdom Hybrid / WFH Options
Experis
monitoring in cloud environments (AWS). Understanding of machine learning lifecycle and data pipelines. Proficiency with Python, PySpark, big-data ecosystems. Hands-on experience with MLOps tools (e.g., MLflow, Airflow, Docker, Kubernetes). Secondary Skills: Experience with RESTful APIs and integrating backend services. All profiles will be reviewed against the required skills and experience. Due to the high number of …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
skills for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus. A solid understanding of data best practices, version control (Git), and CI/CD. Company: Rapidly growing, cutting-edge AI organisation. Remote working - Offices in …
checks Understanding of agile software delivery and collaborative development. Nice to Have: Experience with bioinformatics or large-scale biological data (e.g., genomics, proteomics). Familiarity with orchestration tools such as Airflow or Google Workflows. Experience with containerisation (Docker). Exposure to NLP, unstructured data processing, or vector databases. Knowledge of ML and AI-powered data products. What You'll Bring: Strong …
as pytest. Preferred Qualifications: Experience working with biological or scientific datasets (e.g. genomics, proteomics, or pharmaceutical data). Knowledge of bioinformatics or large-scale research data. Familiarity with Nextflow, Airflow, or Google Workflows. Understanding of NLP techniques and processing unstructured data. Experience with AI/ML-powered applications and containerised development (Docker). Contract Details: Day Rate: £750 (Inside …
London, South East, England, United Kingdom Hybrid / WFH Options
Fynity
be empowered to design solutions using the most appropriate technologies, deploying final implementations on AWS. You’ll bring hands-on depth and knowledge in: Languages: Python, PySpark, SQL. Technologies: Spark, Airflow. Cloud: AWS (API Gateway, Lambda, Redshift, Glue, CloudWatch, etc.). Data Pipelines: Designing and building modern, cloud-native pipelines using AWS services. In addition, you will require strong leadership skills …
Knutsford, Cheshire East, Cheshire, United Kingdom
Synapri
AWS Data/ML Engineering & ML Ops (ECS, Sagemaker) CI/CD pipelines (GitLab, Jenkins) Python, PySpark & Big Data ecosystems AI/ML lifecycle, deployment & monitoring MLOps tooling (MLflow, Airflow, Docker, Kubernetes) Front-end exposure (HTML, Flask, Streamlit) RESTful APIs & backend integration If this ML Engineer role is of interest, please apply now for immediate consideration.
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
on development/engineering background. Machine Learning or Data background. Technical Experience: PySpark, Python, SQL, Jupyter. Cloud: AWS, Azure (Cloud Environment) - moving towards Azure. Nice to Have: Astro/Airflow, Notebook. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
on development/engineering background. Machine Learning or Data background. Technical Experience: PySpark, Python, SQL, Jupyter. Cloud: AWS, Azure (Cloud Environment) - moving towards Azure. Nice to Have: Astro/Airflow, Notebook. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …