London, South East England, United Kingdom Hybrid / WFH Options
Peaple Talent
challenges. What you'll need:
- Solid hands-on experience with Python and SQL
- Strong knowledge of data pipelines, cloud infrastructure (AWS & Azure), and integration tools
- Familiarity with tools like Airflow and version control (Git)
- Confident working with structured and unstructured datasets
- Comfortable contributing to both technical delivery and collaborative problem-solving
Salary and Benefits: This role is paying a ...
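For illustration only, here is a minimal sketch of the kind of Airflow pipeline work this role describes. The DAG name, task names, and the extract/load callables are hypothetical placeholders and are not taken from the advert; in a real pipeline they would talk to the actual source systems and warehouse.

```python
# Minimal Airflow 2.x sketch of a two-step extract/load pipeline (names are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull structured data from a source system (API, database, etc.).
    return [{"order_id": 1, "amount": 42.0}]


def load_orders(**context):
    # Placeholder: write the extracted rows to a warehouse table.
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="orders_daily",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ 'schedule' argument
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load
```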
- Advanced SQL and dimensional data modelling skills (fact/dimension design, hierarchies, SCDs).
- Proven experience building ETL/ELT pipelines using tools such as SSIS, dbt, or Airflow.
- Solid understanding of database administration, tuning, and performance optimisation across MSSQL and PostgreSQL.
Key Responsibilities: Design and maintain data models that meet business requirements, ensuring scalability, consistency ...
City of London, London, United Kingdom Hybrid / WFH Options
Fynity
empowered to design solutions using the most appropriate technologies, deploying final implementations on AWS. You'll bring hands-on depth and knowledge in:
- Languages: Python, PySpark, SQL
- Technologies: Spark, Airflow
- Cloud: AWS (API Gateway, Lambda, Redshift, Glue, CloudWatch, etc.)
- Data Pipelines: Designing and building modern, cloud-native pipelines using AWS services
In addition, you will require strong leadership skills ...
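As a rough illustration of the PySpark pipeline work listed above, here is a minimal batch-job sketch that reads raw JSON from S3 and writes partitioned Parquet. The bucket paths and column names are assumptions for illustration, not details from the advert.

```python
# Minimal PySpark sketch: ingest raw events from S3, light cleaning, partitioned Parquet output.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_ingest").getOrCreate()

# Read raw events from a landing zone (hypothetical path).
raw = spark.read.json("s3://example-landing/events/")

# Light transformation: parse the timestamp and derive a date partition column.
cleaned = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write partitioned Parquet to a curated zone, ready for downstream Glue/Redshift queries.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet("s3://example-curated/events/")
```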
London (City of London), South East England, United Kingdom
Harnham
architecture, build data pipelines, and integrate AI/LLM components into production systems.
Role Breakdown:
- 50% Backend Engineering: FastAPI, Flask, Node.js, CI/CD
- 30% Data Engineering: ETL, dbt, Airflow
- 20% AI/LLM Integration: LangChain, RAG pipelines, orchestration
Key Responsibilities: Design and build backend services to support AI agent deployment. Develop scalable data pipelines and integration layers. Implement ...
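To illustrate the backend-engineering side of the split above, here is a minimal FastAPI sketch. The /ask endpoint, the request/response models, and the answer_question stub are hypothetical; the stub simply stands in for whatever LangChain/RAG orchestration layer the real service would call.

```python
# Minimal FastAPI sketch: a single endpoint that fronts an (assumed) AI/RAG backend.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Query(BaseModel):
    question: str


class Answer(BaseModel):
    answer: str


def answer_question(question: str) -> str:
    # Placeholder for a call into the LLM / RAG orchestration layer.
    return f"Stub answer for: {question}"


@app.post("/ask", response_model=Answer)
def ask(query: Query) -> Answer:
    # Validate the request body, delegate to the answering logic, return a typed response.
    return Answer(answer=answer_question(query.question))
```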
City of London, London, United Kingdom Hybrid / WFH Options
X4 Technology
Data Engineer (Databricks)
- Deep hands-on experience with Databricks, Delta Live Tables (DLT), data modelling, data ingestion & integration
- Strong proficiency across Data Factory, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, GitHub/DevOps tools, and testing frameworks (SonarQube/PyTest)
- Proven experience leading data engineering teams while remaining actively hands-on in delivery
- Background within energy, utilities ...
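For context on the Delta Live Tables experience mentioned above, here is a minimal DLT sketch. It only runs inside a Databricks DLT pipeline (where spark is provided by the runtime), and the table names, landing path, and meter-reading columns are invented for illustration rather than taken from the advert.

```python
# Minimal Delta Live Tables sketch: a raw ingest table plus a cleaned table with an expectation.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw meter readings ingested from the landing zone (hypothetical path).")
def raw_readings():
    # 'spark' is supplied by the Databricks DLT runtime.
    return spark.read.format("json").load("/mnt/landing/readings/")


@dlt.table(comment="Cleaned readings with typed columns.")
@dlt.expect_or_drop("valid_reading", "reading_kwh >= 0")
def clean_readings():
    return (
        dlt.read("raw_readings")
        .withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
        .withColumn("read_at", F.to_timestamp("read_at"))
    )
```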
London, South East England, United Kingdom Hybrid / WFH Options
Searchability
TO HAVE:
- Experience integrating AI models into production systems using GCP, AWS, or Azure.
- Familiarity with vector databases, embedding models, or retrieval-augmented generation (RAG).
- Knowledge of Docker, Airflow, or MLOps pipelines.
- Strong understanding of AI ethics, data privacy, and responsible model deployment.
TO BE CONSIDERED... Please either apply online or email me directly at . By applying for ...
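As a generic illustration of the retrieval step behind the "vector databases / embedding models / RAG" item above, here is a minimal in-memory sketch. The embed() stub and the toy document store are stand-ins; a real system would use a proper embedding model and a vector database.

```python
# Minimal sketch of RAG-style retrieval: embed documents, rank by cosine similarity to a query.
import numpy as np


def embed(text: str) -> np.ndarray:
    # Stub embedding: hash characters into a small fixed-size vector (illustration only).
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec


documents = [
    "Airflow schedules and monitors data pipelines.",
    "Docker packages applications into containers.",
    "Vector databases store embeddings for similarity search.",
]
doc_vectors = np.stack([embed(d) for d in documents])


def retrieve(query: str, k: int = 2) -> list[str]:
    # Score every document against the query embedding and return the top-k matches.
    scores = doc_vectors @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]


print(retrieve("How do I schedule a pipeline?"))
```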
London, South East England, United Kingdom Hybrid / WFH Options
Interquest
analysts, and client stakeholders to deliver reliable, automated, and high-performing data solutions end to end. What We’re Looking For:
- Strong experience with Python, Databricks and tools like Airflow
- Confident working across cloud platforms (AWS, Azure, GCP)
- Great communication skills and the ability to work with both technical and non-technical teams
- Comfortable in a consultancy setting, balancing ...