City of London, London, United Kingdom Hybrid / WFH Options
ECS
Azure services. Requirements: 10+ years in cloud data engineering, with a strong focus on building scalable data pipelines. Expertise in Azure Databricks, including building and managing ETL pipelines using PySpark or Scala. Solid understanding of Apache Spark, Delta Lake, and distributed data processing concepts. Hands-on experience with Azure Data Lake Storage, Azure Data Factory, and Azure Synapse Analytics.
Learning, Deep Learning or LLM Frameworks). Desirable: Minimum 2 years' experience in a data-related field. Minimum 2 years in a Business or Management Consulting field. Experience of Docker, Hadoop, PySpark, Apache or MS Azure. Minimum 2 years' NHS/Healthcare experience. Disclosure and Barring Service Check: This post is subject to the Rehabilitation of Offenders Act (Exceptions Order
City of London, London, United Kingdom Hybrid / WFH Options
Syntax Consultancy Limited
modelling techniques + data integration patterns. Experience of working with complex data pipelines, large data sets, data pipeline optimization + data architecture design. Implementing complex data transformations using Spark, PySpark or Scala + working with SQL/MySQL databases. Experience with data quality, data governance processes, Git version control + Agile development environments. Azure Data Engineer certification preferred - e.g.
Knutsford, Cheshire, United Kingdom Hybrid / WFH Options
Experis
front-end development (HTML, Streamlit, Flask). Familiarity with model deployment and monitoring in cloud environments (AWS). Understanding of the machine learning lifecycle and data pipelines. Proficiency with Python, PySpark and big-data ecosystems. Hands-on experience with MLOps tools (e.g., MLflow, Airflow, Docker, Kubernetes). Secondary Skills: Experience with RESTful APIs and integrating backend services. All profiles will be reviewed
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
CDC. Knowledge of public/enterprise cloud technologies (AWS EC2, S3 buckets, GCP, Azure) is advantageous but not required. Some skills/experience with automated testing frameworks (Java, Python, PySpark, Bitbucket, GitLab, Jenkins) is advantageous but not required. Strong Environment Management skills. Carbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to
London, South East, England, United Kingdom Hybrid / WFH Options
Salt Search
EMEA to drive productivity and efficiency. Own sales operations functions including pipeline management, incentive compensation, deal desk, lead management, and contact centre operations. Use SQL and Python (Pandas, PySpark) to analyse data, automate workflows, and generate insights. Design and manage ETL/ELT processes, data models, and reporting automation. Leverage Databricks, Snowflake, and GCP to enable scalable
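The "SQL and Python to analyse data and automate reporting" duty above amounts to expressing business roll-ups as queries. A minimal stdlib sketch using `sqlite3` — the `deals` table, its columns, and the figures are all invented for illustration, not taken from the role:

```python
import sqlite3

# Hypothetical sales-pipeline table; schema and numbers are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deals (rep TEXT, stage TEXT, value REAL)")
conn.executemany(
    "INSERT INTO deals VALUES (?, ?, ?)",
    [("ana", "won", 120.0), ("ana", "open", 80.0),
     ("ben", "won", 200.0), ("ben", "lost", 50.0)],
)

# Won value and win rate per rep: the kind of roll-up that feeds reporting automation.
rows = conn.execute(
    """
    SELECT rep,
           SUM(CASE WHEN stage = 'won' THEN value ELSE 0 END) AS won_value,
           ROUND(1.0 * SUM(stage = 'won') / COUNT(*), 2)      AS win_rate
    FROM deals
    GROUP BY rep
    ORDER BY rep
    """
).fetchall()
print(rows)  # [('ana', 120.0, 0.5), ('ben', 200.0, 0.5)]
```

The same aggregation translates directly to Pandas `groupby` or a PySpark `groupBy().agg()` when the data lives in Databricks or Snowflake rather than a local database.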
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
AWS/Azure - moving towards Azure). Collaborate with stakeholders and technical teams to deliver solutions that support business growth. Skills & Experience Required: Strong hands-on experience in Python, PySpark, SQL, Jupyter. Experience in Machine Learning engineering or data-focused development. Exposure to working in cloud platforms (AWS/Azure). Ability to collaborate effectively with senior engineers
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
Team Leading experience - REQUIRED/demonstrable on CV (full support from the Engineering Manager is also available). Hands-on development/engineering background. Machine Learning or Data background. Technical Experience: PySpark, Python, SQL, Jupyter. Cloud: AWS, Azure (cloud environment) - moving towards Azure. Nice to Have: Astro/Airflow, Notebook. Reasonable Adjustments: Respect and equality are core values to us. We
PySpark developer - 100% remote - £615 inside IR35. Exalto Consulting are currently recruiting for a contract PySpark developer. This is an inside IR35 contract role paying £615 per day, with 100% remote working and an initial contract until the end of 2025.

Key Duties and Responsibilities:
* Develop and optimise PySpark batch pipelines that process Parquet data and use Delta Lake for all IO, applying validation, enrichment, and billing calculation logic directly in PySpark code.
* Build reliable PySpark jobs that read/write Delta tables on ADLS Gen2.
* Integrate with orchestrators (ADF or Container App Orchestrator) and CI/CD pipelines (GitHub Actions).
* Operate securely within private-network Azure environments (Managed Identity, RBAC, Private Endpoints).
* PySpark with Delta Lake (structured APIs, MERGE, schema evolution).
* Solid knowledge of Azure Synapse Spark pools or Databricks, ADLS Gen2, and Azure SQL.
* Familiarity with ADF orchestration and containerised Spark workloads.

If you have the above experience and are looking for a new contract role, please send your CV for immediate consideration as our client is looking for someone to hire ASAP.
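The "validation, enrichment, and billing calculation logic" in a batch pipeline like the one described is ordinary row-level code; in the real job it would be expressed as PySpark DataFrame operations (a `filter`/`withColumn` chain) over Parquet and Delta tables. A minimal pure-Python sketch — every field name and the tariff rate are invented for illustration:

```python
# Row-level logic of the kind such a batch pipeline applies. In PySpark this
# would be a filter() for validation followed by withColumn() for enrichment.
TARIFF_PER_KWH = 0.24  # hypothetical billing rate, not from the role

def validate(row: dict) -> bool:
    """Reject rows with a missing meter id or non-positive consumption."""
    return row.get("meter_id") is not None and (row.get("kwh") or 0) > 0

def enrich(row: dict) -> dict:
    """Attach a derived billing amount to each valid row."""
    return {**row, "amount": round(row["kwh"] * TARIFF_PER_KWH, 2)}

def run_batch(rows: list) -> list:
    """Validate then enrich, mirroring a filter().withColumn() chain."""
    return [enrich(r) for r in rows if validate(r)]

batch = [
    {"meter_id": "m1", "kwh": 100.0},
    {"meter_id": None, "kwh": 50.0},   # dropped by validation
    {"meter_id": "m2", "kwh": 12.5},
]
result = run_batch(batch)
print(result)
```

Keeping the business logic in small testable functions like these, then translating them to DataFrame expressions, is a common way to make PySpark billing jobs reviewable.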
Data Developer for an urgent contract assignment. Key Requirements: Proven background in AI and data development. Strong proficiency in Python, including data-focused libraries such as Pandas, NumPy, and PySpark. Hands-on experience with Apache Spark (PySpark preferred). Solid understanding of data management and processing pipelines. Experience in algorithm development and graph data structures is advantageous. Active SC
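The "algorithm development and graph data structures" requirement above typically means adjacency-list graphs and standard traversals. A minimal stdlib sketch (the graph itself is invented for illustration):

```python
from collections import deque

def bfs_order(graph: dict, start: str) -> list:
    """Breadth-first traversal over an adjacency-list graph."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nbr in graph.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return order

# Tiny diamond-shaped example graph.
g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs_order(g, "a"))  # ['a', 'b', 'c', 'd']
```

The same adjacency-list shape scales up to GraphFrames or NetworkX when the graphs no longer fit in a single dict.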
PySpark developer (CTC Clearance) - 100% remote - £615 inside IR35. Exalto Consulting are currently recruiting for a contract PySpark developer. This is an inside IR35 contract role paying £615 per day, with 100% remote working and an initial contract until the end of 2025; candidates must be CTC cleared.

Key Duties and Responsibilities:
* Develop and optimise PySpark batch pipelines that process Parquet data and use Delta Lake for all IO, applying validation, enrichment, and billing calculation logic directly in PySpark code.
* Build reliable PySpark jobs that read/write Delta tables on ADLS Gen2.
* Integrate with orchestrators (ADF or Container App Orchestrator) and CI/CD pipelines (GitHub Actions).
* Operate securely within private-network Azure environments (Managed Identity, RBAC, Private Endpoints).
* PySpark with Delta Lake (structured APIs, MERGE, schema evolution).
* Solid knowledge of Azure Synapse Spark pools or Databricks, ADLS Gen2, and Azure SQL.
* Familiarity with ADF orchestration and containerised Spark workloads.

If you have the above experience and are looking for a new contract role, please send your CV for immediate consideration as our client is looking for someone to hire ASAP.
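The "MERGE, schema evolution" skill named above refers to Delta Lake's upsert: matched keys are updated, unmatched keys are inserted, and with schema merging enabled, new source columns are added to the target instead of rejected. A stdlib sketch of just those semantics — the table contents are invented, and in PySpark this would be `DeltaTable.merge(...).whenMatchedUpdateAll().whenNotMatchedInsertAll()`:

```python
def merge(target: dict, source_rows: list, key: str) -> dict:
    """Delta-style MERGE semantics: upsert by key, evolving the schema.

    target maps key -> row dict; new columns in source rows are simply
    carried into the target row (schema evolution), mirroring mergeSchema.
    """
    for row in source_rows:
        k = row[key]
        if k in target:
            target[k] = {**target[k], **row}   # whenMatched: update
        else:
            target[k] = dict(row)              # whenNotMatched: insert
    return target

# Invented example table and update batch.
table = {"m1": {"meter_id": "m1", "kwh": 100.0}}
updates = [
    {"meter_id": "m1", "kwh": 110.0, "region": "UK"},  # update + new column
    {"meter_id": "m2", "kwh": 40.0,  "region": "UK"},  # insert
]
merge(table, updates, "meter_id")
print(sorted(table))  # ['m1', 'm2']
```

Modelling the operation this way makes it clear why MERGE needs a deterministic key: two source rows with the same key would otherwise overwrite each other nondeterministically, which Delta rejects at runtime.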
with a focus on performance, scalability, and reliability.

Responsibilities:
* Design and implement robust data migration pipelines using Azure Data Factory, Synapse Analytics, and Databricks
* Develop scalable ETL processes using PySpark and Python
* Collaborate with stakeholders to understand legacy data structures and ensure accurate mapping and transformation
* Ensure data quality, governance, and performance throughout the migration lifecycle
* Document technical processes and support knowledge transfer to internal teams

Required Skills:
* Strong hands-on experience with Azure Data Factory, Synapse, Databricks, PySpark, Python, and SQL
* Proven track record in delivering data migration projects within Azure environments
* Ability to work independently and communicate effectively with technical and non-technical stakeholders
* Previous experience in consultancy or client-facing roles is advantageous
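The "scalable ETL processes" responsibility above follows the extract → transform → load shape regardless of engine. A minimal stdlib sketch of that shape — the CSV layout, the malformed-row handling, and the final aggregate are invented for illustration; in the role the same stages would be ADF-orchestrated PySpark over legacy sources:

```python
import csv
import io

# Invented raw input with one deliberately malformed row.
RAW = "id,amount\n1,10.5\n2,bad\n3,4.5\n"

def extract(text: str) -> list:
    """Extract: parse the raw landed data without judging it."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list) -> list:
    """Transform: enforce types, quarantining bad rows instead of failing the batch."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except ValueError:
            continue  # row 2 ("bad") lands here
    return out

def load(rows: list) -> float:
    """Load: here just an aggregate standing in for a write to the target store."""
    return sum(r["amount"] for r in rows)

total = load(transform(extract(RAW)))
print(total)  # 15.0
```

Keeping the three stages as separate functions is what makes migration pipelines testable: each stage can be validated against legacy samples before the full cutover.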
London, South East, England, United Kingdom Hybrid / WFH Options
Oliver James
I'm currently working with a leading insurance broker who is looking to hire a Lead Azure Data Engineer on an initial 12-month fixed-term … an Azure-based data lakehouse.

Key requirements:
* Proven experience working as a principal or lead data engineer
* Strong background working with large datasets, with proficiency in SQL, Python, and PySpark
* Experience managing and mentoring engineers with varying levels of experience
* Hands-on experience deploying pipelines within Azure Databricks, ideally following the Medallion Architecture framework

Hybrid working: Minimum two days
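The Medallion Architecture named above layers a lakehouse as bronze (raw as landed), silver (cleaned and typed), and gold (business-level aggregates). A stdlib sketch of that layering — the policy records and premium figures are invented, and in Databricks each stage would be a Delta table rather than a Python list:

```python
# Bronze: raw records exactly as landed, bad values and all.
bronze = [
    {"policy": "P1", "premium": "250.0"},
    {"policy": "P2", "premium": "n/a"},     # bad record stays visible in bronze
    {"policy": "P3", "premium": "125.5"},
]

def to_silver(rows: list) -> list:
    """Silver: enforce types, dropping records that fail conversion."""
    silver = []
    for r in rows:
        try:
            silver.append({"policy": r["policy"], "premium": float(r["premium"])})
        except ValueError:
            continue
    return silver

def to_gold(rows: list) -> dict:
    """Gold: a business-level aggregate ready for reporting."""
    return {"policies": len(rows), "total_premium": sum(r["premium"] for r in rows)}

gold = to_gold(to_silver(bronze))
print(gold)  # {'policies': 2, 'total_premium': 375.5}
```

The design point the layering buys: bronze is never mutated, so silver and gold can always be rebuilt when cleaning rules change, which is why the framework suits phased lakehouse builds like the one described.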