data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance (GxP and other standards). Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD, code reviews, and …
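For illustration, a minimal sketch of the kind of Databricks/PySpark pipeline described above. The source path, column names, and target table are hypothetical; a production pipeline would add configuration, error handling, and tests.

# Minimal PySpark pipeline sketch (hypothetical names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Read raw landing data and keep only valid, de-duplicated rows.
raw = spark.read.format("json").load("/mnt/landing/orders/")
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Write to a curated Delta table, partitioned for downstream query performance.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("order_date")
      .saveAsTable("curated.orders"))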
City of London, London, United Kingdom Hybrid / WFH Options
Peaple Talent
having delivered in Microsoft Azure. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using Big Data technologies such as Apache Spark or PySpark. Great communication skills, effectively engaging with senior stakeholders. Nice to haves: Azure Data Engineering certifications; Databricks certifications. What's in it for you: 📍Location: London (Hybrid) 💻Remote working: Occasional …
is optimized. YOUR BACKGROUND AND EXPERIENCE: 5 years of commercial experience working as a Data Engineer. 3 years' exposure to the Azure stack: Databricks, Synapse, ADF. Python and PySpark. Airflow for orchestration. Test-Driven Development and Automated Testing. ETL Development. …
City of London, London, United Kingdom Hybrid / WFH Options
Syntax Consultancy Limited
modelling techniques + data integration patterns. Experience of working with complex data pipelines, large data sets, data pipeline optimization + data architecture design. Implementing complex data transformations using Spark, PySpark or Scala + working with SQL/MySQL databases. Experience with data quality, data governance processes, Git version control + Agile development environments. Azure Data Engineer certification preferred, e.g. …
Atherstone, Warwickshire, West Midlands, United Kingdom Hybrid / WFH Options
Aldi Stores
end-to-end ownership of demand delivery. Provide technical guidance for team members. Provide 2nd or 3rd level technical support. About You: Experience using SQL, SQL Server DB, Python & PySpark. Experience using Azure Data Factory. Experience using Databricks and Cloudsmith. Data warehousing experience. Project management experience. The ability to interact with the operational business and other departments, translating …
City of London, London, United Kingdom Hybrid / WFH Options
Hexegic
and validate data models and outputs. Set up monitoring and ensure data health for outputs. What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark. Previous experience with data analytics software. Ability to scope new integrations and translate user requirements into technical specifications. What’s in it for you? Base salary of …
Winchester, Hampshire, England, United Kingdom Hybrid / WFH Options
Ada Meher
London offices 1-2 days a week – based on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects. Strong experience in Python/PySpark, Databricks & Apache Spark. Hands-on experience with both batch & streaming pipelines. Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform, etc.). Experience designing Data Engineering …
Enforce GDPR-compliant governance and security. Optimize performance and cost of data workflows. Collaborate with teams to deliver clean, structured data. Key Skills Required: Azure data services, Python/PySpark/SQL, data modelling. Power BI (preferred), legal system familiarity (bonus). Strong grasp of UK data regulations. Certifications: Microsoft certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect) desirable …
Knutsford, Cheshire, United Kingdom Hybrid / WFH Options
Experis
front-end development (HTML, Streamlit, Flask). Familiarity with model deployment and monitoring in cloud environments (AWS). Understanding of machine learning lifecycle and data pipelines. Proficiency with Python, PySpark, and big data ecosystems. Hands-on experience with MLOps tools (e.g., MLflow, Airflow, Docker, Kubernetes). Secondary Skills: Experience with RESTful APIs and integrating backend services. All profiles will be reviewed …
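For illustration, a minimal sketch of the MLflow-style experiment tracking referenced above; the experiment name, model, and parameters are hypothetical.

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-classifier")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Log parameters, metrics, and the fitted model so runs are reproducible.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")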
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
DXC Technology
and addressing data science opportunities. Required Skills & Experience: Proven experience in MLOps or DevOps roles within machine learning environments. Strong programming skills in Python, with hands-on experience in PySpark and SQL. Deep understanding of ML lifecycle management and CI/CD best practices. Familiarity with cloud-native ML platforms and scalable deployment strategies. Excellent problem-solving skills and …
Essential Skills Include: Proven leadership and mentoring experience in senior data engineering roles. Expertise in Azure Data Factory, Azure Databricks, and lakehouse architecture. Strong programming skills (Python, T-SQL, PySpark) and test-driven development. Deep understanding of data security, compliance, and tools like Microsoft Purview. Excellent communication and stakeholder management skills. Experience with containerisation and orchestration (e.g., Kubernetes, Azure …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
CDC. Knowledge of public/enterprise cloud technologies (AWS EC2, S3 Bucket, GCP, Azure) is advantageous but not required. Some skills/experience with automated testing frameworks (Java, Python, PySpark, Bitbucket, GitLab, Jenkins) is advantageous but not required. Strong environment management skills. Carbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to …
and development plan beyond generic certifications. Provide a Rough Order of Magnitude (ROM) cost for implementing the proposed roadmap. Essential: Deep expertise in the Databricks Lakehouse Platform, including Python, PySpark, and advanced SQL. Strong practical knowledge of Microsoft Fabric. Proven experience in senior, client-facing roles with a consultancy mindset. Background in technical coaching, mentorship, or skills assessment. Excellent …
Watford, Hertfordshire, East Anglia, United Kingdom
Addition+
testing, including ETL, data warehouse, and reporting validation. Strong hands-on knowledge of AWS data tools (Glue, Redshift, Athena, EMR, Lambda). Confident with Power BI, SQL, Python/PySpark, and QA automation tools. Solid grasp of data governance, GDPR, and data quality standards. Background in Agile delivery and experience leading or mentoring QA teams. Strong analytical mindset with …
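For illustration, a minimal sketch of the kind of automated ETL/data-quality validation described above, written in PySpark; the table name and rules are hypothetical, and in practice such checks would usually run inside a test or data-quality framework.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_quality_checks").getOrCreate()
df = spark.table("warehouse.customer_orders")  # hypothetical warehouse table

# Simple rule-based checks: key completeness, key uniqueness, and value ranges.
checks = {
    "no_null_keys": df.filter(F.col("order_id").isNull()).count() == 0,
    "no_duplicate_keys": df.count() == df.select("order_id").distinct().count(),
    "amounts_non_negative": df.filter(F.col("amount") < 0).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise AssertionError(f"Data quality checks failed: {failed}")
print("All data quality checks passed")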