fast-growing organisation. Key Responsibilities: Design, develop, and maintain scalable data pipelines using SQL and Python (PySpark). Ingest, transform, and curate data from multiple sources into Azure Data Lake and Delta Lake formats. Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into … Skills & Experience: Strong proficiency in SQL for data transformation and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale data processing. Experience building ETL …
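For illustration, the pipeline work this listing describes — ingesting raw files with PySpark and curating them into Delta Lake on Azure Data Lake Storage Gen2 — might look like this minimal sketch. It assumes a Databricks runtime (where the Delta format is available by default); the storage account, container, and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Hypothetical ADLS Gen2 path -- substitute your own storage account/container.
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"

# Ingest CSV from the lake, apply light cleansing, and land it as Delta.
orders = (
    spark.read.option("header", True).csv(raw_path)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount").cast("double") > 0)
)

orders.write.format("delta").mode("append").save(
    "abfss://curated@examplelake.dfs.core.windows.net/orders_delta/"
)
```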
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
first, modern data strategy within a collaborative and forward-thinking environment. Key Responsibilities: Design and develop end-to-end data pipelines (batch and streaming) using Azure Databricks, Spark, and Delta Lake. Implement the Medallion Architecture and ensure consistency across raw, enriched, and curated data layers. Build and optimise ETL/ELT processes using Azure Data Factory and PySpark. Enforce … stakeholders to ensure data quality and usability. Contribute to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control tools. …
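As a rough sketch of the medallion pattern this role references — raw (bronze), enriched (silver), and curated (gold) Delta layers — assuming a Databricks environment; the paths and fields are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
base = "abfss://lake@example.dfs.core.windows.net"  # hypothetical storage root

# Bronze: raw ingest, stored as-is for replayability.
bronze = spark.read.json(f"{base}/landing/events/")
bronze.write.format("delta").mode("append").save(f"{base}/bronze/events")

# Silver: cleaned and conformed.
silver = (
    spark.read.format("delta").load(f"{base}/bronze/events")
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
)
silver.write.format("delta").mode("overwrite").save(f"{base}/silver/events")

# Gold: curated aggregate, ready for BI consumption.
gold = silver.groupBy("event_date").agg(F.count("*").alias("events"))
gold.write.format("delta").mode("overwrite").save(f"{base}/gold/daily_events")
```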
Data Pipeline Development: Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks … various sources (APIs, databases, file systems). Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements. Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads. Data Publishing & Integration: Publish clean, transformed data to Azure Data Lake … for data transformation and processing within Databricks, along with experience building workflows and automation using Databricks Workflows. Azure Data Services: Hands-on experience with Azure services like Azure Data Lake, Azure Blob Storage, and Azure Synapse for data storage, processing, and publication. Data Governance & Security: Familiarity with managing data governance and security using Databricks Unity Catalog, ensuring data is …
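The "versioning and performance for incremental data loads" piece typically maps onto Delta Lake's MERGE and time travel. A hedged sketch, assuming the delta-spark package on a Databricks cluster; the table paths and the customer_id key are hypothetical:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incoming batch of changed rows (hypothetical staging location).
updates = spark.read.format("delta").load("/mnt/staging/customers_updates")

target = DeltaTable.forPath(spark, "/mnt/curated/customers")

# Upsert: update matched rows, insert new ones -- the usual incremental-load shape.
(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Delta versioning: read the table as of an earlier version for auditing.
previous = spark.read.format("delta").option("versionAsOf", 0).load("/mnt/curated/customers")
```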
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
to-end, making meaningful contributions within a small, agile team. Experience We're looking for candidates with: Extensive experience in Data Engineering with a focus on Azure, Databricks, and Delta Lake. Proficiency in Kubernetes, Infrastructure as Code, and Terraform. Expertise in Azure DevOps and a commitment to best practices. A preference for simple, transparent solutions and a drive for …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
Degree in Computer Science, Software Engineering, or similar (applied to Data/Data Specialisation). Extensive experience in Data Engineering, in both Cloud & On-Prem, Big Data and Data Lake environments. Expert knowledge in data technologies, data transformation tools, data governance techniques. Strong analytical and problem-solving abilities. Good understanding of Quality and Information Security principles. Effective communication, ability … monitoring/security is necessary. Significant AWS or Azure hands-on experience. ETL Tools such as Azure Data Factory (ADF) and Databricks or similar ones. Data Lakes: Azure Data, Delta Lake, Data Lake or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. The role comes with an extensive benefits package including a good pension … role. KEYWORDS Lead Data Engineer, Senior Lead Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, On-Prem, Cloud, ETL, Azure Data Factory, ADF, Databricks, Azure Data, Delta Lake, Data Lake. Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this position. …
based data solutions using Databricks, Python, Spark, and Kafka - working on both greenfield initiatives and enhancing high-traffic financial applications. Key Skills & Experience: Strong hands-on experience with Databricks, Delta Lake, Spark Structured Streaming, and Unity Catalog. Advanced Python/PySpark and big data pipeline development. Familiar with event streaming tools (Kafka, Azure Event Hubs). Solid understanding of …
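A minimal sketch of the Spark Structured Streaming pattern named here — consuming Kafka (or Event Hubs via its Kafka-compatible endpoint) into a Delta table. It assumes the Kafka connector is on the classpath, as it is on Databricks; the broker, topic, and paths are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read a stream from Kafka (or Azure Event Hubs via its Kafka endpoint).
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "trades")                     # hypothetical topic
    .load()
    .select(F.col("value").cast("string").alias("payload"))
)

# Continuously append to a Delta table; the checkpoint gives exactly-once sinks.
query = (
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/trades")
    .outputMode("append")
    .start("/mnt/bronze/trades")
)
```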
real interest in doing this properly - not endless meetings and PowerPoints. What you'll be doing: Designing, building, and optimising Azure-based data pipelines using Databricks, PySpark, ADF, and Delta Lake. Implementing a medallion architecture - from raw to curated. Collaborating with analysts to make data business-ready. Applying CI/CD and DevOps best practices (Git, Azure DevOps …
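On the optimisation side, Delta Lake maintenance commands do much of the heavy lifting on Databricks; a short sketch with a hypothetical table name:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows that are frequently filtered together.
spark.sql("OPTIMIZE curated.sales ZORDER BY (customer_id)")

# Remove files no longer referenced by the table (default 7-day retention).
spark.sql("VACUUM curated.sales")
```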
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
on coding experience with Python or PySpark. Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines. Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse. Strong command of SQL. Excellent communication and collaboration skills. What's in It for You: Up to £60,000 salary depending on …
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
user-friendly platforms and products. "Nice To Have": Exposure to Azure or GCP environments. Experience migrating to Kubernetes (especially from legacy orchestrators!) Data platform expertise, knowledge of lakehouse architectures, Delta Lake, or real-time data processing patterns. Experience working in hybrid environments with asynchronous teamwork. Tuning, profiling and optimising Kafka, Spark, or container networking under load. In Return …
data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks). Develop optimised data models and pipelines using Delta Lake and Azure services. Define data standards, policies, and governance practices aligned with compliance. Enable real-time analytics and machine learning use cases across business functions. Ensure data … engineering and architecture. Collaborate with internal stakeholders and third-party vendors. Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms. Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing. Advanced data modelling and ETL/ELT optimisation experience. Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection. Excellent …
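The "optimised data models" responsibility usually means dimensional (star-schema) tables materialised as Delta; a hedged Spark SQL sketch — the schema, names, and partitioning choice are invented for illustration:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# A simple star-schema fact table defined as a Delta table via Spark SQL.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.fact_policy (
        policy_key   BIGINT,
        date_key     INT,
        customer_key BIGINT,
        premium      DECIMAL(12, 2)
    )
    USING DELTA
    PARTITIONED BY (date_key)
""")
```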
Employment Type: Permanent
Salary: £10000 - £85000/annum 33 days holiday, bonus + more
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Hunter Selection
data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks). Develop optimised data models and pipelines using Delta Lake and Azure services. Define data standards, policies, and governance practices aligned with compliance. Enable real-time analytics and machine learning use cases across business functions. Ensure data … engineering and architecture. Collaborate with internal stakeholders and third-party vendors. Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms. Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing. Advanced data modelling and ETL/ELT optimisation experience. Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection. Excellent …
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Hunter Selection
data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks). Develop optimised data models and pipelines using Delta Lake and Azure services. Define data standards, policies, and governance practices aligned with compliance. Enable real-time analytics and machine learning use cases across business functions. Ensure data … engineering and architecture. Collaborate with internal stakeholders and third-party vendors. Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms. Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing. Advanced data modelling and ETL/ELT optimisation experience. Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection. Excellent …
Employment Type: Permanent
Salary: £85000 - £100000/annum 33 days holiday, bonus + more
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Somerset Bridge
the live environment. What you'll be responsible for: Develop and maintain API services using Databricks and Azure. Implement and manage Azure Cache (Redis) and Azure Redis. Utilize Databricks Delta Live Tables for data processing and analytics. Integrate the platform with Snowflake for data storage and retrieval. Collaborate with cross-functional teams to deliver the platform in an agile … deployment. What you'll need: Experience in ML Ops engineering, with a focus on Azure and Databricks. Knowledge of Postgres, Azure Cache (Redis) and Azure Redis. Experience with Databricks Delta Live Tables and Snowflake. Experience in Data (Delta) Lake Architecture. Experience with Docker and Azure Container Services. Familiarity with API service development and orchestration. Strong problem-solving …
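Databricks Delta Live Tables pipelines are declared in Python with the dlt module; a minimal sketch, assuming it runs inside a DLT pipeline (where spark is provided by the runtime) — the paths, table names, and expectation are hypothetical:

```python
import dlt
from pyspark.sql import functions as F

# `spark` is provided implicitly by the Delta Live Tables runtime.

@dlt.table(comment="Raw quotes ingested from the landing zone")
def quotes_raw():
    return spark.read.json("/mnt/landing/quotes/")  # hypothetical path

@dlt.table(comment="Cleaned quotes for downstream use")
@dlt.expect_or_drop("valid_premium", "premium > 0")  # drop rows failing the check
def quotes_clean():
    return dlt.read("quotes_raw").withColumn("quoted_at", F.to_timestamp("quoted_at"))
```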
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
the business to build relationships and act as a multiplier. What we look for in you: Understand, assess and effectively apply modern data architectures (dimensional model, data mesh, data lake). Experience in applying and using data observability methods effectively. Experience in modern software development practices (agile, CI/CD, DevOps, infrastructure as code, observability). Experience applying DORA … developing applications using most of the following: Strong knowledge of SQL and Python programming. Extensive experience working within a cloud environment. Experience with big data technologies (e.g. Spark, Databricks, Delta Lake, BigQuery). Experience with alternative data technologies (e.g. duckdb, polars, daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep understanding of file formats and … their behaviour such as Parquet, Delta and Iceberg. What we offer: We want to give you a great work environment; contribute back to both your personal and professional development; and give you great benefits to make your time at RVU even more enjoyable. Some of these benefits include: Employer matching pension up to 7.5%. Hybrid approach of in …
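The "alternative data technologies" mentioned (duckdb, polars) trade cluster overhead for in-process speed; a quick hedged example querying a Parquet file with duckdb — the file and columns are hypothetical:

```python
import duckdb

# Query a Parquet file in place -- no Spark cluster needed for small jobs.
con = duckdb.connect()
rows = con.execute("""
    SELECT channel, COUNT(*) AS visits
    FROM read_parquet('events.parquet')   -- hypothetical file
    GROUP BY channel
    ORDER BY visits DESC
""").fetchall()
for channel, visits in rows:
    print(channel, visits)
```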
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Somerset Bridge
actions. Collaborate with cross-functional teams to deliver the platform in an agile manner. Provide guidance on the implementation and management of Azure Cache (Redis), Postgres, Azure Redis, Databricks Delta Live Tables, and Snowflake. Ensure the platform supports microservices and API-driven architecture with sub-2-second calls. Develop and maintain documentation, architecture diagrams, and other technical artifacts. Manage … you'll need: Proven experience in ML Ops engineering, with a focus on Azure and Databricks. Strong knowledge of Postgres, Azure Cache (Redis) and Azure Redis. Experience with Databricks Delta Live Tables and Snowflake. Experience with Docker and Azure Container Services. Familiarity with API service development and orchestration. Experience in Data (Delta) Lake Architecture (Azure). Excellent …
stakeholders. Expertise in designing and documenting data architectures (e.g., data warehouses, lakehouses, master/reference data models). Hands-on experience with Azure Databricks, including: Workspace and cluster configuration. Delta Lake table design and optimization. Integration with Unity Catalog for metadata management. Proficiency with Unity Catalog, including: Setting up data lineage and governance policies. Managing access controls and …
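Unity Catalog access control of the kind described is mostly declarative SQL over three-level names (catalog.schema.table); a small sketch with hypothetical principals and tables:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Grant a group read access to a governed table (names are hypothetical).
spark.sql("GRANT SELECT ON TABLE main.finance.transactions TO `analysts`")

# Transfer table ownership to the platform team.
spark.sql("ALTER TABLE main.finance.transactions SET OWNER TO `data-platform-team`")
```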
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Unolabs
being phased out by late 2025. This project involves migrating pipelines and platform infrastructure running on PVC to the Databricks Enterprise Edition (E2) using EC2-based compute, which enables Delta Lake, advanced security, and scalable architecture. The project also includes decomposing and migrating core components of a centralized automation framework, a multi-repository CI/CD system that … cloud-native environments. Strong experience with AWS services including EC2, IAM, ECR, S3, and Autoscaling. Deep knowledge of Databricks, including workspace setup, instance profiles, notebook CI/CD, and Delta Lake capabilities. Hands-on with Terraform for infrastructure provisioning and GitLab/Jenkins for pipeline orchestration. Experience with Docker, container image lifecycle, and secure CI/CD container …
Jenkins). Familiarity with large-scale data management and engineering best practices. Bonus Points For: Workflow orchestration tools like Airflow. Working knowledge of Kafka and Kafka Connect. Experience with Delta Lake and lakehouse architectures. Proficiency in data serialization formats: JSON, XML, Parquet, YAML. Cloud-based data services experience. Ready to build the future of data? If you're …
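For the Airflow bonus point, orchestration boils down to a DAG of tasks; a minimal hedged sketch, assuming Airflow 2.4+ (for the `schedule` parameter) — the DAG id, schedule, and task body are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    # Placeholder for a real extract/load step.
    print("moving data…")

# A minimal daily DAG -- names and schedule are illustrative only.
with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```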