fast-growing organisation. Key Responsibilities: Design, develop, and maintain scalable data pipelines using SQL and Python (PySpark). Ingest, transform, and curate data from multiple sources into Azure Data Lake and Delta Lake formats. Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into … Skills & Experience: Strong proficiency in SQL for data transformation and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale data processing. Experience building ETL More ❯
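As a rough illustration of the pipeline work described above, the sketch below ingests a raw CSV from Azure Data Lake Storage Gen2 and writes it out in Delta format with PySpark. The storage account, container and path names are hypothetical placeholders, and it assumes a Databricks-style environment where Delta Lake support and ADLS credentials are already configured.

```python
from pyspark.sql import SparkSession, functions as F

# Assumes a Databricks-style environment with Delta Lake and ADLS Gen2 access
# already configured; the abfss paths below are hypothetical placeholders.
spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_delta/"

orders = (
    spark.read
    .option("header", "true")
    .csv(raw_path)
    .withColumn("ingested_at", F.current_timestamp())  # simple lineage column
)

# Writing as Delta gives downstream Databricks workloads ACID tables and time travel.
(
    orders.write
    .format("delta")
    .mode("overwrite")
    .save(curated_path)
)
```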
real interest in doing this properly - not endless meetings and PowerPoints. What you'll be doing: Designing, building, and optimising Azure-based data pipelines using Databricks, PySpark, ADF, and Delta Lake. Implementing a medallion architecture - from raw to curated. Collaborating with analysts to make data business-ready. Applying CI/CD and DevOps best practices (Git, Azure DevOps More ❯
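A minimal sketch of the medallion flow mentioned above (raw/bronze through curated/gold), assuming a Databricks environment with Delta Lake available; the table paths and column names are invented for illustration. Each layer is persisted as its own Delta table so downstream consumers only ever read curated data.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative medallion flow: raw (bronze) -> cleaned (silver) -> curated (gold).
# Paths and columns are hypothetical; a real pipeline would be driven by the
# organisation's own schemas and orchestration (e.g. ADF or Databricks Jobs).
spark = SparkSession.builder.getOrCreate()

bronze = spark.read.format("delta").load("/mnt/lake/bronze/transactions")

silver = (
    bronze
    .dropDuplicates(["transaction_id"])
    .filter(F.col("amount").isNotNull())
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/transactions")

gold = (
    silver
    .groupBy("customer_id", F.to_date("transaction_ts").alias("transaction_date"))
    .agg(F.sum("amount").alias("daily_spend"))
)
gold.write.format("delta").mode("overwrite").save("/mnt/lake/gold/daily_spend")
```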
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
Degree in Computer Science, Software Engineering, or similar (applied to Data/Data Specialisation). Extensive experience in Data Engineering, in both Cloud & On-Prem, Big Data and Data Lake environments. Expert knowledge in data technologies, data transformation tools, and data governance techniques. Strong analytical and problem-solving abilities. Good understanding of Quality and Information Security principles. Effective communication, ability … monitoring/security is necessary. Significant AWS or Azure hands-on experience. ETL tools such as Azure Data Factory (ADF) and Databricks, or similar. Data Lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. The role comes with an extensive benefits package including a good pension … role. KEYWORDS Lead Data Engineer, Senior Lead Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, On-Prem, Cloud, ETL, Azure Data Factory, ADF, Databricks, Azure Data Lake, Delta Lake, Data Lake. Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this position. More ❯
based data solutions using Databricks, Python, Spark, and Kafka - working on both greenfield initiatives and enhancing high-traffic financial applications. Key Skills & Experience: Strong hands-on experience with Databricks, Delta Lake, Spark Structured Streaming, and Unity Catalog. Advanced Python/PySpark and big data pipeline development. Familiar with event streaming tools (Kafka, Azure Event Hubs). Solid understanding of More ❯
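To illustrate the streaming side of the stack above, here is a hedged Spark Structured Streaming sketch that reads from a Kafka topic and appends to a Delta table; the broker address, topic name, and paths are hypothetical, and it assumes a Databricks-style environment with the Kafka connector and Delta Lake available.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical broker, topic and paths - illustrative only.
spark = SparkSession.builder.getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "payments")
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("event_ts"),
    )
)

# Continuously append the parsed stream to a bronze Delta table, with
# checkpointing so the query can restart exactly where it left off.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/lake/_checkpoints/payments")
    .outputMode("append")
    .start("/mnt/lake/bronze/payments")
)
```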
have: Hands-on architecture and implementation experience with Microsoft Fabric (preferred), along with familiarity with Azure Synapse and Databricks. Experience in core data platform technologies and methods including Spark, Delta Lake, Medallion Architecture, pipelines, etc. Experience leading medium to large-scale cloud data platform implementations, guiding teams through technical challenges and ensuring alignment with best practices. Expertise in More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
to-end, making meaningful contributions within a small, agile team. Experience: We're looking for candidates with: Extensive experience in Data Engineering with a focus on Azure, Databricks, and Delta Lake. Proficiency in Kubernetes, Infrastructure as Code, and Terraform. Expertise in Azure DevOps and a commitment to best practices. A preference for simple, transparent solutions and a drive for More ❯
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
on coding experience with Python or PySpark. Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines. Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse. Strong command of SQL. Excellent communication and collaboration skills. What's in It for You: Up to £60,000 salary depending on More ❯
data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks). Develop optimised data models and pipelines using Delta Lake and Azure services (see the illustrative sketch below). Define data standards, policies, and governance practices aligned with compliance. Enable real-time analytics and machine learning use cases across business functions. Ensure data … engineering and architecture. Collaborate with internal stakeholders and third-party vendors. Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms. Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing. Advanced data modelling and ETL/ELT optimisation experience. Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection. Excellent More ❯
Employment Type: Permanent
Salary: £10000 - £85000/annum 33 days holiday, bonus + more
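The data modelling and ELT optimisation skills referenced in the listing above often come down to incremental upserts rather than full table rewrites. Below is a minimal, illustrative Delta Lake MERGE sketch using the delta-spark API; the table paths and join key are hypothetical and assume a Databricks-style environment.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Hypothetical paths; assumes delta-spark is available (as on Databricks).
spark = SparkSession.builder.getOrCreate()

updates = spark.read.format("delta").load("/mnt/lake/silver/customers_daily")

target = DeltaTable.forPath(spark, "/mnt/lake/gold/customers")

# Incremental ELT: merge the latest batch into the curated table instead of
# rewriting it, keeping the gold layer consistent and cheap to refresh.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```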
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Hunter Selection
data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks). Develop optimised data models and pipelines using Delta Lake and Azure services. Define data standards, policies, and governance practices aligned with compliance. Enable real-time analytics and machine learning use cases across business functions. Ensure data … engineering and architecture. Collaborate with internal stakeholders and third-party vendors. Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms. Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing. Advanced data modelling and ETL/ELT optimisation experience. Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection. Excellent More ❯
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Hunter Selection
data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks). Develop optimised data models and pipelines using Delta Lake and Azure services. Define data standards, policies, and governance practices aligned with compliance. Enable real-time analytics and machine learning use cases across business functions. Ensure data … engineering and architecture. Collaborate with internal stakeholders and third-party vendors. Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms. Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing. Advanced data modelling and ETL/ELT optimisation experience. Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection. Excellent More ❯
Employment Type: Permanent
Salary: £85000 - £100000/annum 33 days holiday, bonus + more
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Somerset Bridge
the live environment. What you'll be responsible for: Develop and maintain API services using Databricks and Azure. Implement and manage Azure Cache for Redis. Utilize Databricks Delta Live Tables for data processing and analytics. Integrate the platform with Snowflake for data storage and retrieval. Collaborate with cross-functional teams to deliver the platform in an agile … deployment. What you'll need: Experience in ML Ops engineering, with a focus on Azure and Databricks. Knowledge of Postgres and Azure Cache for Redis. Experience with Databricks Delta Live Tables and Snowflake. Experience in Data (Delta) Lake Architecture. Experience with Docker and Azure Container Services. Familiarity with API service development and orchestration. Strong problem-solving More ❯
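For context on the Delta Live Tables responsibility above, here is a minimal illustrative DLT sketch. It only runs inside a Databricks Delta Live Tables pipeline (where the `dlt` module and `spark` session are provided by the platform), and the table names, paths, and columns are placeholders, not taken from the role.

```python
import dlt
from pyspark.sql import functions as F

# Bronze: ingest raw quote events landed in the lake (path is a placeholder).
@dlt.table(comment="Raw quote events ingested from the landing zone")
def quotes_bronze():
    return spark.read.format("json").load("/mnt/landing/quotes/")

# Silver: cleaned, deduplicated records built from the bronze table.
@dlt.table(comment="Cleaned quotes ready for analytics")
def quotes_silver():
    return (
        dlt.read("quotes_bronze")
        .dropDuplicates(["quote_id"])
        .withColumn("quoted_at", F.to_timestamp("quoted_at"))
    )
```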
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
the business to build relationships and act as a multiplier. What we look for in you: Understand, assess and effectively apply modern data architectures (dimensional model, data mesh, data lake). Experience in applying and using data observability methods effectively. Experience in modern software development practices (agile, CI/CD, DevOps, infrastructure as code, observability). Experience applying DORA … developing applications using most of the following: Strong knowledge of SQL and Python programming. Extensive experience working within a cloud environment. Experience with big data technologies (e.g. Spark, Databricks, Delta Lake, BigQuery). Experience with alternative data technologies (e.g. duckdb, polars, daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep understanding of file formats and … their behaviour such as Parquet, Delta and Iceberg. What we offer: We want to give you a great work environment; contribute back to both your personal and professional development; and give you great benefits to make your time at RVU even more enjoyable. Some of these benefits include: Employer matching pension up to 7.5%. Hybrid approach of in More ❯
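As a small illustration of the "alternative data technologies" and file formats named above, the sketch below queries a Parquet file with DuckDB and Polars; the file name and columns are hypothetical. These single-node tools complement Spark-scale engines for smaller, local analytical workloads.

```python
import duckdb
import polars as pl

# Hypothetical local file, used only to show the two query styles side by side.
path = "events.parquet"

# DuckDB: run SQL directly over a Parquet file without loading it first.
top_days = duckdb.sql(
    f"SELECT event_date, count(*) AS n FROM '{path}' "
    "GROUP BY event_date ORDER BY n DESC LIMIT 5"
).fetchall()

# Polars: lazy scan of the same file, pushed-down filter, then collect.
recent = (
    pl.scan_parquet(path)
    .filter(pl.col("status") == "active")
    .select(["event_date", "user_id"])
    .collect()
)
```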
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
user-friendly platforms and products. "Nice To Have": Exposure to Azure or GCP environments. Experience migrating to Kubernetes (especially from legacy orchestrators!). Data platform expertise, knowledge of lakehouse architectures, Delta Lake, or real-time data processing patterns. Experience working in hybrid environments with asynchronous teamwork. Tuning, profiling and optimising Kafka, Spark, or container networking under load. In Return More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Somerset Bridge
actions. Collaborate with cross-functional teams to deliver the platform in an agile manner. Provide guidance on the implementation and management of Azure Cache for Redis, Postgres, Databricks Delta Live Tables, and Snowflake. Ensure the platform supports microservices and API-driven architecture with sub-2-second calls. Develop and maintain documentation, architecture diagrams, and other technical artifacts. Manage … you'll need: Proven experience in ML Ops engineering, with a focus on Azure and Databricks. Strong knowledge of Postgres and Azure Cache for Redis. Experience with Databricks Delta Live Tables and Snowflake. Experience with Docker and Azure Container Services. Familiarity with API service development and orchestration. Experience in Data (Delta) Lake Architecture (Azure). Excellent More ❯
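The sub-2-second API requirement above is typically met by caching hot results in Azure Cache for Redis. Below is a hedged sketch using the redis-py client (an assumption - the listing does not name a client library); the hostname, key layout, and scoring stub are placeholders rather than anything from the role.

```python
import json
import redis

# Hypothetical connection details; Azure Cache for Redis normally uses TLS on port 6380.
cache = redis.Redis(
    host="example.redis.cache.windows.net",
    port=6380,
    password="<access-key>",
    ssl=True,
)

def get_score(customer_id: str) -> dict:
    """Return a cached model score, falling back to a (stubbed) recompute."""
    key = f"score:{customer_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit keeps the API call well under 2 seconds

    result = {"customer_id": customer_id, "score": 0.42}  # placeholder computation
    cache.setex(key, 300, json.dumps(result))  # cache the result for 5 minutes
    return result
```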