key role in the design and delivery of advanced Databricks solutions within the Azure ecosystem.

Responsibilities:
- Design, build, and optimise end-to-end data pipelines using Azure Databricks, including Delta Live Tables.
- Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions.
- Drive best practices for data engineering.
- Help clients realise the potential of data science, machine …
- Support with planning, requirements refinement, and work estimation.

Skills & Experience:
- Proven experience designing and implementing data solutions in Azure using Databricks as a core platform.
- Hands-on expertise in Delta Lake, Delta Live Tables and Databricks Workflows.
- Strong coding skills in Python and SQL, with experience developing modular, reusable code in Databricks.
- Deep understanding of lakehouse …
spoke data architectures, optimising for performance, scalability, and security.
- Collaborate with business stakeholders, data engineers, and analytics teams to ensure solutions are fit for purpose.
- Implement and optimise Databricks Delta Lake, Medallion Architecture, and Lakehouse patterns for structured and semi-structured data.
- Ensure best practices in Azure networking, security, and federated data access.

Key Skills & Experience:
- 5+ years of experience in data architecture, solution architecture, or cloud data engineering roles.
- Strong expertise in Azure data services, including:
  - Azure Databricks (Delta Lake, Unity Catalog, MLflow)
  - Azure Event Hubs and Azure Data Explorer for real-time and streaming data pipelines
  - Azure Storage (ADLS Gen2, Blob Storage) for scalable data lakes
  - Azure Purview or equivalent for data governance …
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
solution design.

Requirements:
- Proven experience as a Data Engineer in cloud-first environments.
- Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift).
- Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs, etc.).
- Proficient in SQL (T-SQL/Spark SQL) and Python for data transformation and scripting.
- Hands-on experience with workflow orchestration tools …
Data Platform and Services, you'll not only maintain and optimize our data infrastructure but also spearhead its evolution. Built predominantly on Databricks, and utilizing technologies like PySpark and Delta Lake, our infrastructure is designed for scalability, robustness, and efficiency. You'll take charge of developing sophisticated data integrations with various advertising platforms, empowering our teams with data … decision-making.

What you'll be doing for us:
- Leadership in Design and Development: Lead in the architecture, development, and upkeep of our Databricks-based infrastructure, harnessing PySpark and Delta Lake.
- CI/CD Pipeline Mastery: Create and manage CI/CD pipelines, ensuring automated deployments and system health monitoring.
- Advanced Data Integration: Develop sophisticated strategies for integrating data …
conversations in the team and contribute to deep technical discussions.

Nice to Have:
- Experience with operating machine learning models (e.g., MLflow)
- Experience with Data Lakes, Lakehouses, and Warehouses (e.g., Delta Lake, Redshift)
- DevOps skills, including Terraform and general CI/CD experience
- Previously worked in agile environments
- Experience with expert systems

Perks & Benefits: Comprehensive benefits package, fitness reimbursement, Veeva Work-Anywhere …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
Degree in Computer Science, Software Engineering, or similar (applied to Data/Data Specialisation).
- Extensive experience in Data Engineering, in both Cloud & On-Prem, Big Data and Data Lake environments.
- Expert knowledge of data technologies, data transformation tools, and data governance techniques.
- Strong analytical and problem-solving abilities.
- Good understanding of Quality and Information Security principles.
- Effective communication, ability … monitoring/security is necessary.
- Significant AWS or Azure hands-on experience.
- ETL tools such as Azure Data Factory (ADF) and Databricks, or similar.
- Data Lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse.
- Certifications: AWS, Azure, or Cloudera certifications are a plus.

To be considered for this role you MUST have in-depth experience … role.

KEYWORDS: Lead Data Engineer, Senior Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, Cloud, On-Prem, ETL, Azure Data Factory, ADF, Hadoop, HDFS, Azure Data Lake, Delta Lake, Data Lake.

Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
on coding experience with Python or PySpark
- Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines
- Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse
- Strong command of SQL
- Excellent communication and collaboration skills

What's in It for You: Up to £60,000 salary depending on …
data estate, built primarily on Microsoft Azure and Databricks.

Key Responsibilities:
- Design and implement scalable, secure cloud-based data architecture (Azure & Databricks)
- Develop optimised data models and pipelines using Delta Lake and Azure services
- Define data standards, policies, and governance practices aligned with compliance
- Enable real-time analytics and machine learning use cases across business functions
- Ensure data … engineering and architecture
- Collaborate with internal stakeholders and third-party vendors

Key Skills & Experience:
- Proven background designing and delivering enterprise-scale data platforms
- Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing
- Advanced data modelling and ETL/ELT optimisation experience
- Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection
- Excellent …
Employment Type: Permanent
Salary: £10000 - £85000/annum 33 days holiday, bonus + more
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Somerset Bridge
the live environment.

What you'll be responsible for:
- Develop and maintain API services using Databricks and Azure.
- Implement and manage Azure Cache (Redis) and Azure Redis.
- Utilize Databricks Delta Live Tables for data processing and analytics.
- Integrate the platform with Snowflake for data storage and retrieval.
- Collaborate with cross-functional teams to deliver the platform in an agile … deployment.

What you'll need:
- Experience in ML Ops engineering, with a focus on Azure and Databricks.
- Knowledge of Postgres, Azure Cache (Redis) and Azure Redis.
- Experience with Databricks Delta Live Tables and Snowflake.
- Experience in Data (Delta) Lake Architecture.
- Experience with Docker and Azure Container Services.
- Familiarity with API service development and orchestration.
- Strong problem-solving …
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
user-friendly platforms and products.

Nice to Have:
- Exposure to Azure or GCP environments.
- Experience migrating to Kubernetes (especially from legacy orchestrators!)
- Data platform expertise: knowledge of lakehouse architectures, Delta Lake, or real-time data processing patterns.
- Experience working in hybrid environments with asynchronous teamwork.
- Tuning, profiling and optimising Kafka, Spark, or container networking under load.

In Return …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Somerset Bridge
actions.
- Collaborate with cross-functional teams to deliver the platform in an agile manner.
- Provide guidance on the implementation and management of Azure Cache (Redis), Postgres, Azure Redis, Databricks Delta Live Tables, and Snowflake.
- Ensure the platform supports microservices and API-driven architecture with sub-2-second calls.
- Develop and maintain documentation, architecture diagrams, and other technical artifacts.
- Manage …

What you'll need:
- Proven experience in ML Ops engineering, with a focus on Azure and Databricks.
- Strong knowledge of Postgres, Azure Cache (Redis) and Azure Redis.
- Experience with Databricks Delta Live Tables and Snowflake.
- Experience with Docker and Azure Container Services.
- Familiarity with API service development and orchestration.
- Experience in Data (Delta) Lake Architecture (Azure).
- Excellent …
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Eplass
Dataproc, Cloud Storage, Pub/Sub, Cloud Composer, and Data Catalog.
- Data Architecture: Strong background in designing and implementing data architectures, including lakehouse models and open table formats like Delta Lake or Iceberg.
- Programming Skills: Proficiency in Python, Scala, or Java, focusing on data processing and pipelines.
- Security: Deep understanding of data security practices, including IAM, RBAC, and …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Unolabs
being phased out by late 2025. This project involves migrating pipelines and platform infrastructure running on PVC to the Databricks Enterprise Edition (E2) using EC2-based compute, which enables Delta Lake, advanced security, and scalable architecture. The project also includes decomposing and migrating core components of a centralized automation framework, a multi-repository CI/CD system that … cloud-native environments.

- Strong experience with AWS services including EC2, IAM, ECR, S3, and Autoscaling.
- Deep knowledge of Databricks, including workspace setup, instance profiles, notebook CI/CD, and Delta Lake capabilities.
- Hands-on with Terraform for infrastructure provisioning and GitLab/Jenkins for pipeline orchestration.
- Experience with Docker, container image lifecycle, and secure CI/CD container …