class Lakehouse from scratch - this is the one.

What You'll Be Doing

Lakehouse Engineering (Azure + Databricks)
Engineer scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark, and SparkSQL across a full Medallion Architecture (Bronze → Silver → Gold). Implement ingestion patterns for files, APIs, SaaS platforms (e.g. subscription billing), SQL sources, SharePoint and … Implement monitoring, alerting, SLAs/SLIs, runbooks and cost-optimisation across the platform.

DevOps & Platform Engineering
Build CI/CD pipelines in Azure DevOps for notebooks, Lakeflow pipelines, SQL models and ADF artefacts. Ensure secure, enterprise-grade platform operation across Dev → Prod, using private endpoints, managed identities and Key Vault. Contribute to platform standards, design patterns, code reviews … across the organisation. Influence architecture decisions and uplift engineering maturity within a growing data function.

Tech Stack You'll Work With
Databricks: Lakeflow Declarative Pipelines, Workflows, Unity Catalog, SQL Warehouses
Azure: ADLS Gen2, Data Factory, Key Vault, vNets & Private Endpoints
Languages: PySpark, SparkSQL, Python, Git
DevOps: Azure DevOps Repos, Pipelines, CI/CD
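The Medallion promotion the listing describes (raw Bronze records cleaned into Silver, then aggregated into Gold) can be sketched as follows. This is a minimal, Spark-free stand-in: the record shape, field names, and quality rules are illustrative assumptions, and a real implementation would express the same layering in PySpark or Lakeflow Declarative Pipelines.

```python
# Illustrative Bronze -> Silver -> Gold promotion. Field names
# (customer_id, amount) and the quality rules are hypothetical.

def to_silver(bronze_rows):
    """Clean and validate raw Bronze records into Silver."""
    silver = []
    for row in bronze_rows:
        # Quality checks: drop records with a missing key or non-numeric amount.
        if row.get("customer_id") and isinstance(row.get("amount"), (int, float)):
            silver.append({
                "customer_id": row["customer_id"].strip().upper(),
                "amount": round(float(row["amount"]), 2),
            })
    return silver

def to_gold(silver_rows):
    """Aggregate Silver records into a Gold per-customer summary."""
    totals = {}
    for row in silver_rows:
        totals[row["customer_id"]] = totals.get(row["customer_id"], 0.0) + row["amount"]
    return totals

bronze = [
    {"customer_id": " c1 ", "amount": 10.5},
    {"customer_id": None, "amount": 3.0},    # fails quality check
    {"customer_id": "c2", "amount": 4.25},
    {"customer_id": "C1", "amount": 2.0},
]
print(to_gold(to_silver(bronze)))  # {'C1': 12.5, 'C2': 4.25}
```

The point of the pattern is that each layer is a pure function of the previous one, so quality rules and aggregations stay separately testable.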
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
… from APIs, databases, and financial data sources into Azure Databricks. Optimize pipelines for performance, reliability, and cost, incorporating data quality checks. Develop complex transformations and processing logic using Spark (PySpark/Scala) for cleaning, enrichment, and aggregation, ensuring accuracy and consistency across the data lifecycle. Work extensively with Unity Catalog, Delta Lake, SparkSQL, and related services. Apply best practices for development, deployment, and workload optimization. Program in SQL, Python, R, YAML, and JavaScript. Integrate data from relational databases, APIs, and streaming sources using best-practice patterns. Collaborate with API developers for seamless data exchange. Utilize Azure Purview for governance and quality monitoring. Implement lineage tracking, metadata management, and compliance …
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
… build, and maintain scalable ETL pipelines to ingest, transform, and load data from diverse sources (APIs, databases, files) into Azure Databricks. Implement data cleaning, validation, and enrichment using Spark (PySpark/Scala) and related tools to ensure quality and consistency. Utilize Unity Catalog, Delta Lake, SparkSQL, and best practices for Databricks development, optimization … and deployment. Program in SQL, Python, R, YAML, and JavaScript. Integrate data from multiple sources and formats (CSV, JSON, Parquet, Delta) for downstream analytics, dashboards, and reporting. Apply Azure Purview for governance and quality checks. Monitor pipelines, resolve issues, and enhance data quality processes. Work closely with engineers, data scientists, and stakeholders. Participate in code reviews and clearly …
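Integrating multiple source formats, as this listing asks, usually means normalising them into one record shape before loading. The sketch below covers two of the named formats (CSV and JSON) with the standard library only; Parquet and Delta handling would need pyarrow or Spark and is deliberately omitted. All field names are illustrative.

```python
# Hypothetical normalisation of CSV and JSON sources into one list-of-dicts
# shape, the kind of pre-load step the listing's format list implies.
import csv
import io
import json

def records_from_csv(text):
    """Parse CSV text into a list of dict records (all values are strings)."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def records_from_json(text):
    """Parse JSON text into a list of dict records, wrapping a lone object."""
    data = json.loads(text)
    return data if isinstance(data, list) else [data]

csv_src = "id,region\n1,UK\n2,DE\n"
json_src = '[{"id": "3", "region": "FR"}]'
combined = records_from_csv(csv_src) + records_from_json(json_src)
print([r["region"] for r in combined])  # ['UK', 'DE', 'FR']
```

Downstream dashboards and reports then consume one shape regardless of where each record originated.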
… architecture designs into actionable build plans and lead the development of data processing workflows. From a technical standpoint you must possess:
Extensive Databricks experience including Unity Catalog and SparkSQL
Strong programming skills, preferably in Python and SQL
Strong knowledge and experience in Azure, including working with Azure Data Factory and Azure Storage Accounts …
London, South East, England, United Kingdom Hybrid/Remote Options
UBDS Group
… modelling expertise to develop low-level designs and implement models against business requirements, using design patterns such as Inmon, Kimball and Data Vault. Excellent Databricks, Python, PySpark and SparkSQL knowledge, including writing, testing and quality assuring code, and knowledge of Unity Catalog best practice to govern data assets, as well as Azure and/or …
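One concrete convention behind the Data Vault pattern this listing names is deriving a deterministic hub hash key from a business key, so the same entity always resolves to the same key regardless of source casing or whitespace. The sketch below is a hedged illustration: the MD5 choice, delimiter, and normalisation rules are common conventions, not a prescribed standard.

```python
# Illustrative Data Vault hub hash key: hash over normalised business-key
# parts. Delimiter and normalisation choices are assumptions, not a spec.
import hashlib

def hub_hash_key(*business_key_parts, delimiter="||"):
    """Deterministic MD5 hash over trimmed, upper-cased business-key parts."""
    normalised = delimiter.join(str(p).strip().upper() for p in business_key_parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Same business key in different casing/whitespace -> same hub key.
a = hub_hash_key(" cust-001 ", "uk")
b = hub_hash_key("CUST-001", "UK")
print(a == b)  # True
```

Because the key is computed, not sequence-assigned, hubs loaded in parallel from different sources converge on the same identifier without coordination.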
Reigate, England, United Kingdom Hybrid/Remote Options
esure Group
… influence decisions. Strong understanding of data models and analytics; exposure to predictive modelling and machine learning is a plus. Proficient in SQL and Python, with bonus points for PySpark, SparkSQL, and Git. Skilled in data visualisation with tools such as Tableau or Power BI. Confident writing efficient code and troubleshooting sophisticated queries. Clear and adaptable communicator, able to explain technical topics to business …