3 of 3 Delta Lake Jobs in South Yorkshire

Data Engineer

Hiring Organisation
Quantum World Technologies Inc
Location
Sheffield, UK
Employment Type
Full-time
Lakehouse architecture on Databricks. You will play a key role in shaping our data strategy, ensuring scalability, performance, and governance while working with Delta Lake, Data Catalog, and PySpark.
Key Responsibilities:
Databricks Lakehouse Architecture: Design and implement scalable Databricks Lakehouse solutions with Delta Lake for optimized … analytics.
CI/CD & Automation: Implement Databricks workflows and integrate with Azure/AWS/GCP data ecosystems.
Primary Skills (Must-Have):
Databricks – Architecture, Delta Lake, Lakehouse, Unity Catalog/Data Catalog
PySpark (optimization, UDFs, Delta operations)
SQL (advanced querying, performance tuning)
Data Lake/Warehouse ...
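
To illustrate the kind of PySpark and Delta Lake work this posting lists (UDFs, Delta operations, writing governed lakehouse tables), here is a minimal sketch; the schema, table, and column names are hypothetical and only stand in for whatever the actual lakehouse uses.

    # Minimal PySpark sketch: cleanse a bronze table with a UDF and write it back
    # out as a Delta table. All table and column names here are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

    # Hypothetical bronze-layer source table
    orders = spark.read.table("bronze.orders")

    # Simple UDF for illustration; built-in functions are usually faster where they exist
    normalise_country = F.udf(lambda c: c.strip().upper() if c else None, StringType())

    cleaned = (
        orders
        .withColumn("country", normalise_country(F.col("country")))
        .filter(F.col("order_total") > 0)
    )

    # Persist as a managed Delta table in a hypothetical silver schema
    cleaned.write.format("delta").mode("overwrite").saveAsTable("silver.orders_clean")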

Lead Data Engineer (Azure)

Hiring Organisation
OpenSource
Location
Sheffield, UK
Employment Type
Full-time
data lifecycle — from ingestion and transformation through to analytics and data product delivery. Architecting and operating pipelines using Databricks, Spark, and Delta Lake, ensuring performance, reliability, and cost-efficiency. Working closely with BI developers and analysts to deliver dashboards, extracts, datasets, and APIs that power customer insights. Shaping …
Qualifications
Experience leading or mentoring data engineering teams within a SaaS or product-led environment.
Deep hands-on knowledge of Databricks, Apache Spark, and Delta Lake, including large-scale or near real-time workloads.
Strong proficiency in Python, SQL, and cloud data services (Azure preferred, but any major ...
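
As a sketch of the incremental pipeline work this role describes (ingesting and transforming data into Delta Lake reliably), the following shows an upsert using the Delta Lake MERGE API from Python; the table names and join key are assumptions, not details taken from the posting.

    # Hedged sketch of an incremental upsert step in a Databricks/Delta pipeline.
    # Table names and the event_id key are assumptions for illustration only.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical batch of newly ingested records
    updates = spark.read.table("staging.customer_events")

    target = DeltaTable.forName(spark, "silver.customer_events")

    # Upsert: update rows that already exist on the business key, insert the rest
    (
        target.alias("t")
        .merge(updates.alias("s"), "t.event_id = s.event_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )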

Senior Data Engineer

Hiring Organisation
Tenth Revolution Group
Location
Doncaster, South Yorkshire, UK
Employment Type
Full-time
Spark SQL notebooks for advanced data cleansing, transformation, and business logic
Create production-ready semantic models for Power BI, including star schemas and Direct Lake optimisation
Implement governance and security controls (RLS, OLS, service principals)
Drive CI/CD best practices using Azure DevOps or GitHub Actions
Engage with … clients through workshops, documentation, and technical knowledge transfer
Required Experience
Deep expertise in Microsoft Fabric (Lakehouse, Data Factory, Pipelines, Notebooks, Delta Lake, Semantic Models)
Strong PySpark/Spark SQL skills for large-scale transformations
Experience integrating with Dynamics 365, Dataverse, and Business Central
Proven ability to implement ...
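
For context on the notebook and semantic-model work this role describes, here is a small Spark SQL sketch (run from PySpark) that cleanses a source table into a star-schema dimension stored as Delta, the format a Direct Lake model reads; the lakehouse schemas, table, and column names are hypothetical.

    # Illustrative notebook cell: Spark SQL cleansing into a star-schema dimension.
    # The gold/silver schemas and all column names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    spark.sql("""
        CREATE OR REPLACE TABLE gold.dim_customer
        USING DELTA
        AS
        SELECT DISTINCT
            customer_id,
            initcap(trim(customer_name)) AS customer_name,
            upper(country_code)          AS country_code
        FROM silver.customers
        WHERE customer_id IS NOT NULL
    """)

    # A Power BI semantic model in Direct Lake mode can then read gold.dim_customer
    # straight from the lakehouse without an import refresh.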