Remote Delta Lake Jobs in the UK

14 of 14 Remote Delta Lake Jobs in the UK

Technical Data Architect

London, United Kingdom
Hybrid / WFH Options
Espire Infolabs Limited
Key Responsibilities:
- Lead the design, development, and maintenance of scalable, high-performance data pipelines on Databricks.
- Architect and implement data ingestion, transformation, and integration workflows using PySpark, SQL, and Delta Lake.
- Guide the team in migrating legacy ETL processes to modern cloud-based data pipelines.
- Ensure data accuracy, schema consistency, row counts, and KPIs during migration and transformation (see the sketch after this listing).
- Collaborate … and analytics.

Required Skills & Qualifications:
- 10-12 years of experience in data engineering, with at least 3 years in a technical lead role.
- Strong expertise in Databricks, PySpark, Delta Lake, and DBT.
- Advanced proficiency in SQL, ETL/ELT pipelines, and data modelling.
- Experience with Azure Data Services (ADLS, ADF, Synapse) or other major cloud platforms (AWS …
- … of data warehousing, transformation logic, SLAs, and dependencies.
- Hands-on experience with real-time streaming or near-real-time batch is a plus, as is optimisation of Databricks and DBT workloads and Delta Lake.
- Familiarity with CI/CD pipelines, DevOps practices, and Git-based workflows.
- Knowledge of data security, encryption, and compliance frameworks (GDPR, SOC2, ISO) is good to have.
- Excellent …
Employment Type: Permanent, Work From Home
Posted:
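
The listing above stresses validating row counts, schema consistency, and KPIs when migrating legacy ETL onto Databricks and Delta Lake. Below is a minimal sketch of that kind of check, illustrative only; the JDBC connection, paths, table and column names are hypothetical placeholders, not the client's actual objects.

```python
from pyspark.sql import SparkSession

# Minimal sketch: compare a legacy source against its migrated Delta table.
# Connection details, paths, and column names are hypothetical placeholders.
spark = SparkSession.builder.appName("migration-validation").getOrCreate()

legacy = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://legacy-host;databaseName=sales")  # placeholder connection
    .option("dbtable", "dbo.orders")
    .load()
)
migrated = spark.read.format("delta").load("/mnt/lake/silver/orders")

# 1. Row-count parity between source and target
assert legacy.count() == migrated.count(), "Row counts differ after migration"

# 2. Schema consistency (column names; exact types may map differently across engines)
assert set(legacy.columns) == set(migrated.columns), "Column sets differ after migration"

# 3. Spot-check a KPI, e.g. total order value per day, on both sides
legacy_kpi = legacy.groupBy("order_date").agg({"order_value": "sum"})
migrated_kpi = migrated.groupBy("order_date").agg({"order_value": "sum"})
assert legacy_kpi.exceptAll(migrated_kpi).count() == 0, "Daily KPI mismatch after migration"
```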

Data Engineer – Azure / Databricks – Birmingham / Solihull

Birmingham, West Midlands, England, United Kingdom
Hybrid / WFH Options
MYO Talent
Data Engineer / Data Engineering / Data Consultant / Lakehouse / Delta Lake / Data Warehousing / ETL / Azure / Azure Databricks / Python / SQL. Based in the West Midlands / Solihull / Birmingham area. Permanent role, £ + car/allowance (£5,000) + 15% bonus. One of our leading clients is looking to recruit … role.

Salary: £ + car/allowance + bonus

Experience:
- Data Engineer / Data Engineering role
- Large and complex datasets
- Azure, Azure Databricks
- Microsoft SQL Server
- Lakehouse, Delta Lake
- Data Warehousing
- ETL
- Database Design
- Python / PySpark
- Azure Blob Storage
- Azure Data Factory

Desirable:
- Exposure to ML / Machine Learning / AI / Artificial Intelligence
Employment Type: Full-Time
Salary: £60,000 - £70,000 per annum
Posted:

PySpark Developer - 100% remote - £615 inside IR35

London, United Kingdom
Hybrid / WFH Options
Exalto Consulting ltd
£615 per day, 100% remote working, and an initial contract until the end of 2025.

Key Duties and Responsibilities:
- Develop and optimise PySpark batch pipelines that process Parquet data and use Delta Lake for all IO, applying validation, enrichment, and billing calculation logic directly in PySpark code.
- Build reliable PySpark jobs that read/write Delta tables on ADLS Gen2 (see the sketch after this listing).
- … with orchestrators (ADF or Container App Orchestrator) and CI/CD pipelines (GitHub Actions).
- Operate securely within private-network Azure environments (Managed Identity, RBAC, Private Endpoints).
- PySpark with Delta Lake (structured APIs, MERGE, schema evolution).
- Solid knowledge of Azure Synapse Spark pools or Databricks, ADLS Gen2, and Azure SQL.
- Familiarity with ADF orchestration and containerised Spark …
Employment Type: Contract, Work From Home
Posted:
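
The role above calls out PySpark with Delta Lake structured APIs, MERGE, and schema evolution against Delta tables on ADLS Gen2. The sketch below shows that general pattern under assumed names only; the storage account, containers, columns, and billing logic are hypothetical placeholders, not the client's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

# Minimal sketch: read Parquet, apply simple validation/enrichment, then MERGE
# into a Delta table on ADLS Gen2 with schema evolution enabled.
# Storage account, containers, and column names are hypothetical placeholders.
spark = (
    SparkSession.builder.appName("billing-batch")
    # let MERGE add any new columns that appear in the incoming batch
    .config("spark.databricks.delta.schema.autoMerge.enabled", "true")
    .getOrCreate()
)

source = spark.read.parquet("abfss://landing@examplestore.dfs.core.windows.net/usage/2025-06-01/")

# validation and billing calculation applied directly in PySpark
enriched = (
    source.filter(F.col("usage_kwh") >= 0)
          .withColumn("billed_amount", F.col("usage_kwh") * F.col("unit_price"))
)

# upsert into the Delta target keyed on meter id and reading date
target = DeltaTable.forPath(spark, "abfss://billing@examplestore.dfs.core.windows.net/delta/usage")
(
    target.alias("t")
    .merge(enriched.alias("s"), "t.meter_id = s.meter_id AND t.reading_date = s.reading_date")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

Keying the MERGE on a natural key (here, meter id plus reading date) is what makes re-running the batch idempotent, which is the usual reason Delta MERGE is preferred over blind appends in billing pipelines.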

PySpark Developer (CTC Clearance) - 100% remote - £615 inside IR35

United Kingdom
Hybrid / WFH Options
Exalto Consulting
100% remote working and an initial contract until the end of 2025; must be CTC cleared.

Key Duties and Responsibilities:
- Develop and optimise PySpark batch pipelines that process Parquet data and use Delta Lake for all IO, applying validation, enrichment, and billing calculation logic directly in PySpark code.
- Build reliable PySpark jobs that read/write Delta tables on ADLS …
- … with orchestrators (ADF or Container App Orchestrator) and CI/CD pipelines (GitHub Actions).
- Operate securely within private-network Azure environments (Managed Identity, RBAC, Private Endpoints).
- PySpark with Delta Lake (structured APIs, MERGE, schema evolution).
- Solid knowledge of Azure Synapse Spark pools or Databricks, ADLS Gen2, and Azure SQL.
- Familiarity with ADF orchestration and containerised Spark …
Employment Type: Contract
Rate: GBP Daily
Posted:

Data Engineer - Databricks

City of London, London, United Kingdom
Hybrid / WFH Options
ECS
… engineering, with a strong focus on building scalable data pipelines.
- Expertise in Azure Databricks, including building and managing ETL pipelines using PySpark or Scala.
- Solid understanding of Apache Spark, Delta Lake, and distributed data processing concepts (see the sketch after this listing).
- Hands-on experience with Azure Data Lake Storage, Azure Data Factory, and Azure Synapse Analytics.
- Proficiency in SQL and Python for …
Employment Type: Contract, Work From Home
Rate: £600 - £650 per day
Posted:
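
For the Databricks skills listed above (Spark, Delta Lake, ADLS, distributed processing), a minimal illustrative step might look like the following: raw files landed in ADLS are cleaned and written out as a date-partitioned Delta table. The storage account, containers, and column names are hypothetical placeholders, not this client's environment.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch of a bronze-to-silver ETL step on Databricks-style storage.
# Storage account, containers, and column names are hypothetical placeholders.
spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .csv("abfss://bronze@examplestore.dfs.core.windows.net/transactions/")
)

silver = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("txn_date", F.to_date("txn_timestamp"))
       .dropDuplicates(["txn_id"])
)

# partitioning by date keeps each distributed task working on a bounded slice
(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("txn_date")
    .save("abfss://silver@examplestore.dfs.core.windows.net/delta/transactions/")
)
```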

Data Engineer (Must have Databricks!)

London Area, United Kingdom
Hybrid / WFH Options
Robert Half
… years as a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security (see the sketch after this listing).
Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure.
You'll: Build and optimise ETL pipelines, enable analytics and reporting teams, drive automation and best practices. Why …
Posted:
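
The "data quality, governance, and security" requirement above can be illustrated with a small, hedged sketch: a handful of PySpark checks gating a table before it reaches reporting teams. The table name, columns, and thresholds are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch of pre-publication data-quality checks in PySpark.
# Table, columns, and thresholds are hypothetical placeholders.
spark = SparkSession.builder.getOrCreate()
df = spark.read.table("analytics.customer_orders")

checks = {
    "no_null_order_ids": df.filter(F.col("order_id").isNull()).count() == 0,
    "no_negative_totals": df.filter(F.col("order_total") < 0).count() == 0,
    "fresh_data_present": df.filter(F.col("order_date") >= F.date_sub(F.current_date(), 7)).count() > 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```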

Data Engineer (Must have Databricks!)

City of London, London, United Kingdom
Hybrid / WFH Options
Robert Half
… years as a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security.
Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure.
You'll: Build and optimise ETL pipelines, enable analytics and reporting teams, drive automation and best practices. Why …
Posted:

Data Engineer (Must have Databricks!)

London, South East England, United Kingdom
Hybrid / WFH Options
Robert Half
… years as a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security.
Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure.
You'll: Build and optimise ETL pipelines, enable analytics and reporting teams, drive automation and best practices. Why …
Posted:

Data Engineer (Must have Databricks!)

Slough, South East England, United Kingdom
Hybrid / WFH Options
Robert Half
… years as a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security.
Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure.
You'll: Build and optimise ETL pipelines, enable analytics and reporting teams, drive automation and best practices. Why …
Posted:

Data Engineer (Must have Databricks!)

London (City of London), South East England, United Kingdom
Hybrid / WFH Options
Robert Half
… years as a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security.
Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure.
You'll: Build and optimise ETL pipelines, enable analytics and reporting teams, drive automation and best practices. Why …
Posted:

Data Engineer (Must have Databricks!)

United Kingdom
Hybrid / WFH Options
Robert Half
… years as a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security.
Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure.
You'll: Build and optimise ETL pipelines, enable analytics and reporting teams, drive automation and best practices. Why …
Employment Type: Permanent, Work From Home
Posted:

AWS Data Engineer

City of London, London, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
… Lambda, EMR)
- Strong communication skills and a collaborative mindset
- Comfortable working in Agile environments and engaging with stakeholders

Bonus Skills:
- Experience with Apache Iceberg or similar table formats (e.g., Delta Lake, Hudi) (see the sketch after this listing)
- Exposure to CI/CD tools like GitHub Actions, GitLab CI, or Jenkins
- Familiarity with data quality frameworks such as Great Expectations or Deequ
- Interest in …
Employment Type: Permanent
Salary: £85,000 - £95,000 per annum
Posted:
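
The bonus skills above mention Apache Iceberg alongside Delta Lake and Hudi. As a hedged sketch only, the snippet below shows one way PySpark can be pointed at an Iceberg table through the AWS Glue catalog; the catalog name, bucket, and table are hypothetical, and it assumes the Iceberg Spark runtime and AWS bundle jars are already available on the cluster.

```python
from pyspark.sql import SparkSession

# Minimal sketch: register an Iceberg catalog backed by AWS Glue and write a
# table with the DataFrameWriterV2 API. Catalog name, bucket, and table names
# are hypothetical placeholders; required Iceberg jars are assumed present.
spark = (
    SparkSession.builder.appName("iceberg-demo")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue.warehouse", "s3://example-bucket/warehouse")
    .config("spark.sql.catalog.glue.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    .getOrCreate()
)

events = spark.read.parquet("s3://example-bucket/raw/events/")

# create (or replace) the Iceberg table in the Glue-backed catalog
events.writeTo("glue.analytics.events").using("iceberg").createOrReplace()
```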

Senior DevOps/Platform Engineer

Herefordshire, West Midlands, United Kingdom
Hybrid / WFH Options
IO Associates
… and maintain platform software, libraries, and dependencies. Set up and manage Spark clusters, including migrations to new platforms. Manage user accounts and permissions across identity platforms. Maintain the Delta Lake and ensure platform-wide security standards (see the sketch after this listing). Collaborate with the wider team to advise on system design and delivery.
What we're looking for: Strong Linux engineering …
Employment Type: Permanent
Posted:
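
"Maintain the Delta Lake" in the listing above typically includes routine table maintenance. Below is a minimal sketch of scheduled compaction and snapshot clean-up; the table path is a hypothetical placeholder, and the optimize() API assumes a recent Delta Lake or Databricks runtime.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Minimal sketch of routine Delta Lake maintenance a platform team might schedule.
# The table path is a hypothetical placeholder; optimize() requires Delta Lake 2.0+.
spark = SparkSession.builder.appName("delta-maintenance").getOrCreate()

table = DeltaTable.forPath(spark, "/mnt/lake/silver/events")

# compact many small files into fewer, larger ones
table.optimize().executeCompaction()

# remove data files no longer referenced by table history (7-day retention shown)
table.vacuum(retentionHours=168)
```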

PySpark Developer - 100% remote - £615 inside IR35

London, United Kingdom
Hybrid / WFH Options
Exalto Consulting ltd
£615 per day, 100% remote working, and an initial contract until the end of 2025. Key Duties and Responsibilities: Develop and optimise PySpark batch pipelines that process Parquet data and use Delta Lake for all IO, applying validation, …
Employment Type: Contract
Rate: GBP 130,446 Annual
Posted:
Delta Lake salary percentiles
10th Percentile: £55,000
25th Percentile: £58,125
Median: £65,000
75th Percentile: £88,250
90th Percentile: £94,750