3 of 3 Contract Spark SQL Jobs in the UK

SC Cleared Senior Data Engineer

Hiring Organisation
Sanderson Government and Defence
Location
London, United Kingdom
Employment Type
Contract
Contract Rate
£500 - £540 per day
… operate scalable data pipelines using Microsoft Fabric (OneLake/Delta Lake, Data Factory, Synapse Data Engineering).
Develop batch processing solutions using PySpark, Spark SQL, Python, and SQL, with a focus on performance, resilience, and data quality.
Support reporting and MI use cases, including …
… equivalent.
Active SC clearance (BPSS minimum).
Strong hands-on experience with Microsoft Fabric and Azure data services.
Advanced skills in PySpark, Spark SQL, Python, and SQL.
Experience delivering data engineering solutions in government or similarly regulated environments.
CI/CD, DevOps …
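
By way of illustration, a minimal PySpark/Spark SQL sketch of the kind of Delta Lake batch curation job this role describes. Every path, table, and column name here (raw events, event_id, and so on) is an invented placeholder, not anything from the posting, and it assumes a runtime with Delta support such as Fabric or Databricks.

from pyspark.sql import SparkSession

# Hypothetical sketch: all names and paths below are placeholders.
spark = (
    SparkSession.builder
    .appName("batch-curation-sketch")
    .getOrCreate()
)

# Read a raw batch (e.g. landed by Data Factory into OneLake).
raw = spark.read.format("delta").load("/lakehouse/raw/events")

# Data-quality gate: drop rows missing mandatory keys, dedupe on the business key.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .dropDuplicates(["event_id"])
)

# Expose a Spark SQL view for MI/reporting-style aggregation.
clean.createOrReplaceTempView("events_clean")
daily = spark.sql("""
    SELECT CAST(event_ts AS DATE) AS event_date,
           event_type,
           COUNT(*)               AS event_count
    FROM events_clean
    GROUP BY CAST(event_ts AS DATE), event_type
""")

# Write the curated output as Delta, partitioned for downstream reads.
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("event_date")
      .save("/lakehouse/curated/daily_event_counts"))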

Fabric and Databricks Data Engineer - Outside IR35 - Hybrid

Hiring Organisation
Tenth Revolution Group
Location
Oxfordshire, England, United Kingdom
Employment Type
Contractor
Contract Rate
£500 - £600 per day
… using Delta Lake principles
Ingest, transform, and curate data from multiple sources (APIs, databases, files, streaming)
Develop scalable data transformations using PySpark and Spark SQL
Implement data models optimized for analytics and reporting (e.g. star schemas)
Monitor, troubleshoot, and optimize performance and cost of data workloads
…
Experience
Strong experience with Microsoft Fabric (Lakehouse, Pipelines, Notebooks, Dataflows, OneLake)
Hands-on experience with Databricks in production environments
Proficiency in PySpark and SQL
Solid understanding of data engineering concepts (ETL/ELT, orchestration, partitioning)
Experience working with Delta Lake
Familiarity with cloud platforms (Azure preferred)
Experience integrating …
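
A hedged sketch of the star-schema modelling this posting asks for: deriving a customer dimension and a sales fact from a curated Delta source. All table and column names (sales, dim_customer, fact_sales) are assumptions made for illustration only.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# Curated source table; the name and path are assumptions for this sketch.
sales = spark.read.format("delta").load("/lakehouse/curated/sales")

# Dimension: one row per customer, with a surrogate key.
dim_customer = (
    sales.select("customer_id", "customer_name", "region")
         .dropDuplicates(["customer_id"])
         .withColumn("customer_key", F.monotonically_increasing_id())
)

# Fact: measures plus foreign keys into the dimension.
fact_sales = (
    sales.join(dim_customer.select("customer_id", "customer_key"), "customer_id")
         .select(
             "customer_key",
             F.to_date("order_ts").alias("date_key"),
             "quantity",
             "net_amount",
         )
)

# Persist both sides of the star as Delta tables for reporting tools.
dim_customer.write.format("delta").mode("overwrite").save("/lakehouse/model/dim_customer")
fact_sales.write.format("delta").mode("overwrite").save("/lakehouse/model/fact_sales")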

PySpark Developer

Hiring Organisation
Randstad Digital
Location
London, United Kingdom
Employment Type
Contract, Work From Home
Contract Rate
£300 - £350 per day
… SAS2PY) and manual refactoring.
Pipeline Engineering: Design, build, and troubleshoot complex ETL/ELT workflows and data marts on AWS.
Performance Tuning: Optimise Spark workloads for execution efficiency, partitioning, and cost-effectiveness.
Quality Assurance: Implement clean coding principles, modular design, and robust unit/comparative testing to ensure … CI/CD integration, and comprehensive technical documentation.
Technical Requirements
PySpark (P3): 5+ years of hands-on experience writing scalable, production-grade PySpark/Spark SQL.
AWS Data Stack (P3): Strong proficiency in EMR, Glue, S3, Athena, and Glue Workflows.
SAS Knowledge (P1): Solid foundation in SAS to enable …
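
A short, hedged example of the tuning levers this role covers: enabling adaptive query execution, broadcasting a small lookup table to avoid a large shuffle, and partitioning output so Athena can prune files. Bucket paths, table names, and columns are hypothetical, not taken from the posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("tuning-sketch")
    # Adaptive query execution lets Spark right-size shuffle partitions at runtime.
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)

# Hypothetical inputs on S3; paths and schemas are placeholders.
transactions = spark.read.parquet("s3://example-bucket/transactions/")
branches = spark.read.parquet("s3://example-bucket/branches/")  # small lookup table

# Broadcast the small side so the large table is joined without a shuffle.
enriched = transactions.join(F.broadcast(branches), "branch_id")

# Partition output by date so downstream Athena queries prune whole files.
(enriched.write.mode("overwrite")
         .partitionBy("txn_date")
         .parquet("s3://example-bucket/marts/transactions_enriched/"))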