6 of 6 Spark SQL Jobs in London

SC Cleared Senior Data Engineer

Hiring Organisation
Sanderson Government and Defence
Location
London, United Kingdom
Employment Type
Contract
Contract Rate
£500 - £540 per day
operate scalable data pipelines using Microsoft Fabric (OneLake/Delta Lake, Data Factory, Synapse Data Engineering). Develop batch processing solutions using PySpark, Spark SQL, Python, and SQL, with a focus on performance, resilience, and data quality. Support reporting and MI use cases, including … equivalent. Active SC clearance (BPSS minimum). Strong hands-on experience with Microsoft Fabric and Azure data services. Advanced skills in PySpark, Spark SQL, Python, and SQL. Experience delivering data engineering solutions in government or similarly regulated environments. CI/CD, DevOps ...

Data Engineer

Hiring Organisation
Chambers and Partners
Location
London, South East, England, United Kingdom
Employment Type
Full-Time
Salary
Competitive salary
combat travel-based issues and advise on expected interview topics/questions. Main Duties and Responsibilities Write clean and testable code using SQL and Python scripting languages, to enable our customer data products and business applications Build and manage data pipelines and notebooks, deploying code in a structured … Promote an innovative thinking process and encourage it in others. Working within the agile framework at Chambers Skills and Experience Strong proficiency in SQL, including Spark SQL and MS SQL Server, for querying, data manipulation, and performance optimization. Hands-on experience with ...

Lead Data Platform Engineer

Hiring Organisation
Harnham - Data & Analytics Recruitment
Location
London, South East, England, United Kingdom
Employment Type
Full-Time
Salary
£80,000 - £85,000 per annum
platform operations.
* Architect and design end-to-end data platform solutions focused on scalability, security and reliability.
* Lead delivery using Databricks, PySpark, Spark SQL and Azure Data Factory.
* Drive DevOps best practices including CI/CD, automated testing and infrastructure as code.
* Collaborate closely with data …
Skills and Experience You will bring:
* Strong commercial experience in Azure and Databricks, with deep knowledge of Lakehouse architecture.
* Proficiency in PySpark, Python, SQL and Spark SQL.
* Strong DevOps engineering skills including CI/CD, automated testing and deployment pipelines.
* Experience with monitoring, logging and alerting ...

Data Engineer TV Advertising Data (FAST)

Hiring Organisation
Datatech Analytics
Location
London, United Kingdom
Employment Type
Permanent
Salary
£85,000
Experience 5+ years' experience working as a Data Engineer or in a similar role Proven experience with cloud-based data platforms (Azure, AWS, SQL, Snowflake, Springserv); Microsoft Fabric experience is a strong plus Strong proficiency in Spark SQL and PySpark, including complex transformations Experience ...

Databricks Engineer

Hiring Organisation
LUMORA SOLUTIONS
Location
London, South East, England, United Kingdom
Employment Type
Full-Time
Salary
£75,000 - £90,000 per annum
role, focused on building, optimising, and operating Databricks-based Lakehouse solutions across cloud environments. Responsibilities: Design, build, and maintain data pipelines using Databricks (Spark, Delta Lake). Develop batch and streaming workloads using PySpark/Spark SQL. Implement data ingestion, transformation, and orchestration pipelines. Optimise Databricks … Databricks with cloud-native services across Azure, AWS, or GCP. Skills: Strong hands-on expertise with Databricks in production environments. Advanced PySpark and Spark SQL development experience. Experience building lakehouse architectures using Delta Lake. Strong cloud experience in Azure (preferred), AWS, or GCP. Experience with orchestration ...

PySpark Developer

Hiring Organisation
Randstad Digital
Location
London, United Kingdom
Employment Type
Contract, Work From Home
Contract Rate
£300 - £350 per day
SAS2PY) and manual refactoring. Pipeline Engineering: Design, build, and troubleshoot complex ETL/ELT workflows and data marts on AWS. Performance Tuning: Optimise Spark workloads for execution efficiency, partitioning, and cost-effectiveness. Quality Assurance: Implement clean coding principles, modular design, and robust unit/comparative testing to ensure …/CD integration, and comprehensive technical documentation. Technical Requirements PySpark (P3): 5+ years of hands-on experience writing scalable, production-grade PySpark/Spark SQL. AWS Data Stack (P3): Strong proficiency in EMR, Glue, S3, Athena, and Glue Workflows. SAS Knowledge (P1): Solid foundation in SAS to enable ...