4 of 4 Contract PySpark Jobs in Central London

Data Engineer

Hiring Organisation
Experis
Location
City of London, London, United Kingdom
Employment Type
Contract
Contract Rate
£500 - £510/day
… basis, clarifying ownership and integration points, and capturing this information in a way that supports risk assessment and decommissioning decisions. Responsibilities: Python and PySpark are required as supporting capabilities, used where needed to analyse data pipelines and confirm how data moves and transforms in practice. The role also requires …

Data Analyst

Hiring Organisation
Experis
Location
City of London, London, United Kingdom
Employment Type
Contract
Contract Rate
£500 - £510/day
… basis, clarifying ownership and integration points, and capturing this information in a way that supports risk assessment and decommissioning decisions. Responsibilities: Python and PySpark are required as supporting capabilities, used where needed to analyse data pipelines and confirm how data moves and transforms in practice. The role also requires …

Azure Data Solution Designer

Hiring Organisation
DCV Technologies Limited
Location
Central London, London, United Kingdom
Employment Type
Contract
Requirements:
- Solid understanding of data architecture principles (data modelling, lineage, metadata)
- Experience designing enterprise data platforms and frameworks
- Hands-on experience with Python and Spark (PySpark preferred)
- Experience working in Agile environments with distributed teams
- Strong problem-solving and solution design capabilities
- Excellent communication and stakeholder management skills
- Nice …

Lead PySpark Engineer

Hiring Organisation
Randstad Technologies Recruitment
Location
City of London, London, United Kingdom
Employment Type
Contract
Contract Rate
£281 - £292/day
As the Technical Lead, you will drive the high-stakes migration of legacy SAS analytics to a modern, cloud-native PySpark ecosystem on AWS. This isn't just a lift and shift: you will refactor complex procedural logic into scalable, production-ready distributed pipelines … Tier-1 financial services environment.

Core Responsibilities:
- Engineering Leadership: Design and develop complex ETL/ELT pipelines and Data Marts using PySpark, EMR, and Glue.
- Legacy Modernisation: Architect the conversion of SAS Base/Macros into modular, testable Python code using SAS2PY and manual refactoring.
- Performance Tuning: Optimise Spark …