Hadoop Developer (Spark/Scala)
- Hiring Organisation
- Capgemini
- Location
- City of London, London, United Kingdom
About the Job you are considering:
This role involves designing, building, and supporting large-scale Big Data solutions using Hadoop and Spark technologies. You will primarily develop and debug Spark jobs in Scala, with opportunities to use Java and Python where appropriate. The position focuses on creating scalable … data pipelines, optimising distributed processing, and supporting analytics and data science teams. It also offers exposure to cloud-based Big Data platforms and modern data engineering practices in a fast-paced, data-driven environment.

Hybrid working: The places that you work from ...