3 of 3 Remote/Hybrid Contract Spark SQL Jobs

Spark Scala Developer

Hiring Organisation
Infoplus Technologies UK Ltd
Location
London, United Kingdom
Employment Type
Contract, Work From Home
Contract Rate
Up to £450 per day
Eligibility: British Citizenship, ILR (Indefinite Leave to Remain), or UK Settlement Visa. We do not provide visa sponsorship.
Key Requirements
Must-have skills:
- Spark & Scala
Nice-to-have skills:
- Spark Streaming
- Hadoop
- Hive
- SQL
- Sqoop
- Impala
Detailed Job Description
8+ years … fundamental data structures and their usage. 8+ years of experience designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies. Expertise in Spark Core, Spark SQL, and Spark Streaming. Experience with ...
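The role centres on Spark SQL over large datasets. As a rough illustration of the kind of query involved, here is a minimal sketch: the SQL is valid Spark SQL, but for portability it is run through Python's built-in sqlite3 instead of a Spark session, and the table, columns, and data are invented for the example.

```python
# Illustrative only: the query below is the sort you would pass to
# spark.sql(...) on a cluster; sqlite3 stands in so the sketch is
# self-contained. Table and column names are made up.
import sqlite3

events = [
    ("2024-01-01", "click", 3),
    ("2024-01-01", "view", 10),
    ("2024-01-02", "click", 5),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, action TEXT, cnt INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", events)

# Valid in both SQLite and Spark SQL dialects.
query = """
    SELECT day, SUM(cnt) AS total
    FROM events
    GROUP BY day
    ORDER BY day
"""
for day, total in conn.execute(query):
    print(day, total)
```

On a real cluster the same statement would run against a registered DataFrame or Delta table rather than an in-memory SQLite database.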

Data Engineer - Databricks

Hiring Organisation
TXP Technology x People
Location
London, South East, England, United Kingdom
Employment Type
Contractor
Contract Rate
£0 per annum
data services across the organisation.
Key Responsibilities
- Design, build, and maintain data pipelines using Databricks
- Develop ETL/ELT processes with PySpark and Spark SQL
- Transform and model structured and semi-structured datasets
- Improve performance, reliability, and cost efficiency of data workloads
- Ensure compliance with data … routes
Required Skills & Experience
- Active SC Clearance: mandatory and non-negotiable
- Strong hands-on experience with Databricks in production environments
- Expert knowledge of Apache Spark, including PySpark and Spark SQL
- Proficiency with Python for data engineering
- Experience delivering solutions on cloud data platforms ...
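One responsibility above is transforming semi-structured datasets. As a hedged sketch of that step, the snippet below flattens nested JSON into flat rows, which is what a PySpark job would typically do with `explode()` and `select()` on a DataFrame; it is written in plain Python so it runs without a cluster, and the record shape and field names are invented.

```python
# Illustrative flattening of semi-structured records (invented schema).
# In Databricks the equivalent would be a DataFrame transform using
# explode() on the nested "orders" array.
import json

raw = """
[{"user": "a", "orders": [{"sku": "x", "qty": 2}, {"sku": "y", "qty": 1}]},
 {"user": "b", "orders": [{"sku": "x", "qty": 5}]}]
"""

# One output row per nested order, keyed by the parent user.
rows = [
    {"user": rec["user"], "sku": o["sku"], "qty": o["qty"]}
    for rec in json.loads(raw)
    for o in rec["orders"]
]
print(rows)
```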

PySpark Developer

Hiring Organisation
Randstad Digital
Location
London, United Kingdom
Employment Type
Contract, Work From Home
Contract Rate
£300 - £350 per day
SAS2PY) and manual refactoring.
- Pipeline Engineering: Design, build, and troubleshoot complex ETL/ELT workflows and data marts on AWS.
- Performance Tuning: Optimise Spark workloads for execution efficiency, partitioning, and cost-effectiveness.
- Quality Assurance: Implement clean coding principles, modular design, and robust unit/comparative testing to ensure …/CD integration, and comprehensive technical documentation.
Technical Requirements
- PySpark (P3): 5+ years of hands-on experience writing scalable, production-grade PySpark/Spark SQL.
- AWS Data Stack (P3): Strong proficiency in EMR, Glue, S3, Athena, and Glue Workflows.
- SAS Knowledge (P1): Solid foundation in SAS to enable ...
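The performance-tuning duties here revolve around partitioning. A quick way to see why it matters: Spark assigns rows to partitions by key hash, so a skewed key concentrates work on one executor. The sketch below is a pure-Python stand-in for that hash-partitioning behaviour (the country-code data is invented for illustration).

```python
# Illustrative only: a hash-mod partitioner similar in spirit to Spark's
# HashPartitioner, showing how a skewed key produces an unbalanced load.
from collections import Counter

def partition_of(key, num_partitions):
    # Assign a row to a partition by key hash, as Spark does for
    # repartition(n, col) / shuffle operations.
    return hash(key) % num_partitions

keys = ["gb"] * 90 + ["fr"] * 5 + ["de"] * 5   # heavily skewed toward "gb"
sizes = Counter(partition_of(k, 4) for k in keys)
print(sizes)  # one partition carries at least the 90 "gb" rows
```

All rows sharing a key land in the same partition, so whichever partition receives "gb" holds at least 90 of the 100 rows; techniques like salting or repartitioning by a finer-grained key are the usual remedies.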