Java / PySpark / Python Developer

Location: UK (Hybrid / Remote within UK)

Type: Contract / Full-Time

Role Overview:

We are seeking an experienced developer with strong Java, PySpark, and Python skills to design, build, and optimize data-driven applications and batch/streaming data pipelines.

Key Responsibilities:

• Design, develop, and maintain backend services and applications using Java and Python.

• Build and optimize scalable data processing pipelines using PySpark (batch and/or streaming).

• Implement robust ETL/ELT workflows to ingest, transform, and validate large datasets.

• Write clean, testable, and efficient code; perform unit/integration testing and support CI/CD practices.

• Monitor, troubleshoot, and improve performance of applications and Spark jobs.

• Collaborate with data engineers, analysts, and product teams to translate requirements into technical solutions.

• Contribute to code reviews and technical documentation, and help uphold coding standards.

Required Skills and Experience:

• Strong programming skills in Java (e.g., collections, multithreading, exception handling, REST APIs).

• Hands-on experience with PySpark for distributed data processing and performance tuning.

• Solid Python development experience, including working with common libraries for data processing and scripting.

• Good understanding of relational databases and SQL; familiarity with schema design and query optimization.

• Experience with version control (Git) and working in Agile environments.

• Strong problem-solving skills and the ability to debug complex data and performance issues.

Nice-to-have:

• Experience with big data ecosystems (e.g., Hadoop, Kafka, cloud data platforms).

• Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).

• Exposure to data warehousing, BI, or machine learning workloads.

Job Details

Company: UNITECH
Location: London, UK (Hybrid / Remote options)
Employment Type: Full-time
Posted: