To be successful, you will have the following experience:
- Extensive AI & data development background
- Experience with Python (including data libraries such as Pandas, NumPy, and PySpark) and Apache Spark (PySpark preferred)
- Strong experience with data management and processing pipelines
- Algorithm development and knowledge of graphs will be beneficial
- SC Clearance is essential

Within this role, you … will be responsible for:
- Supporting the development and delivery of AI solutions to a Government customer
- Designing, developing, and maintaining data processing pipelines using Apache Spark
- Implementing ETL/ELT workflows to extract, transform, and load large-scale datasets efficiently
- Developing and optimizing Python-based applications for data ingestion
- Collaborating on the development of machine learning models
- Ensuring data … to the design of data architectures, storage strategies, and processing frameworks
- Working with cloud data platforms (e.g., AWS, Azure, or GCP) to deploy scalable solutions
- Monitoring, troubleshooting, and optimizing Spark jobs for performance and cost efficiency
- Liaising with customer and internal stakeholders on a regular basis

This represents an excellent opportunity to secure a long-term contract, within a …
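The ETL workflow described in the responsibilities above can be sketched in miniature. This is a hedged illustration only: it uses pandas in place of a full Spark cluster, and the dataset, column names, and aggregation rule are all invented for the example.

```python
import io
import pandas as pd

# Extract: a real pipeline would read from a data lake or database;
# an in-memory CSV stands in for the source here (invented sample data).
RAW_CSV = """record_id,region,value
1,north,10.5
2,south,3.2
3,north,7.8
4,east,
5,south,12.1
"""

def extract(source) -> pd.DataFrame:
    """Read the raw source into a DataFrame."""
    return pd.read_csv(source)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with missing values, then aggregate totals per region."""
    clean = df.dropna(subset=["value"])
    return (clean.groupby("region", as_index=False)["value"]
                 .sum()
                 .rename(columns={"value": "total_value"}))

def load(df: pd.DataFrame) -> str:
    """Serialise the result; a real job would write Parquet to cloud
    storage or a warehouse rather than returning CSV text."""
    return df.to_csv(index=False)

result = load(transform(extract(io.StringIO(RAW_CSV))))
print(result)
```

The same extract/transform/load shape carries over to PySpark, where `extract` would become `spark.read`, `transform` a chain of DataFrame operations, and `load` a `DataFrame.write` call.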
Bromley, Kent, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
experience with AWS data platforms and related services. Solid grasp of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies such as Spark and Hadoop, and distributed computing concepts. Proficiency in SQL and at least one programming language (e.g., Python, Java). Preferred Qualifications: Relevant certifications in data architecture, cloud platforms, or …
scripting (Python, Bash) and programming (Java). Hands-on experience with DevOps tools: GitLab, Ansible, Prometheus, Grafana, Nagios, Argo CD, Rancher, Harbor. Deep understanding of big data technologies: Hadoop, Spark, and NoSQL databases. Nice to Have: Familiarity with agile methodologies (Scrum or Kanban). Strong problem-solving skills and a collaborative working style. Excellent communication skills, with the ability …
with AWS data platforms and their respective data services. Solid understanding of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies (e.g., Spark, Hadoop) and distributed computing. Proficiency in SQL and at least one programming language (e.g., Python, Java). 6 Month Contract, Inside IR35. Immediately available. London, up to 2 times a …
ensure data integrity and reliability. Optimise data workflows for performance, cost-efficiency, and maintainability using tools such as Azure Data Factory, AWS Data Pipeline for data orchestration, Databricks, or Apache Spark. Support the integration of data into visualisation platforms (e.g. Power BI, ServiceNow) and other analytical environments. Ensure compliance with data governance, security, and privacy policies. Document data architecture …
and ETL/ELT processes. Proficiency in AWS data platforms and services. Solid understanding of data governance principles (data quality, metadata, access control). Familiarity with big data technologies (Spark, Hadoop) and distributed computing. Advanced SQL skills and proficiency in at least one programming language (Python, Java). Additional Requirements: Immediate availability for an October start. Must be UK …
Saffron Walden, Essex, South East, United Kingdom Hybrid / WFH Options
EMBL-EBI
advantageous. Good communication skills. Experience in Python and/or Java development. Experience with git and basic Unix commands. You may also have: Experience with large data processing technologies (Apache Spark). Other helpful information: Hybrid Working: At EMBL-EBI we are pleased to offer hybrid working options for all our employees. Our team works at least two days …
AWS) to join a contract till April 2026. Inside IR35. SC cleared. Weekly travel to Newcastle. Around £400 per day. Contract till April 2026. Skills: Python, AWS Services, Terraform, Apache Spark, Airflow, Docker
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd
be on designing and maintaining the data pipelines that feed large-scale ML and research workflows. Day-to-day responsibilities include: Building and maintaining data pipelines using Python, SQL, Spark, and Google Cloud technologies (BigQuery, Cloud Storage). Ensuring pipelines are robust, reliable, and optimised for AI/ML use cases. Developing automated tests, documentation, and monitoring for production … best practices, and continuously improving performance and quality. Tech Stack & Skills Core Skills: Strong experience with Python and SQL in production environments Proven track record developing data pipelines using Spark, BigQuery, and cloud tools (preferably Google Cloud) Familiarity with CI/CD and version control (git, GitHub, DevOps workflows) Experience with unit testing (e.g., pytest) and automated quality checks …
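The "automated tests and quality checks" mentioned in the last listing often take the shape of small validation functions run under pytest before a batch is published. The sketch below is a hedged illustration: the column names, the sample batch, and the rules (`no nulls`, `unique key`) are invented for the example, not taken from any of the roles above.

```python
import pandas as pd

def check_no_nulls(df: pd.DataFrame, columns: list) -> list:
    """Return the subset of `columns` that contain null values."""
    return [c for c in columns if df[c].isna().any()]

def check_unique_key(df: pd.DataFrame, key: str) -> bool:
    """True if `key` uniquely identifies each row in the batch."""
    return bool(df[key].is_unique)

# A small invented batch of event data to validate.
batch = pd.DataFrame({
    "user_id": [1, 2, 3],
    "event": ["click", "view", None],
})

def test_batch_quality():
    # pytest discovers and runs functions named test_*; here we also
    # call it directly so the script is self-contained.
    assert check_unique_key(batch, "user_id")
    assert check_no_nulls(batch, ["user_id", "event"]) == ["event"]

test_batch_quality()
```

Checks like these are typically wired into CI (the git/GitHub workflows the listing mentions) so a failing batch blocks the pipeline rather than reaching downstream ML consumers.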