Data Engineer DV Cleared
Data Engineer opportunity, DV-cleared candidates only.
London | Manchester | Bristol
A progressive, leading-edge UK consulting and technology organisation is hiring Data Engineers to deliver mission-critical work across defence and security programmes, building modern data platforms and production-grade pipelines that enable better decisions at pace. Active DV clearance is essential; we are seeking DV-cleared candidates only.
The role
You'll design and deploy production-grade data pipelines, from ingestion through to consumption, within a modern big data architecture. Work is delivery-focused and follows agile engineering practices.
Typical responsibilities
Build and operate robust pipelines across ingestion, processing, and consumption
Use scripting, APIs, and SQL to extract, transform, and curate data
Process large structured and unstructured datasets, integrating multiple sources
Collaborate with stakeholders and delivery teams to drive outcomes
Core skills (indicative)
Production pipeline design and deployment experience
Strong engineering capability with Python and SQL, plus big data tooling (e.g., Spark, and Java/Scala where relevant)
Experience with at least one major cloud platform (AWS, Azure, or GCP)
Working pattern
Hybrid working, with the team on client site or in the office a minimum of two days per week. Actual time and location will vary by role or assignment.