Senior Data Engineer

Senior Data Engineer | Manchester (Hybrid) | Digital Transformation Consultancy

Location: Manchester (Hybrid - 3 days a week)

Package: £60k-£70k + Benefits

Python | Cloud (AWS/Azure/GCP) | Kafka | Airflow | PostgreSQL | Observability

This client builds mission‐critical digital platforms that make the UK smarter, safer, greener and healthier. They are a growing digital transformation consultancy delivering enterprise‐scale data services across public and regulated sectors — civil defence, healthcare, environment and land asset management.

This is a hands‐on Senior Data Engineer role where you’ll design, build and operate resilient data platforms and pipelines for high‐profile, societally important programmes. You’ll work in multi‐disciplinary teams, influence technical direction, and help shape solutions that are secure, observable and production‐ready.

ROLE: Senior Data Engineer

You’ll be responsible for delivering end-to-end data services on greenfield and transformation programmes. Expect to own core components of platforms, collaborate with data scientists and delivery leads, and mentor more junior engineers.

What You’ll Be Doing

  • Design and implement scalable ETL/ELT pipelines and data models for enterprise platforms
  • Build and operate event‐driven architectures (Kafka) and orchestrate workflows (Airflow)
  • Implement data quality, testing and validation as first‐class concerns across pipelines
  • Deliver monitoring and observability (Prometheus, Grafana or equivalent) and runbook automation
  • Work with relational (Postgres/SQL Server) and NoSQL stores; implement metadata and standards (geospatial metadata experience desirable)
  • Translate technical designs into robust implementations and clear documentation for stakeholders
  • Mentor colleagues and contribute to continuous improvement and best practices

What You’ll Bring

  • 5+ years in data engineering building production‐grade data platforms
  • Strong Python and SQL skills; experience with cloud data services (AWS/Azure/GCP)
  • Hands‐on Kafka (or equivalent streaming) and workflow orchestration (Airflow) experience
  • Proven track record implementing data quality, monitoring and observability practices
  • Comfortable designing data architecture patterns (data lakes, warehouses, event‐driven systems)
  • Experience working in Agile teams, stakeholder engagement and technical mentorship

Apply now to help shape resilient data platforms that power public‐facing services.

Job Details

Company: Loop Recruitment
Location: Manchester Area, United Kingdom (Hybrid / Remote Options)