Data Engineer
£50-58k basic + 5% bonus
Hybrid | Manchester
A growing organisation is seeking a skilled and motivated Data Engineer to join its data platform team and help build the next generation of its Azure and Microsoft Fabric‐based data estate. This role is ideal for an engineer who enjoys designing robust pipelines, shaping modern data layers, and working closely with analysts, architects, and business stakeholders to deliver high‐quality, governed data products.
If you want to work with cutting‐edge Azure technologies, contribute to a modern medallion‐based architecture, and play a key role in developing scalable, reliable data solutions, this is an excellent opportunity.
The Opportunity
You will design, build, and operate data pipelines and platform components that power analytics, reporting, and operational systems across the organisation. Working within a modern cloud environment centred on Microsoft Fabric, you will help evolve the data platform, improve data quality, and support the delivery of trusted, well‐structured data products.
Benefits
- Basic salary £50-58k
- A discretionary bonus scheme based on company and personal performance (defined each year, usually around 5%)
- A hybrid work arrangement – 3 days a week in the Manchester office
- A pension scheme with 5% matched employer contributions
- Company share scheme – once a year you can buy shares at a discounted rate, paying monthly over 3 years, then cash them in or take your saved money back
- Life assurance at 4x salary
- 33 days' holiday (including bank holidays)
- Employee Assistance Programme (EAP)
- Benefits portal offering discounts with many leading brands
Key Responsibilities
Data Pipeline and Platform Engineering
- Design, build, and maintain scalable ELT/ETL pipelines using Azure Data Factory and Microsoft Fabric.
- Implement and maintain medallion architecture (Bronze, Silver, Gold) across the data platform.
- Engineer reliable ingestion from APIs, databases, flat files, and event streams.
- Monitor pipeline performance, troubleshoot failures, and implement observability and alerting practices.
Master Data and Data Quality
- Support the design and operation of master data management frameworks.
- Apply data quality rules, validation logic, and deduplication processes.
- Work with business teams to define canonical entities, hierarchies, and golden records.
Azure and Cloud Platform Engineering
- Work hands‐on with Azure services including ADLS, Azure SQL, ADF, and Event Hubs.
- Manage storage layers including Delta Lake and OneLake within Microsoft Fabric.
- Contribute to infrastructure‐as‐code practices and environment provisioning.
Integrations and APIs
- Design and maintain integration patterns across internal systems and third‐party platforms.
- Build and consume REST APIs and event‐driven integrations to support near real‐time data flows.
- Collaborate with software engineers and architects to ensure secure, scalable integrations.
Required Skills and Experience
- Solid hands‐on experience in data engineering, including pipelines, ETL/ELT, and data modelling.
- Proficiency with Azure Data Factory for orchestration and data movement.
- Experience with or strong interest in Microsoft Fabric and its lakehouse and pipeline capabilities.
- Practical understanding of medallion architecture and structured data layers.
- Familiarity with master data management concepts.
- Strong skills in Python and/or SQL for transformation, automation, and quality checks.
- Experience integrating with APIs and designing robust data integration patterns.
- Good understanding of Azure services including ADLS, Azure SQL, Key Vault, and Event Hubs.
Desirable Skills
- Experience with Fabric Lakehouses, Warehouses, and Data Pipelines.
- Exposure to streaming or event‐driven patterns such as Event Hubs or Kafka.
- Familiarity with DevOps, CI/CD, Git, and Azure DevOps.
- Knowledge of data governance, cataloguing, or Microsoft Purview.
- Background in high‐volume operational or analytical environments.