Data Architect - Bristol - Hybrid

Your new company

They are a specialist insurance and risk solutions provider, supporting clients with tailored coverage and expert advice across a range of sectors. The business is known for its client-focused approach, strong market relationships and commitment to delivering practical, dependable solutions.

With a collaborative culture and a focus on professional development, they offer a supportive environment where people are trusted, valued and encouraged to grow their careers within a forward-thinking organisation.

Your new role

As a Data Architect, you'll play a key role in shaping how data is designed, managed and used across the business. You'll set the architectural direction for the data estate - from the point data first lands on the platform, through the Bronze, Silver and Gold layers of the Medallion Architecture, and all the way to analytics, AI and self-service reporting.

Working within the Microsoft Azure and Databricks ecosystem, you'll help build a data platform that's scalable, flexible and built to last. Your work will directly support high-impact use cases, including advanced analytics, pricing models, AI/ML solutions and regulatory reporting - ensuring teams across the business can trust and use data with confidence.

Data Architecture & Modelling

  • Define and own the architectural principles, standards and policies governing SBG's data estate from the landing zone through to the Gold layer.
  • Design and govern the Medallion Architecture (Bronze / Silver / Gold), ensuring every layer is built for analytics, AI/ML and self-service consumption.
  • Own data modelling standards - conceptual, logical and physical - and ensure models are fit for both regulatory reporting and AI-driven insight.
  • Define Unity Catalog structure, metadata standards and data lineage governance across the estate.

Data Ingestion & Processing

  • Define ingestion standards and data contracts for data arriving from the landing zone into the Bronze layer, working in partnership with the Development and Application Management team.
  • Design and optimise ETL/ELT pipeline frameworks using Databricks, Delta Lake and Azure Data Factory.
  • Ensure Silver and Gold layer data products are fit for purpose for analytics, pricing, AI and ML model consumption.
  • Optimise data pipelines for efficiency, cost-effectiveness and high performance, leveraging Databricks for big data processing and machine learning.

Governance & Standards

  • Act as the architectural authority for the data estate - reviewing designs, enforcing standards and preventing platform fragmentation as SBG scales.
  • Ensure all data architecture decisions align with regulatory requirements - FCA, GDPR, Solvency II, IFRS 17 and BCBS 239.
  • Define and maintain data architecture policies and guidelines ensuring long-term scalability and sustainability.

Analytics & AI Enablement

  • Design the Gold layer to ensure data products are structured, documented and accessible for self-service analytics and AI/ML model consumption.
  • Collaborate with ML Ops and Data Science teams to define data product standards and feature engineering patterns.
  • Evaluate and lead adoption of emerging Azure and Databricks capabilities - including Microsoft Fabric, OneLake and DirectLake - where they advance the data architecture.
  • Drive innovation by evaluating and implementing emerging cloud-based data technologies to enhance SBG's competitive advantage.

What you'll need

  • Strong stakeholder management across business, IT and compliance teams.
  • Excellent communication, collaboration and influencing skills at all levels of an organisation.
  • Experience leading data architecture and engineering teams in an enterprise environment.
  • Ability to define and implement a data strategy aligned with business objectives.
  • Proven track record of delivering enterprise-scale data solutions with a focus on performance, security and scalability.
  • Experience in regulated financial services, ensuring compliance with industry standards.
  • Deep expertise in data modelling - conceptual, logical and physical.
  • Data warehousing and data lake architecture for high-performance analytics.
  • ETL/ELT pipeline development and optimisation to support large-scale data processing.
  • Data integration across structured and unstructured sources, ensuring high availability.
  • Metadata management and governance to maintain data quality and lineage.
  • Experience defining data contracts and ingestion standards between source delivery teams and the data estate.
  • Deep expertise in Microsoft Azure cloud services - ADF, ADLS, Synapse, Purview.
  • Databricks - Delta Lake architecture, optimisation and advanced data processing.
  • Apache Spark for large-scale distributed computing and performance tuning.
  • Microsoft Fabric - OneLake and DirectLake integration.
  • Azure Synapse Analytics for enterprise-scale data warehousing.
  • Infrastructure-as-Code (Terraform or Azure Bicep) to automate cloud deployments.
  • CI/CD pipelines with Azure DevOps or GitHub Actions for automated deployment of data pipelines.
  • MLOps best practices - MLflow, Databricks Model Serving, Feature Store.
  • Knowledge of IFRS 17, BCBS 239, UK Data Protection Act and Solvency II compliance.
  • Experience with pricing models, claims processing and fraud detection in the insurance sector.
  • Strong problem-solving skills and ability to translate business needs into technical solutions.
  • Ability to document and present complex data architectures to technical and non-technical stakeholders.

What you'll get in return

  • Hybrid working - 2 days in the office and 3 days working from home
  • 25 days annual leave, rising to 27 days over 2 years' service and 30 days after 5 years' service. Plus bank holidays!
  • Discretionary annual bonus
  • Pension scheme - 5% employee, 6% employer

And many more.

What you need to do now

If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV.

If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Desired Skills and Experience

Data Architect, Medallion Architecture, Snowflake, Azure & Databricks, AI & ML solutions.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers.

By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at hays.co.uk

Job Details

Company: Hays
Location: Bristol, England, United Kingdom
Workplace: Hybrid