We are seeking a highly experienced Senior Data Modeller to design and maintain enterprise-wide conceptual, logical and physical data models for our organisation. This role is critical to enabling consistent, scalable and business-aligned data structures that support analytics, reporting and operational efficiency. The ideal candidate will have a deep knowledge and understanding of insurance data …

Key Responsibilities:
Develop and maintain the enterprise-wide conceptual data model to provide a high-level view of data domains, entities and their relationships, and logical data models with detailed attributes, data relationships and normalisation rules, ensuring alignment with business rules and objectives
Work closely with business stakeholders to understand data requirements and translate them into structured … data models that meet organisational needs
Ensure conceptual and logical data models are reusable, scalable and adaptable to evolving business demands
Partner with business stakeholders, SMEs and data architects to validate data models and incorporate feedback

Qualifications, Skills & Experience:
10+ years of experience in data modelling, with a strong focus on conceptual and logical …
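To make the conceptual-to-logical-to-physical progression above concrete, here is a minimal, hypothetical sketch in SQLAlchemy: a one-to-many Policy-Claim relationship from a conceptual insurance model, expressed as a normalised logical model and materialised as a physical schema. The entities, attributes and engine URL are illustrative assumptions, not details from the posting.

```python
# Hypothetical sketch of conceptual -> logical -> physical modelling.
# Two insurance entities (Policy, Claim) in a 1-to-many relationship,
# normalised (3NF) and materialised on a target engine. Names are invented.
from sqlalchemy import create_engine, ForeignKey, Numeric, String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship

class Base(DeclarativeBase):
    pass

class Policy(Base):
    __tablename__ = "policy"
    policy_id: Mapped[int] = mapped_column(primary_key=True)
    product_code: Mapped[str] = mapped_column(String(10), index=True)
    annual_premium: Mapped[float] = mapped_column(Numeric(12, 2))
    claims: Mapped[list["Claim"]] = relationship(back_populates="policy")

class Claim(Base):
    __tablename__ = "claim"
    claim_id: Mapped[int] = mapped_column(primary_key=True)
    # A foreign key, not embedded copies of policy attributes: this is how
    # the logical model's normalisation rules reach the physical schema.
    policy_id: Mapped[int] = mapped_column(ForeignKey("policy.policy_id"))
    reserve_amount: Mapped[float] = mapped_column(Numeric(12, 2))
    policy: Mapped["Policy"] = relationship(back_populates="claims")

# Materialise the physical model; any supported engine URL would do here.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
```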
Contract: 6-month initial contract. Rate: £750-900 a day, PAYE or Umbrella.

In this role, you'll design and implement lakehouse architectures, create conceptual-to-physical data models, and drive data governance frameworks. You'll also enable seamless integration across complex systems to support analytics and decision-making capabilities.

Key Responsibilities:
Design enterprise data architectures for … key business functions such as Trade, Sales Ops, and Finance.
Develop target-state architectures and data strategies in line with business needs.
Create and manage conceptual, logical, and physical data models.
Design and implement lakehouse architectures using Databricks and other modern data platforms.
Ensure robust data governance, integration, and quality across systems.
Collaborate with IT and business stakeholders …
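As a flavour of the Databricks lakehouse work this role describes, a minimal sketch of a bronze-to-silver step in a medallion-style architecture; all paths, table names and columns are hypothetical, and it assumes a SparkSession with Delta Lake support as found on Databricks.

```python
# Hypothetical bronze-to-silver step in a medallion lakehouse on Databricks.
# Paths, schemas and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw CSV as-is, preserving source fidelity.
raw = spark.read.option("header", True).csv("/mnt/landing/trades/")
raw.write.format("delta").mode("append").saveAsTable("bronze.trades_raw")

# Silver: typed, de-duplicated, conformed records for downstream consumers.
silver = (
    spark.table("bronze.trades_raw")
    .withColumn("trade_ts", F.to_timestamp("trade_ts"))
    .withColumn("notional", F.col("notional").cast("decimal(18,2)"))
    .dropDuplicates(["trade_id"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.trades")
```

The design point is that the bronze layer stays faithful to the source while all conformance happens on the way to silver, so bad loads can be replayed without re-extracting.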
… AI-first approach to data-driven decision-making.

What You'll Be Doing:

Architecting Advanced Data Solutions
Shape End-to-End Data Architectures: Define conceptual and logical models for data lakes, warehouses, and hybrid deployments (AWS, Azure/Fabric), aligning with key performance and cost objectives.
Embed Automation & Monitoring: Reduce complexity and maximise uptime through DevOps practices … time-to-market while managing terabytes of data daily.

Driving AI & ML Initiatives
Champion LLMOps & MLOps Strategy: Set the architectural framework for building, training, and deploying AI/ML models with automation and regulatory compliance in mind.
Enhance Operational Efficiency: Oversee end-to-end workflows (model improvement, versioning, monitoring) to support real-time personalisation and advanced analytics, focusing on …
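For the model versioning and monitoring workflow mentioned above, a minimal illustrative sketch using MLflow (one common tooling choice, not necessarily this employer's); the experiment, metric and registered model names are placeholders.

```python
# Hypothetical MLOps sketch: train, track and register a model with MLflow so
# deployment and monitoring key off a registry version, not an ad-hoc file.
# All names are illustrative.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=500, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)

mlflow.set_experiment("personalisation-demo")
with mlflow.start_run():
    acc = accuracy_score(y, model.predict(X))
    mlflow.log_metric("train_accuracy", acc)          # tracked for monitoring
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="personalisation_clf",  # creates a new version
    )
```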
Watford, England, United Kingdom Hybrid / WFH Options
Addition+
… from the front.

What You’ll Be Doing:
Designing and developing robust data pipelines to centralise and clean data from multiple sources
Building conceptual and physical data models to support reporting and analysis
Championing data governance and implementing controls to ensure quality and compliance
Developing automated ETL solutions using Microsoft Azure tools
Leading best practice in engineering …

Azure (Data Lake, Synapse, SQL Server)
Comfortable with Python, C#, and working with APIs
Familiar with cloud data platforms (Azure essential; AWS useful)
Confident designing data architecture and building models from scratch
Solid understanding of database management and governance principles
Skilled in Excel, including VBA for analysis and automation
Strong problem-solver, commercially aware, and naturally collaborative

What’s …
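As an illustration of the automated ETL work listed above, a minimal extract-clean-load sketch in Python; the API URL, column names and target database are invented for the example, and the local SQLite target stands in for an Azure SQL or Synapse endpoint.

```python
# Hypothetical ETL step: extract records from a source API, standardise them,
# and load them into a staging table. URL and schema are illustrative.
import pandas as pd
import requests
from sqlalchemy import create_engine

# Extract: pull raw records from a (hypothetical) source system.
resp = requests.get("https://example.com/api/orders", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# Transform: enforce types, trim text, drop exact duplicates.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df["customer_name"] = df["customer_name"].str.strip()
df = df.drop_duplicates(subset=["order_id"]).dropna(subset=["order_date"])

# Load: write the cleaned batch to the warehouse staging table.
engine = create_engine("sqlite:///warehouse.db")  # stand-in for Azure SQL
df.to_sql("stg_orders", engine, if_exists="append", index=False)
```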
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Datatech
… Engineer will play a crucial role in shaping that platform, helping the data team upskill, and embedding best practices across the function. They will also lead on bringing AI models into production, ensuring robust monitoring, alerting, and data pipeline reliability are in place to support long-term success.

Key Responsibilities:
Build and maintain scalable data pipelines for analytics and AI use cases.
Lead the deployment of production-ready AI models with end-to-end monitoring.
Support the development of Python coding standards and engineering best practices.
Collaborate closely with IT and data teams to enhance the platform and unlock new capabilities.

In addition to the technical work, this role will help improve data governance and quality processes across the organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making.

Experience Required:
Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL.
Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages.
Experience working across …
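To illustrate the Extract-and-Load emphasis above, a minimal hypothetical PySpark sketch on Databricks with Unity Catalog: raw data is landed into a governed table with no business transformation (the load-before-transform pattern that distinguishes ELT from ETL). Catalog, schema and path names are invented.

```python
# Hypothetical ELT "EL" step on Databricks with Unity Catalog.
# In ELT, raw data is loaded first and transformed later inside the
# lakehouse; contrast with ETL, where cleaning happens in flight.
# Catalog/schema/table and source path names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Extract: read source files with no reshaping beyond parsing.
extracted = spark.read.json("/Volumes/main/landing/events/")

# Load: append into a governed Unity Catalog table (catalog.schema.table),
# adding only lineage metadata, not business transformations.
(
    extracted
    .withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta")
    .mode("append")
    .saveAsTable("main.raw.events")
)
```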