Job Title: Data Modeler

Role Summary: We are seeking a skilled Data Modeler to design, develop, and maintain enterprise data models that support business intelligence, analytics, and application development initiatives. The role involves working closely with business stakeholders, data architects, and engineering teams to translate business requirements into conceptual, logical, and physical data models, ensuring data … integrity, scalability, and alignment with enterprise standards.

Key Responsibilities:
- Design and implement conceptual, logical, and physical data models to support business and technical requirements.
- Collaborate with data architects, analysts, and engineers to ensure accurate translation of business needs into data solutions.
- Maintain data dictionaries, metadata, and documentation for consistency and reusability across the organization.
- Optimize data models … data integration processes by providing modeling inputs.
- Work on relational, dimensional, and NoSQL data modeling depending on project needs.

Required Skills & Experience:
- Strong experience in data modeling concepts (conceptual, logical, physical, dimensional, OLTP & OLAP).
- Proficiency in modeling tools such as ERwin, ER/Studio, PowerDesigner, or similar.
- Solid knowledge of RDBMS (Oracle, SQL Server, DB2, PostgreSQL, etc.) …
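The OLTP vs. OLAP distinction the listing asks for can be illustrated with a minimal dimensional (star-schema) model — one fact table joined to dimension tables. This is a hypothetical sales example for illustration, not part of the advert; table and column names are assumptions.

```python
import sqlite3

# Minimal star schema: a fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    full_date  TEXT NOT NULL,
    year       INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    category    TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity    INTEGER NOT NULL,
    amount      REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")

# A typical OLAP query: aggregate the fact table, slice by a dimension.
row = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchone()
print(row)  # ('Hardware', 29.97)
```

An OLTP model for the same data would instead normalise orders and order lines for fast single-row writes; the dimensional form above trades that away for simple aggregate reads.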
We are seeking a highly experienced Senior Data Modeller to design and maintain enterprise-wide conceptual, logical, and physical data models for our organisation. This role is critical to enabling consistent, scalable and business-aligned data structures that support analytics, reporting and operational efficiency. The ideal candidate will have a deep knowledge and understanding of insurance data …

Key responsibilities:
- Develop and maintain the enterprise-wide conceptual data model to provide a high-level view of data domains, entities and their relationships, and logical data models with detailed attributes, data relationships, and normalization rules, ensuring alignment with business rules and objectives
- Work closely with business stakeholders to understand data requirements and translate them into structured … data models that meet organisational needs
- Ensure conceptual and logical data models are reusable, scalable and adaptable to evolving business demands
- Partner with business stakeholders, SMEs and data architects to validate data models and incorporate feedback

Qualifications, Skills & Experience:
- 10+ years of experience in data modelling, with a strong focus on conceptual and logical …
Contract: 6-month initial contract. Rate: £750-900 a day, PAYE or Umbrella.

In this role, you'll design and implement Lakehouse architectures, create conceptual-to-physical data models, and drive data governance frameworks. You'll also enable seamless integration across complex systems to support valuable analytics and decision-making capabilities.

Key Responsibilities:
- Design enterprise data architectures for key business functions such as Trade, Sales Ops, and Finance.
- Develop target-state architectures and data strategies in line with business needs.
- Create and manage conceptual, logical, and physical data models.
- Design and implement Lakehouse architectures using Databricks and other modern data platforms.
- Ensure robust data governance, integration, and quality across systems.
- Collaborate with IT and business stakeholders …
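The Lakehouse pattern this role designs is commonly layered bronze → silver → gold (the "medallion" convention used on Databricks). A minimal sketch of that layering, using plain Python structures in place of Delta tables so the shape of the design is visible — the trade records and field names are invented for illustration:

```python
# Bronze: raw, as-ingested records (may contain duplicates and dirty fields).
bronze = [
    {"trade_id": "T1", "notional": "1000", "desk": "FX "},
    {"trade_id": "T1", "notional": "1000", "desk": "FX "},   # duplicate row
    {"trade_id": "T2", "notional": "2500", "desk": "Rates"},
]

# Silver: cleaned and conformed — types cast, strings trimmed,
# deduplicated on the business key.
silver = {
    r["trade_id"]: {
        "trade_id": r["trade_id"],
        "notional": float(r["notional"]),
        "desk": r["desk"].strip(),
    }
    for r in bronze
}

# Gold: business-level aggregate ready for analytics and reporting.
gold = {}
for rec in silver.values():
    gold[rec["desk"]] = gold.get(rec["desk"], 0.0) + rec["notional"]

print(gold)  # {'FX': 1000.0, 'Rates': 2500.0}
```

In a real Databricks implementation each layer would be a Delta table with its own governance and quality checks; the point of the layering is that raw data is preserved while each downstream layer adds trust.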
AI-first approach to data-driven decision-making.

What You'll Be Doing:

Architecting Advanced Data Solutions
- Shape End-to-End Data Architectures: Define conceptual and logical models for data lakes, warehouses, and hybrid deployments (AWS, Azure/Fabric), aligning with key performance and cost objectives.
- Embed Automation & Monitoring: Reduce complexity and maximise uptime through DevOps practices … time-to-market while managing terabytes of data daily.

Driving AI & ML Initiatives
- Champion LLMOps & MLOps Strategy: Set the architectural framework for building, training, and deploying AI/ML models with automation and regulatory compliance in mind.
- Enhance Operational Efficiency: Oversee end-to-end workflows (model improvement, versioning, monitoring) to support real-time personalisation and advanced analytics, focusing on …
capability within the Data Team. As part of our journey we're looking for a data architect to help bring our vision to life: to design and review data models, iterate on data architecture, and ensure data is efficiently stored and managed within our enterprise data lakehouse. At E.ON Next we're looking for the most ambitious, talented, and … through enterprise data modelling, planning ahead for the impact of data changes over time. You will design technical architecture solutions for machine learning and AI applications, and design data models to ensure our data vision is achieved, ensuring that data is managed properly and meets our business requirements. You will design efficient and scalable data pipelines … architecture. Knowledge and experience of Amazon Web Services is essential, with a strong understanding of cloud-based design, particularly within AWS. Be comfortable taking ownership of complex data models within data engineering projects, and develop appropriate solutions in accordance with business requirements. Hands-on experience working with stakeholders and managing their requirements. Actively coach and mentor others …
Watford, England, United Kingdom Hybrid / WFH Options
Addition+
from the front.

What You'll Be Doing:
- Designing and developing robust data pipelines to centralise and clean data from multiple sources
- Building conceptual and physical data models to support reporting and analysis
- Championing data governance and implementing controls to ensure quality and compliance
- Developing automated ETL solutions using Microsoft Azure tools
- Leading best practice in engineering … Azure (Data Lake, Synapse, SQL Server)
- Comfortable with Python, C#, and working with APIs
- Familiar with cloud data platforms (Azure essential; AWS useful)
- Confident designing data architecture and building models from scratch
- Solid understanding of database management and governance principles
- Skilled in Excel, including VBA for analysis and automation
- Strong problem-solver, commercially aware, and naturally collaborative

What's …
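The "centralise and clean data from multiple sources" responsibility is the classic ETL shape: extract, transform, load. A minimal sketch under stated assumptions — the sources, field names, and cleaning rules here are invented for illustration, and a real pipeline would target Azure Data Lake or Synapse rather than an in-memory list:

```python
def extract(sources):
    """Pull raw records from each source (here: plain iterables)."""
    for source in sources:
        yield from source

def transform(records):
    """Clean: drop incomplete rows, normalise casing, deduplicate."""
    seen = set()
    for rec in records:
        if not rec.get("email"):
            continue                      # reject incomplete records
        email = rec["email"].strip().lower()
        if email in seen:
            continue                      # deduplicate across sources
        seen.add(email)
        yield {"email": email, "name": rec.get("name", "").title()}

def load(records, target):
    """Append cleaned records to the central store."""
    target.extend(records)

# Two hypothetical sources with overlapping, inconsistently-formatted data.
crm = [{"email": "A@example.com ", "name": "ada lovelace"}]
billing = [{"email": "a@example.com", "name": "Ada Lovelace"},
           {"email": None}]

warehouse = []
load(transform(extract([crm, billing])), warehouse)
print(warehouse)  # [{'email': 'a@example.com', 'name': 'Ada Lovelace'}]
```

The same three-stage structure maps directly onto Azure tooling: Data Factory or Synapse pipelines for extract/load, with the transform step expressed in SQL or Spark.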
Engineer will play a crucial role in shaping that platform, helping the data team upskill, and embedding best practices across the function. They will also lead on bringing AI models into production, ensuring robust monitoring, alerting, and data pipeline reliability are in place to support long-term success.

Key Responsibilities:
- Build and maintain scalable data pipelines for analytics and AI use cases.
- Lead the deployment of production-ready AI models with end-to-end monitoring.
- Support the development of Python coding standards and engineering best practices.
- Collaborate closely with IT and data teams to enhance the platform and unlock new capabilities.

In addition to the technical work, this role will help improve data governance and quality processes across the organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making.

Experience Required:
- Essential experience with Azure Databricks, including Unity Catalog, Python (ideally PySpark), and SQL.
- Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages.
- Experience working across …
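"End-to-end monitoring" for a production model typically includes a drift check: compare a live window of model scores against a baseline and alert when they diverge. A minimal sketch — the metric, window values, and tolerance threshold are illustrative assumptions, not from the advert:

```python
from statistics import mean

def check_drift(baseline, window, tolerance=0.1):
    """Alert if the window's mean score drifts from the baseline mean."""
    drift = abs(mean(window) - mean(baseline))
    return {"drift": round(drift, 3), "alert": drift > tolerance}

baseline_scores = [0.82, 0.79, 0.81, 0.80]  # scores at deployment time
live_scores = [0.61, 0.58, 0.63, 0.60]      # degraded model output

status = check_drift(baseline_scores, live_scores)
print(status)  # {'drift': 0.2, 'alert': True}
```

In a Databricks setting this check would run as a scheduled job over a monitoring table, with the alert wired to a notification channel rather than a return value.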