Job Title: Data Modeler

Role Summary: We are seeking a skilled Data Modeler to design, develop, and maintain enterprise data models that support business intelligence, analytics, and application development initiatives. The role involves working closely with business stakeholders, data architects, and engineering teams to translate business requirements into logical, physical, and conceptual data models, ensuring data … integrity, scalability, and alignment with enterprise standards.

Key Responsibilities:
- Design and implement conceptual, logical, and physical data models to support business and technical requirements.
- Collaborate with data architects, analysts, and engineers to ensure accurate translation of business needs into data solutions.
- Maintain data dictionaries, metadata, and documentation for consistency and reusability across the organization.
- Optimize data models … data integration processes by providing modeling inputs.
- Work on relational, dimensional, and NoSQL data modeling depending on project needs.

Required Skills & Experience:
- Strong experience in data modeling concepts (conceptual, logical, physical, dimensional, OLTP & OLAP).
- Proficiency in modeling tools such as ERwin, ER/Studio, PowerDesigner, or similar.
- Solid knowledge of RDBMS (Oracle, SQL Server, DB2, PostgreSQL, etc.)
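The dimensional modeling concepts the listing asks for (OLTP vs. OLAP, fact and dimension tables) can be illustrated with a minimal star-schema sketch. All table and column names here are hypothetical, chosen only for illustration; sqlite3 stands in for a full RDBMS.

```python
import sqlite3

# Minimal star schema: one fact table joined to surrounding dimension tables.
# Table and column names are illustrative, not from any specific standard.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,      -- surrogate key, e.g. 20240115
    full_date TEXT, year INTEGER, month INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,   -- surrogate key
    product_name TEXT, category TEXT
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER, amount REAL      -- additive measures
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")

# Typical OLAP-style query: aggregate fact measures by dimension attributes.
row = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchone()
print(row)
```

The same logical model would be drawn first in a tool such as ERwin or ER/Studio and then forward-engineered into the physical DDL above.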
Contract: 6-month initial contract
Rate: £750-900 a day, PAYE or Umbrella

In this role, you'll design and implement Lakehouse Architectures, create conceptual-to-physical data models, and drive data governance frameworks. You'll also enable seamless integration across complex systems to support valuable analytics and decision-making capabilities.

Key Responsibilities:
- Design enterprise data architectures for … key business functions such as Trade, Sales Ops, and Finance.
- Develop target-state architectures and data strategies in line with business needs.
- Create and manage conceptual, logical, and physical data models.
- Design and implement Lakehouse Architectures using Databricks and other modern data platforms.
- Ensure robust data governance, integration, and quality across systems.
- Collaborate with IT and business stakeholders
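Lakehouse designs on Databricks commonly follow a bronze/silver/gold ("medallion") layering. A plain-Python sketch of that idea, deliberately independent of any Spark API, with all field names hypothetical:

```python
# Medallion-style layering sketched in plain Python instead of Spark.
# bronze: raw records as ingested; silver: cleaned/typed; gold: aggregated.
# All record and field names are illustrative.
bronze = [
    {"trade_id": "T1", "notional": "100.0", "desk": "FX "},
    {"trade_id": "T2", "notional": "250.5", "desk": "FX"},
    {"trade_id": "T3", "notional": "bad",   "desk": "Rates"},
]

def to_silver(records):
    """Clean and type raw rows, dropping any that fail validation."""
    out = []
    for r in records:
        try:
            out.append({"trade_id": r["trade_id"],
                        "notional": float(r["notional"]),
                        "desk": r["desk"].strip()})
        except ValueError:
            continue  # a real pipeline would quarantine bad rows instead
    return out

def to_gold(records):
    """Aggregate silver rows into a reporting-ready summary per desk."""
    totals = {}
    for r in records:
        totals[r["desk"]] = totals.get(r["desk"], 0.0) + r["notional"]
    return totals

silver = to_silver(bronze)  # T3 is dropped; "FX " is normalised to "FX"
gold = to_gold(silver)
print(gold)
```

In a real Databricks implementation each layer would be a Delta table and the functions above would be Spark transformations, but the governance point is the same: raw data is preserved, and quality rules are applied at explicit, auditable boundaries.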
capability within the Data Team. As part of our journey we're looking for a data architect to help bring our vision to life: designing and reviewing data models, iterating on data architecture, and ensuring data is efficiently stored and managed within our enterprise data lakehouse. At E.ON Next we're looking for the most ambitious, talented, and … through enterprise data modelling, planning ahead for the impact of data changes over time. Design technical architecture solutions for machine learning and AI applications. You will design data models to ensure our data vision is achieved, ensuring that data is managed properly and meets our business requirements. You will design efficient and scalable data pipelines … architectures. Knowledge and experience of Amazon Web Services is essential, with a strong understanding of cloud-based design, particularly within AWS. Be comfortable taking ownership of complex data models within data engineering projects, and develop appropriate solutions in accordance with business requirements. Hands-on experience working with stakeholders and managing their requirements. Actively coach and mentor others
Engineer will play a crucial role in shaping that platform, helping the data team upskill and embedding best practices across the function. They will also lead on bringing AI models into production, ensuring robust monitoring, alerting, and data pipeline reliability are in place to support long-term success.

Key Responsibilities:
· Build and maintain scalable data pipelines for analytics and AI use cases.
· Lead the deployment of production-ready AI models with end-to-end monitoring.
· Support the development of Python coding standards and engineering best practices.
· Collaborate closely with IT and data teams to enhance the platform and unlock new capabilities.

In addition to the technical work, this role will help improve data governance and quality processes across the organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making.

Experience Required:
· Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL.
· Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages.
· Experience working across
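The ELT emphasis above (Extract and Load first, Transform afterwards inside the platform) can be sketched minimally. Here sqlite3 stands in for the warehouse, and the source data and table names are hypothetical:

```python
import csv
import io
import sqlite3

# ELT sketch: Extract raw CSV, Load it untransformed into a staging table,
# then Transform with SQL inside the "warehouse" (sqlite3 standing in).
# The source data and table names are illustrative.
raw_csv = "reading_id,value\nr1,10\nr2,15\nr3,5\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_readings (reading_id TEXT, value TEXT)")

# Extract + Load: rows land as-is, still untyped text at this stage.
rows = list(csv.DictReader(io.StringIO(raw_csv)))
conn.executemany("INSERT INTO stg_readings VALUES (:reading_id, :value)", rows)

# Transform: done in SQL after loading - the defining trait of ELT vs. ETL.
conn.execute("""
    CREATE TABLE readings AS
    SELECT reading_id, CAST(value AS INTEGER) AS value
    FROM stg_readings
""")
total = conn.execute("SELECT SUM(value) FROM readings").fetchone()[0]
print(total)
```

On Databricks the staging table would typically be a bronze Delta table and the transform a SQL or PySpark job, but the separation of load from transform is the same.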
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Datatech