Contract: 6-month initial contract. Rate: £750-900 per day, PAYE or Umbrella. In this role, you'll design and implement Lakehouse architectures, create conceptual-to-physical data models, and drive data governance frameworks. You'll also enable seamless integration across complex systems to support valuable analytics and decision-making capabilities. Key Responsibilities: Design enterprise data architectures for … key business functions such as Trade, Sales Ops, and Finance. Develop target-state architectures and data strategies in line with business needs. Create and manage conceptual, logical, and physical data models. Design and implement Lakehouse architectures using Databricks and other modern data platforms. Ensure robust data governance, integration, and quality across systems. Collaborate with IT and business stakeholders.
capability within the Data Team. As part of our journey we're looking for a Data Architect to help bring our vision to life: to design and review data models, iterate on data architecture, and ensure data is efficiently stored and managed within our enterprise data lakehouse. At E.ON Next we're looking for the most ambitious, talented, and … through enterprise data modelling, planning ahead for the impact of data changes over time. Design technical architecture solutions for machine learning and AI applications. You will design data models to ensure our data vision is achieved, ensuring that data is managed properly and meets our business requirements. You will design efficient and scalable data pipelines … architecture. Knowledge and experience of Amazon Web Services is essential, with a strong understanding of cloud-based design, particularly within AWS. Be comfortable taking ownership of complex data models within data engineering projects, and develop appropriate solutions in accordance with business requirements. Hands-on experience working with stakeholders and managing their requirements. Actively coach and mentor others.
Watford, England, United Kingdom Hybrid / WFH Options
Addition+
from the front. What You’ll Be Doing: Designing and developing robust data pipelines to centralise and clean data from multiple sources. Building conceptual and physical data models to support reporting and analysis. Championing data governance and implementing controls to ensure quality and compliance. Developing automated ETL solutions using Microsoft Azure tools. Leading best practice in engineering … Azure (Data Lake, Synapse, SQL Server). Comfortable with Python, C#, and working with APIs. Familiar with cloud data platforms (Azure essential; AWS useful). Confident designing data architecture and building models from scratch. Solid understanding of database management and governance principles. Skilled in Excel, including VBA for analysis and automation. Strong problem-solver, commercially aware, and naturally collaborative.
Engineer will play a crucial role in shaping that platform, helping the data team upskill, and embedding best practices across the function. They will also lead on bringing AI models into production, ensuring robust monitoring, alerting, and data pipeline reliability are in place to support long-term success. Key Responsibilities: · Build and maintain scalable data pipelines for analytics and … AI use cases. · Lead the deployment of production-ready AI models with end-to-end monitoring. · Support the development of Python coding standards and engineering best practices. · Collaborate closely with IT and data teams to enhance the platform and unlock new capabilities. In addition to the technical work, this role will help improve data governance and quality processes across … the organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making. Experience Required: · Essential experience with Azure Databricks, including Unity Catalog, Python (ideally PySpark), and SQL. · Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages.