Interim Data Strategy Consultant Start: ASAP Day rate: Open, Outside IR35 Reports to: CFO The brief A business in a competitive, mature market is seeking an experienced Data Strategy Consultant to carry out an independent assessment of its data landscape and deliver a clear, actionable roadmap for improvement. The company runs Microsoft Dynamics ERP … alongside legacy AS400 and other systems (including Shopify), with no single source of truth. Data from external sources is also being added to the ERP, increasing complexity. The business believes much of its data is dormant or underused, but needs a clear, evidence-based view of what’s in place, what can be improved, and where the … duplication, gaps, and quality issues. Assess the accessibility and potential value of key data sets. Determine whether a centralised data platform (e.g., data warehouse, Fabric, or other) is required, and outline appropriate architecture options. Recommend governance foundations, including ownership, definitions, and processes. Produce a 12–18 month roadmap detailing priorities, quick wins, and resourcing needs …
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Role Work Mode: Hybrid Role & Responsibilities Define and implement the end-to-end architecture of the data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake native tools. Collaborate with … Data Governance teams to establish data lineage, data quality, and access control mechanisms. Engage with data stewards and stakeholders to build a comprehensive and scalable data warehouse. Implement RBAC, data masking, and encryption practices to ensure compliance with data security policies. Qualifications & Skills 10+ years of … experience in designing enterprise data platforms, with at least 5+ years in Snowflake. Strong expertise in SQL and data warehousing. Hands-on experience working in the insurance industry; prior experience with L&G is an advantage. 3+ years of experience with DBT for data transformation. Deep understanding of agile methodologies in data …
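For context on the RBAC and masking responsibilities above, here is a minimal sketch of how role-based grants and a dynamic data masking policy might be applied in Snowflake from Python. The account, role, schema, and policy names are hypothetical assumptions, not details from the advertiser.

```python
# Illustrative sketch: applying a masking policy and role-based grants in
# Snowflake via the Python connector. All object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # hypothetical account identifier
    user="deploy_user",
    password="...",             # key-pair or SSO auth would be used in practice
    warehouse="ADMIN_WH",
    role="GOVERNANCE_ADMIN",    # assumed to hold CREATE MASKING POLICY / GRANT privileges
)

statements = [
    # Dynamic data masking: only privileged roles see raw policyholder emails.
    """
    CREATE MASKING POLICY IF NOT EXISTS analytics.governance.email_mask
      AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('DATA_STEWARD', 'GOVERNANCE_ADMIN') THEN val
           ELSE '***MASKED***' END
    """,
    "ALTER TABLE analytics.silver.policyholders MODIFY COLUMN email "
    "SET MASKING POLICY analytics.governance.email_mask",
    # RBAC: read-only access for analysts, scoped to the analytics schema.
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE analytics TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA analytics.silver TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.silver TO ROLE ANALYST_RO",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```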
Date: 11 Jul 2025 Company: Royal London Group Job Title: Lead Data Engineer Contract Type: Permanent Location: Edinburgh or Alderley Park Working style: Hybrid 50% home/office based Closing date: 20th August 2025 We are seeking to hire a Lead … Data Engineer to design, develop and operate automated data engineering platforms. This role will contribute to the Group by improving the speed, accuracy, consistency, repeatability, and overall quality of data processing. This reduces costs and enables the Data Automation team to focus on more complex strategic tasks. The Lead Data Engineer is responsible for leading a small team of Data Engineers to develop and maintain large-scale data processing systems, pipelines, and infrastructure. The role entails working closely with the wider data community within Royal London to support data-driven decision making for the organisation. About the role Lead and …
Role: Snowflake Data Architect Location: Hove, UK Permanent Role Work Mode: Hybrid Role & Responsibilities Define and implement the end-to-end architecture of the data warehouse on Snowflake. Create and maintain conceptual, logical and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake native tools. Work with Data Governance teams to establish data lineage, data quality and access control mechanisms. Engage with data stewards and other stakeholders to build a comprehensive and scalable data warehouse. Implement RBAC, data masking and encryption practices to ensure compliance with data security policies. You must possess 10+ years of … experience in designing enterprise data platforms, with at least 5+ years in Snowflake. Strong expertise in SQL and data warehousing. Hands-on experience working in insurance. 3+ years of experience in DBT for data transformation. Deep understanding of Agile methodologies in a data environment. Familiarity with Power BI.
Job summary An exciting opportunity has arisen for a Senior Health Intelligence Analyst to join the Data & Analytics team within the newly established Value Transformation directorate of NHS Wales Performance and Improvement. The work will be wide ranging and varied as the team is tasked with providing a health intelligence service to the range of Directorates within NHS … Networks & Planning directorate but will also contribute to the wider remit of the service and react to changing priorities of our partners. Main duties of the job The Data & Analytics team encompasses data, analysis, information and knowledge, ensuring decision makers have the right intelligence at the right time to improve the quality, efficacy and efficiency of … linking complex data sets to derive meaningful analysis, using a range of tools and techniques to deliver health intelligence covering data extraction, data warehousing, data analysis and data presentation. Experience of understanding and resolving data quality issues, and providing and presenting highly complex information to clinicians …
in-office, flexible in practice) £50,000 - £55,000 DoE Job Ref: J12976 This is a fantastic opportunity to join a forward-thinking organisation that's expanding their data capabilities. As a Business Intelligence Developer you'll be at the heart of driving strategic data initiatives, collaborating with key stakeholders, and delivering impactful insights across the … BI products to life Translate business needs into technical solutions using Agile methods Build and maintain scalable data models and system architecture Use the data warehouse to empower decision-making via self-service analytics Ensure robust data governance, security, and compliance (GDPR, etc.) Mentor junior BI developers and promote a culture of data … reliability, automation, and reusability Experience Required: Strong experience with Power BI and an eye for compelling, intuitive data visualisation Solid background in data warehouse design (Kimball methodology, fact/dim tables) Proficient in the Azure stack, including ADF and Azure SQL Familiarity with Agile development; DevOps experience a bonus Bonus points for experience …
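As a rough illustration of the Kimball-style fact/dimension design named in the requirements, here is a minimal pandas sketch: build a dimension with surrogate keys, then a fact table keyed against it. The tables and columns are hypothetical, not the employer's schema.

```python
# Minimal sketch of Kimball-style dimensional modelling with pandas.
import pandas as pd

# Raw sales extract (e.g. landed from Azure Data Factory into a staging table).
sales = pd.DataFrame({
    "customer_id": ["C1", "C2", "C1"],
    "customer_name": ["Acme Ltd", "Beta plc", "Acme Ltd"],
    "order_date": pd.to_datetime(["2025-01-05", "2025-01-06", "2025-02-01"]),
    "amount": [120.0, 80.5, 200.0],
})

# Dimension: one row per customer, with a surrogate key independent of the source key.
dim_customer = (
    sales[["customer_id", "customer_name"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_sk")
    .reset_index()
)

# Fact: measures plus foreign keys referencing the dimension.
fact_sales = sales.merge(dim_customer, on=["customer_id", "customer_name"])[
    ["customer_sk", "order_date", "amount"]
]

print(dim_customer)
print(fact_sales)
```

The same shape (surrogate-keyed dims, narrow facts) is what a Power BI star-schema model would then consume.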
Devonshire Hayes have partnered with a leading insurance provider committed to innovation, customer-centric solutions, and data-driven decision making. The company aims to leverage data and analytics to transform risk management, underwriting, claims processing, and customer experience, and is seeking a strategic Chief Data Officer to lead its data vision and governance. … Role The Chief Data Officer will be responsible for developing and implementing a comprehensive data strategy that drives business growth, operational efficiency, and regulatory compliance. This senior leadership role will oversee data governance, data quality, analytics, and advanced data technologies to unlock value from the company's data assets. Key Responsibilities Develop and execute a company-wide data strategy aligned with business objectives. Lead data governance frameworks to ensure data quality, integrity, privacy, and security. Build and oversee the data management infrastructure, including data warehouses, lakes, and analytics platforms. Drive advanced analytics, machine learning, and AI …
Reporting to the Data Platforms Manager, you will design, develop, and maintain scalable, cost-effective data solutions on the Azure Data Platform. You'll support the creation and management of Gold layer data models following Medallion Architecture, ensuring alignment with platform best practices and governance standards. Working collaboratively, you'll help review … document, and migrate data from a recently acquired company, while identifying opportunities to automate inefficiencies and optimise non-compliant solutions. Ideally you will be based in Aberdeen and willing to work a hybrid pattern of 3 days onsite, 2 from home, but we will also consider London-based or remote candidates. What you'll do Design and implement scalable … Strong proficiency in Azure Data Factory, Synapse Analytics, SQL Databases & Data Lake Gen 2 Demonstrable understanding of how to apply Medallion Architecture to an Azure data warehouse Advanced knowledge of CI/CD pipelines, deployment automation, Infrastructure as Code and work management within Azure DevOps Knowledge of development within Databricks, PySpark, Delta Lake, Unity Catalog and Notebook …
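To make the Medallion reference concrete, the sketch below shows a hedged Bronze to Silver to Gold promotion with PySpark and Delta Lake; the storage paths, table names, and columns are hypothetical rather than this employer's actual platform.

```python
# Illustrative Medallion promotion with PySpark and Delta Lake (hypothetical names).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw ingested records, kept as delivered.
bronze = spark.read.format("delta").load(
    "abfss://lake@storageacct.dfs.core.windows.net/bronze/orders"
)

# Silver: cleaned, de-duplicated, correctly typed.
silver = (
    bronze.dropDuplicates(["order_id"])
          .withColumn("order_date", F.to_date("order_date"))
          .filter(F.col("amount").isNotNull())
)
silver.write.format("delta").mode("overwrite").save(
    "abfss://lake@storageacct.dfs.core.windows.net/silver/orders"
)

# Gold: business-level aggregate ready for reporting and self-service BI.
gold = silver.groupBy("order_date").agg(F.sum("amount").alias("daily_revenue"))
gold.write.format("delta").mode("overwrite").save(
    "abfss://lake@storageacct.dfs.core.windows.net/gold/daily_revenue"
)
```

Keeping each layer as its own Delta table is what lets governance tooling such as Unity Catalog track lineage between raw, cleaned, and reporting-ready data.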
AWS Data Engineer – Leeds/Hybrid - Up to £60K We’re looking for a Data Engineer to join a growing data team and play a key role in designing, building, and maintaining cloud-native data platforms. The role is based in Leeds (2 days per week on-site) with flexibility for hybrid working. … The role: Design and implement scalable data architectures in the cloud, ensuring secure and reliable data pipelines. Work across the full data lifecycle, supporting data scientists, analysts, and engineering teams. Lead development projects, data modelling, and cloud data platform deployments. Mentor data engineers and contribute to … Strong experience with AWS data services (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation). Proficiency in SQL, Python (Pandas), Spark or Iceberg. Experience with data warehouse design, ETL/ELT, and CI/CD pipelines (GitHub, CodeBuild). Knowledge of infrastructure as code, performance tuning, and data migration. Exposure to DBT, Docker, or …
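As a hedged illustration of how a couple of the AWS services listed above are commonly wired together (not a description of this employer's stack), the snippet below shows a Lambda handler that starts a Glue job when a new file lands in S3. The job name and bucket layout are hypothetical.

```python
# Hypothetical Lambda handler: trigger a Glue ETL run for each new S3 object.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Triggered by an S3 put event; starts the downstream Glue ETL job."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName="load_orders_to_redshift",                 # hypothetical Glue job
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        print(f"Started Glue run {response['JobRunId']} for s3://{bucket}/{key}")
```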
an experienced and strategic SAP Professional to lead the design, implementation, and optimization of solutions using SAP Datasphere. This role will be responsible for managing and implementing cloud data integration, analytics strategy, and ensuring business insights are delivered effectively across the organization. The candidate should have a fair knowledge of SAP S/4HANA Professional Services processes. This … position reports to a TCS Engagement Manager, working in strong collaboration with all other client stakeholders, especially with the Product Owners, Data CoE, and other Architects. As such, the role provides broad exposure to business stakeholders and requires strong business process knowledge on Professional Services. Responsibilities Architect, manage, and maintain the enterprise data landscape within SAP … Experience with SAP BTP, S/4HANA, and integration tools (e.g., SAP Data Intelligence, CPI). SAP Certified Application Associate – SAP Datasphere (or SAP Data Warehouse Cloud). Referential experience in Professional Services implementation is an added advantage.
HD-TECH are proud to be supporting a well-established and fast-growing Microsoft-focused data consultancy in their search for a Principal Data Consultant. Our client is a recognised leader in the delivery of cloud-first data platform solutions, analytics, and managed services for enterprise organisations. Key Responsibilities Lead the technical delivery of … complex Microsoft Azure data projects Architect cloud-native data platform solutions aligned with business needs Act as a technical escalation point and mentor for other consultants Support business development activities including solution design and work scoping Deliver high-quality technical outputs and ensure best practices are followed Maintain up-to-date certifications and contribute to the … for data engineering, transformation, and automation ETL/ELT pipelines across diverse structured and unstructured data sources Data lakehouse and data warehouse architecture design Power BI for enterprise-grade reporting and visualisation Strong knowledge of data modelling, SQL, and governance best practices Consulting & Project Delivery Experience 8+ years in …
HCLTech – London (City of London), United Kingdom. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion. Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data quality, integrity, and governance throughout the data lifecycle. Collaborate with data analysts, data scientists, and business stakeholders to understand requirements and deliver data solutions. Implement and maintain CI/CD pipelines for data engineering processes, including version control with Git. Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness. Qualifications: Strong …
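To illustrate the Airflow-plus-dbt orchestration described in the responsibilities, here is a minimal, hypothetical DAG sketch; the schedule, project path, and task names are assumptions rather than details from the posting.

```python
# Illustrative Airflow 2.x DAG: a Python extraction step followed by a dbt build
# against the Snowflake warehouse. Paths and identifiers are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_source_data(**context):
    """Placeholder extraction step: pull from a source API and land files for loading."""
    # In practice this would write to a Snowflake stage or an object store.
    print("Extracting source data for", context["ds"])


with DAG(
    dag_id="snowflake_dbt_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",        # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_source_data",
        python_callable=extract_source_data,
    )

    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/analytics && dbt build --target prod",
    )

    extract >> dbt_build
```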
Data Engineer London/Hybrid - UK/Remote About Snowplow: Snowplow is the global leader in customer data infrastructure for AI, enabling every organization to transform raw behavioral data into governed, high-fidelity fuel for AI-powered applications, including advanced analytics, real-time personalization engines, and AI agents. Digital-first … companies like Strava, HelloFresh, Auto Trader, Burberry, and DPG Media use Snowplow to collect and process event-level data in real time, delivering it securely to their warehouse, lake, or stream, and integrating deep customer context into their applications. Thousands of companies rely on Snowplow to uncover customer insights, predict customer behaviors, hyper-personalize customer experiences, and … for automating models and advancing our engineering practices. You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery). Experience with a modern data modeling technology (DBT). You document and communicate clearly. Some experience with technical content writing would be …
Wellingborough, Northamptonshire, United Kingdom Hybrid / WFH Options
Coba IT Consultants
Coba IT is resourcing on behalf of a growing financial services organisation seeking a Senior MI/BI Developer to join their dynamic technology team. This is an exciting opportunity for a data-driven professional who thrives in a collaborative, customer-centric environment and is ready to lead on data integration and business intelligence initiatives. The Role: You'll be responsible … projects What We're Looking For: Essential Skills: 5+ years' experience with Informatica (Cloud and/or PowerCenter) Strong background in data modelling and data warehouse design Proven leadership in delivering end-to-end BI/data warehouse projects Experience with Power BI administration and development Desirable Skills: Knowledge of Azure Data …
Data Engineer, Prime Video Core Analytics and Tooling Job ID: Amazon Digital UK Limited Come build the future of entertainment with us. Are you interested in shaping the future of movies and television? Do you want to define the next generation of how and what Amazon customers are watching? Prime Video is a premium streaming service that offers … innovating on behalf of our customers is at the heart of everything we do. If this sounds exciting to you, please read on. The team owns a global data platform that powers analytics and data science within Prime Video. Building on AWS cloud technology and processing some eye-watering volumes of relational data, our … solve data warehousing problems on a massive scale and apply cloud-based AWS services to solve challenging problems around: big data processing, data warehouse design, self-service data access, automated data quality detection and building infrastructure as code. You'll be part of the team that focuses on …
Are you ready to take your data engineering skills to the next level? We are looking for a talented Data Engineer to join our innovative team and work closely with our Data Lead to implement a cutting-edge Business Intelligence (BI) platform. This is your chance to make a significant impact on our company … s reporting and analytics capabilities! About the Role As a Data Engineer, you will play a crucial role in transforming our organisation into a data-driven enterprise. Your primary responsibilities will include data ingestion from diverse source systems, development of data pipelines through both ETL tools and programming languages such as Python. … and Azure Data Factory (ADF), and you know how to turn complex data challenges into streamlined, reliable systems. You will excel in data warehouse modelling, especially with Kimball methodology and Medallion architecture approaches. With your sharp analytical skills and a passion for problem-solving, you're not just about the numbers - you love …
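Since the role centres on ingesting data from diverse source systems with Python before it reaches ADF and the warehouse, here is a minimal sketch of that kind of pull-and-land step, assuming a hypothetical paginated REST API and staging path.

```python
# Hypothetical ingestion sketch: page through a REST source, normalise with
# pandas, and land Parquet files for the downstream warehouse load.
import pandas as pd
import requests

BASE_URL = "https://api.example-source.com/v1/orders"   # hypothetical endpoint


def fetch_all_pages(url: str) -> list[dict]:
    """Collect every page of results from a paginated JSON API."""
    records, page = [], 1
    while True:
        resp = requests.get(url, params={"page": page}, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["items"])
        if not payload.get("next_page"):
            break
        page += 1
    return records


def run_ingestion() -> None:
    raw = fetch_all_pages(BASE_URL)
    df = pd.json_normalize(raw)
    df["ingested_at"] = pd.Timestamp.now(tz="UTC")
    # Land to the staging area the downstream pipeline (e.g. ADF) reads from.
    df.to_parquet("staging/orders.parquet", index=False)


if __name__ == "__main__":
    run_ingestion()
```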
Senior Delivery Consultant - Data Analytics & GenAI, AWS Professional Services Public Sector Job ID: Amazon Web Services Australia Pty Ltd Are you a Senior Data Analytics and GenAI consulting specialist? Do you have real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake, Data … Do you have senior stakeholder engagement experience to support pre-sales and deliver consulting engagements? Do you like to solve the most complex and high-scale (billions+ records) data challenges in the world today? Do you like leading teams through high-impact projects that use the latest data analytics technologies? Would you like a career path … that enables you to progress with the rapid adoption of cloud computing? AWS Professional Services Public Sector ANZ are hiring a highly technical senior cloud architect specialised in Data Analytics and GenAI to collaborate with our customers and partners to derive business value from the latest Data Analytics and GenAI services. Our consultants will develop and …
Data Modeller x2 +3 months contract + extensions +Fully remote working +Inside IR35 +£750 - £875 a day Skills: +Data Vault modelling +Data modeller for Data Warehouses Are you a highly skilled Data Modeller with a strong background in enterprise data architecture and hands-on modelling experience? We are seeking a proactive … individual to join a dynamic team working on large-scale data initiatives across complex, multi-datacentre environments. Key Responsibilities Design and develop conceptual, logical, and physical data models across enterprise-scale projects. Implement data solutions using RDBMS, ODS, data marts, and data lakes across both SQL and NoSQL platforms. Translate … business needs into long-term, scalable data architecture. Collaborate with analysts, architects, and developers to ensure models align with enterprise standards. Support metadata management, data lineage, and governance best practices. Required Skills & Experience 5+ years of hands-on data modelling experience across various domains. Minimum 2 years' experience with Data Vault modelling …
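For readers unfamiliar with the Data Vault pattern the role asks for, the sketch below shows one of its core mechanics, deterministic hash keys for a hub and satellite, using pandas. The business keys and column names are hypothetical.

```python
# Sketch of a core Data Vault pattern: hash keys for a hub and its satellite.
import hashlib

import pandas as pd


def hash_key(*parts: str) -> str:
    """MD5 hash of the concatenated, normalised business key parts."""
    normalised = "||".join(p.strip().upper() for p in parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()


source = pd.DataFrame({
    "customer_number": ["1001", "1002"],
    "customer_name": ["Acme Ltd", "Beta plc"],
    "load_date": pd.Timestamp("2025-07-01"),
    "record_source": "CRM",
})

# Hub: one row per business key, identified by the hash key.
hub_customer = pd.DataFrame({
    "hub_customer_hk": source["customer_number"].map(hash_key),
    "customer_number": source["customer_number"],
    "load_date": source["load_date"],
    "record_source": source["record_source"],
})

# Satellite: descriptive attributes plus a hashdiff for change detection.
sat_customer = pd.DataFrame({
    "hub_customer_hk": hub_customer["hub_customer_hk"],
    "hashdiff": source["customer_name"].map(hash_key),
    "customer_name": source["customer_name"],
    "load_date": source["load_date"],
    "record_source": source["record_source"],
})
```

Because the keys are derived deterministically from business keys rather than issued by a sequence, hubs, links, and satellites can be loaded in parallel without lookup dependencies, which is the main reason Data Vault teams favour them.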
Due to continued growth, we are currently looking for a Data Engineer to join our Professional Services division. You will be part of a cross-functional Data Consulting team spanning data engineering, data science, AI, analytics, and visualisation. You will work with clients across multiple sectors, helping them explore next-generation data techniques, AI capabilities, and tools to drive measurable business value from their data assets. A day in the life of an Aiimi Data Engineer: Collaborating with business subject matter experts to discover valuable insights in structured, semi-structured, and unstructured data sources. Using data engineering and AI techniques to help … clients make smarter decisions, reduce service failures, and deliver better customer outcomes. Connecting to and extracting data from source systems, applying business logic and transformations, and enabling data-driven decision-making. Supporting strategic planning and identifying opportunities to apply AI models or machine learning techniques to enhance business processes. Capturing data requirements from customer …