that adventure. Our employees are at the heart of delivering impactful and meaningful work for our clients and helping them to reach for and realise their vision. Snowflake Data Architect Slalom is seeking an experienced Snowflake Data Architect to contribute to its growing Data Capability. The ideal candidate is a professional … with experience in designing, developing, validating and communicating enterprise data solutions using Snowflake, and with deep experience in developing enterprise data management strategies including data lake/warehouse implementations, data movement, data services, data acquisition, data conversion, and archive/recovery. What will you do? Lead as …/advise clients on reference architectures Requirements: Expert in capturing end-user requirements and aligning technical solutions to business objectives Experience in leading, designing and implementing cloud data strategies, including designing multi-phased implementation roadmaps Significant experience in Snowflake (ideally certified), having acted as lead data architect on at least two Snowflake implementations Expertise …
passionate about helping our people get more out of life too; building careers with real growth, a sense of purpose, belonging and wellbeing. About the role This Principal Data Engineer role will work exclusively within our AdTech business unit whilst ensuring alignment to best practices, standards, and policies. The strategic direction is led by the Data … from this role. This role will work within the AdTech squad, focusing on their priorities to enable consistent, efficient, and effective use of the Data Lake, Enterprise Data Warehouse, and marketing tools. Any experience with automating processes and embedding re-usable patterns is desirable. As our AdTech Principal Data Engineer you … of services across AWS, Google Cloud, Celebrus and OneTrust Key Responsibilities Analysis, design, architecture and development of AdTech data solutions for deployment to the Data Lake and Data Warehouse. Actively engaging with our Data Performance Unit on strategic direction, technology, and delivery excellence. Peer/code review for other team members. …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Pharmaceutical Company - Manchester (Tech Stack: Data Engineer, Databricks, Python, Power BI, Azure, TSQL, ETL, Agile Methodologies) About the Role: We are seeking a talented and experienced Data Engineer on behalf of our client, a leading Software House. This is a fully remote position, offering the opportunity to work with cutting-edge … technologies and contribute to exciting projects in a collaborative environment. About the Role: Our client is seeking an experienced Data Engineer to join their team in Manchester. This hybrid position involves working within the pharmaceutical industry, focusing on the design, development, and maintenance of data pipelines, ETL processes, and databases. The role is ideal for someone passionate … in an agile team. Experience working in a highly regulated industry and with highly sensitive data. Exposure to large data solutions like Snowflake, Trino, Synapse, Azure Data Lake, and Databricks. Experience in data science using R, Stata, or Python. Familiarity with Atlassian tools such as JIRA, Confluence, and BitBucket. Understanding of clinical trials, GCP, and GxP. What We Offer …
Data Engineer Join a Trailblazing Team in Clinical Research Data Management Step into a role that places you at the cutting edge of clinical trials, leveraging Electronic Health Record (EHR) data to revolutionise patient care and research. Based in the innovative Manchester Science Park, this position offers the chance to be part of a world … is usually on Tuesdays and Thursdays but can be subject to change. Strong benefits package This role focuses on the development, performance, management, and troubleshooting of ETL processes, data pipelines, and data infrastructure. The successful candidate will ensure the effective and reliable operation of these systems, adopting new tools and technologies to stay ahead of industry … database testing: unit, performance, stress, security. Experience in agile teams and highly regulated industries. Exposure to large data solutions like Snowflake, Trino, Synapse, Azure Data Lake, and Databricks. Data science experience using R, Stata, or Python. Familiarity with Atlassian tools: JIRA, Confluence, BitBucket. Understanding of clinical trials, GCP, and GxP. Personal Attributes: Strong collaboration …
like you belong, and where you are empowered to succeed. We look forward to having you join our journey - and seeing how far we can go, together! Position The Data Services Engineer at Conferma is responsible for: Enhancing and maintaining the Conferma Data Platforms As a data specialist you will support the maintenance and … deployment of Conferma applications & services at the data layer. This includes a combination of Microsoft SQL databases, batch file data & Microsoft Fabric data lakes. You will possess excellent troubleshooting skills and demonstrate the ability to make key technical decisions to ensure confidentiality, integrity & security of all platform data sets. Systems Analyst responsibilities: Providing … internal and external technical support for Conferma data platforms, working closely with L1-3 support teams. Working inside the Conferma SQL release pipeline to ensure consistency & reliability of product enhancements through non-production & production environments, including undertaking code reviews and approvals prior to release. Collaborating alongside the DBA function to assess proposed changes to platform data …
Data Analyst Apprentice Apply From: 26/05/2025 Learning Provider Delivered by NOWSKILLS LIMITED Employer STANDGUIDE LTD Vacancy Description Develop and implement databases, data collection systems … and data engineering solutions that optimise reporting efficiency and clarity. Acquire data from primary or secondary data sources and maintain databases/data lakes. Interpret data, analyse results using statistical techniques, and provide ongoing reports. Identify, analyse, and interpret trends or patterns in complex data sets. Filter and … "clean" data by reviewing raw data, reports, and performance indicators to locate and correct code problems. Work with management to prioritise business and information needs. Locate and define new process improvement opportunities. Key Details Vacancy Title Data Analyst Apprentice Employer Description Standguide has supported individuals and employers for over 30 years. We are a …
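The filtering and cleaning duties described above can be illustrated with a short, self-contained Python sketch. All field names and validation rules here are hypothetical examples, not taken from the vacancy:

```python
# Minimal illustration of filtering and "cleaning" raw records before analysis.
# The field names ("id", "value") and rules are hypothetical examples.

def clean_records(raw_records):
    """Drop malformed rows and normalise the rest."""
    cleaned = []
    for rec in raw_records:
        # Skip rows with a missing or empty identifier.
        if not rec.get("id"):
            continue
        # Coerce the numeric field, discarding rows where it is unparseable.
        try:
            value = float(rec.get("value", ""))
        except ValueError:
            continue
        cleaned.append({"id": rec["id"].strip(), "value": value})
    return cleaned

raw = [
    {"id": " A1 ", "value": "10.5"},
    {"id": "", "value": "3"},      # dropped: empty id
    {"id": "B2", "value": "n/a"},  # dropped: non-numeric value
]
print(clean_records(raw))  # [{'id': 'A1', 'value': 10.5}]
```

In practice the same pattern scales up inside an ETL tool or a pandas pipeline; the point is that validation rules are explicit and rejected rows are discarded deliberately rather than silently corrupting reports.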
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Michael Page Technology
This is an exciting opportunity for a Senior Data Engineer to contribute to impactful projects within the Business Services industry. Based in Manchester, this role focuses on designing, building, and optimising data pipelines and systems to support analytics and decision-making. Client Details We are a client acquisition agency with a focus on the legal sector … transparency, integrity, and a clear, results-oriented approach. We take pride in delivering carefully profiled leads that align with your specific requirements. We are looking for a Senior Data Engineer to support us as we continue to build and develop our data ecosystem. This role requires someone who is comfortable wearing many hats, from building out … pipelines, to supporting architectural decision-making, to project and stakeholder management. In return the successful candidate will play a key role in helping us deliver our data strategy and take on more responsibility as the business continues to grow. Description The successful Senior Data Engineer will be responsible for, but not limited to: Design and maintain …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Pro Insurance
Pro currently has an exciting opportunity for a motivated and detail-oriented individual to join its Insurance Services team as a Data Analyst within the Digital Services area. The job holder will be involved in building data mapping processes for various bordereaux (both Risk and Claims) using data ingestion and analysis tools such as … Intrali, Quantemplate, or Matillion. The project focuses on the extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data lakes. This is an ideal role for an aspiring Data Architect in the Insurance Industry. Pro operates a hybrid working policy, with time split between home and our … Gloucester or Liverpool office. This vacancy is also suitable for a fully remote individual, as we support home working. Main Duties and Responsibilities Data collection and processing: gathering data from various sources, pre-processing, and preparing datasets for analysis. Creation and management of both Risk and Claims data maps (models) and reconciliation of results. …
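As a rough illustration of the extract-transform-load and reconciliation pattern described above, the following Python sketch maps source bordereau columns onto a target schema and computes a control total. The column names and mapping are invented for illustration; real work would use tools such as Intrali, Quantemplate, or Matillion:

```python
# Hypothetical ETL step: rename source bordereau columns to a target schema
# and total a numeric field for reconciliation. All names are illustrative.

COLUMN_MAP = {"Pol No": "policy_id", "Gross Prem": "gross_premium"}

def transform(rows, column_map=COLUMN_MAP):
    """Map raw bordereau rows onto the target schema."""
    return [
        {target: row[source] for source, target in column_map.items()}
        for row in rows
    ]

def reconcile(rows, field="gross_premium"):
    """Simple control total used to check the load against the source."""
    return sum(float(r[field]) for r in rows)

source_rows = [
    {"Pol No": "P-001", "Gross Prem": "100.00"},
    {"Pol No": "P-002", "Gross Prem": "250.50"},
]
loaded = transform(source_rows)
print(reconcile(loaded))  # 350.5
```

Reconciling a control total like this against the source system is the standard check that no rows were dropped or mis-mapped during the load.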
Your new company We are currently collaborating with one of the largest global pharmaceutical companies to recruit an SAP Finance Data & Analytics Specialist on a contract basis. This organisation is currently undergoing significant change and growth projects, and to effectively manage these transitions, they are seeking a qualified finance professional with experience in a SAP S/… Your new role Duration: 12 months. Hybrid working: 2 days per week on-site, 3 days per week remote. Outside IR35 [Ltd Comp] or Inside IR35 [Umbrella]. As a SAP Finance Data & Analytics Specialist, you will play a pivotal role in ensuring the continuity of critical Finance reporting solutions while supporting the design and implementation of the S/4 HANA … S/4 HANA vs ECC differences in FI/CO modules. Experience with analytics tools and data platforms such as PowerBI, Qlik, Azure Data Lake, Snowflake, SAP B4H, and SAP Analytics Cloud. Bachelor's or Master's degree in Finance or Accounting. What you'll get in return The role offers a competitive day …
You'll join the Platform & AI Enablement team within our GPTO Engineering organization, reporting to the Sr. Director Engineering API. The team is responsible for building and supporting the data and AI platforms that underpin our industry-leading Business Planning Software solutions. This is a highly hands-on role focused on designing, building, and operating scalable systems that integrate event-driven, batch & streaming data flows, and AI capabilities into the platform. Your Impact Build and maintain core platform capabilities that support high-throughput batch, streaming, and AI-powered workloads. Develop resilient, observable, and scalable systems using Apache Kafka, Flink, Pulsar, and cloud-native tools. Collaborate with AI/ML engineers to operationalize models and enable generative AI use cases such as prompt-based insights or automation. Deliver reliable integrations with data lakes, event stores, and analytics systems, ensuring data flows efficiently across the business planning ecosystem. Contribute full-stack code where needed, including React-based frontends and backend services in Java and Python. Write clean, maintainable, well-tested code with an emphasis on …
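The kind of streaming aggregation such a platform performs can be sketched, in very simplified form, as a tumbling-window count in pure Python. A production system would run this on Kafka Streams or Flink; the event shape and window size here are invented for illustration:

```python
from collections import defaultdict

# Simplified tumbling-window aggregation over an event stream.
# Events are hypothetical (timestamp_seconds, event_type) pairs; in
# production this logic would live in a Flink or Kafka Streams job.

def tumbling_window_counts(events, window_size=60):
    """Count events per key within fixed, non-overlapping time windows."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_size) * window_size
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

Tumbling windows are the simplest windowing strategy; streaming engines add the hard parts this sketch omits, such as out-of-order events, watermarks, and state checkpointing.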
Description As part of the Platform & AI Enablement team under GPTO Engineering, you'll report to the Sr. Director Engineering API. This team is accountable for shaping enterprise data architecture, enabling high-performance AI-driven workloads, and acting as a technical bridge between engineering and architecture. This is a hands-on role for a deeply experienced engineer who thrives on solving complex problems and scaling robust platforms. Your Impact Influence the design and implementation of platform capabilities for data processing, AI enablement, and developer acceleration across batch, streaming, and real-time systems. Collaborate with the architecture function to represent engineering needs and help translate architectural direction into practical implementation patterns. Guide teams in integrating AI/… strategy discussions. Help teams balance speed and sustainability, delivering under tight deadlines without compromising quality. Your Qualifications 12+ years of software engineering experience, ideally in platform, infrastructure, or data-centric product development. Expertise in Apache Kafka, Apache Flink, and/or Apache Pulsar. Deep understanding of event-driven architectures, data lakes, and streaming pipelines. Strong experience …