to leverage data, knowledge, and prediction to find new medicines. We are a full-stack shop consisting of product and portfolio leadership, data engineering, infrastructure and DevOps, data/metadata/knowledge platforms, and AI/ML and analysis platforms, all geared toward building a next-generation data experience for GSK's scientists, engineers, and decision-makers; increasing productivity; and ensuring reliability and scalability. Collaboration: Engage with internal stakeholders to understand business needs and translate them into technical solutions. Work across teams to improve workflows, and standardize testing, versioning, and metadata management processes. We're looking for someone who: Has full-stack engineering experience, including data platforms and building RESTful APIs. Can quickly adapt to unfamiliar tech stacks, including low-code …
Experience in CI/CD, IaC, DevOps, and Agile development environments. Strong understanding of the software lifecycle, secure coding practices, and application monitoring/performance tuning. Experience with data modelling, metadata systems, or rights/licensing systems is a plus, especially in media or content businesses. Excellent communication skills with both technical and non-technical stakeholders. A strategic mindset with attention …
ownership models, and quality improvement plans. Familiarity with key data regulations and standards (e.g., GDPR, CCPA, BCBS 239, ISO 8000, DAMA DMBOK). Working knowledge of data quality metrics, metadata management, and stewardship practices. Experience facilitating data forums or councils and shaping policy and data standards collaboratively. Certifications such as CDMP (Certified Data Management Professional) or DCAM, and PROSCI Change …
Job Title: Senior Manager - Data Management and Governance
Location: Asda House
Employment Type: Full time
Contract Type: Permanent
Hours per Week: 37.5
Salary: Competitive salary plus benefits
Category: Data Science
Closing Date: 1 June 2025
This role is based in …
Vacancy for Project Archivist, Metadata Support at University of Oxford Weston Library, Oxford Fixed Term This pivotal role will support metadata creation and management activities associated with the cataloguing and digitisation of archives and manuscripts being undertaken at the Bodleian Libraries. You will be critical to ensuring that metadata generated meets … the Bodleian Libraries' metadata guidelines, optimising its potential for supporting discovery, use and management of collections. You will achieve this through training and supporting colleagues working on cataloguing collection materials, leading on quality assurance of relevant metadata, and taking responsibility for publication and updating of relevant metadata in internal systems and in our public-facing catalogue, Bodleian Archives & Manuscripts. … You will have a flexible approach, responding well to changing circumstances. You will have a high degree of technical literacy, with practical experience and advanced skills in metadata manipulation in various forms, including spreadsheets and XML. You will enjoy the analytical work and problem-solving that is central to metadata work. Possession of a postgraduate qualification in archives …
books A fantastic opportunity for a data-driven, analytical person with an eye for detail and a love of books Independent publisher Scribe UK is seeking a Sales and Metadata Assistant to join our small, dynamic team based in Bloomsbury, with a list that includes prize-winning and bestselling authors. You may have some experience in publishing or bookselling already … interest in industry best practice and proactively research new ways to sell more of our books online. Duties will include: Maintaining and updating Scribe's internal database. Drafting enhanced metadata, such as optimised copy and keywords. Initiating new metadata projects and measuring their impact. Checking ONIX feeds. Managing Scribe's Amazon Advantage account, including correcting errors, issuing takedown notices for … 6pm with a one-hour paid lunch break. To apply, send your CV and covering letter to Molly Slight at by 18 March, with the subject line "Sales and Metadata Assistant application".
City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
fees) Key Responsibilities: Lead and own a defined set of banking data domains (e.g. loans, treasury, financial products) Act as the single point of accountability for data quality, lineage, metadata, and classification Coordinate with business and tech to ensure data is fit for purpose and well-governed Drive remediation of data quality issues, supporting analytics, reporting, and regulatory needs Define … DORA) Background: Proven experience as a Head of Data, Data Owner, or equivalent role in banking Strong knowledge of loan products, treasury data, or investment product data Familiar with metadata and data lineage tooling (e.g. Databricks, Collibra, Alation – beneficial but not essential) Excellent stakeholder management and cross-functional collaboration skills Able to work in a fast-paced consulting environment with …
client requirements into scalable and secure data architectures Drive infrastructure-as-code and CI/CD deployment practices Process structured and semi-structured data (JSON, XML, Parquet, CSV) Maintain metadata, build data dictionaries, and ensure governance is embedded by design Work across industries in fast-paced, high-value engagements This Principal Data Engineer will bring: Extensive experience with ETL/… ELT pipelines and data transformation patterns Proficiency in AWS cloud services, particularly Redshift, Glue, Matillion, and S3 Strong command of data quality, data lineage, and metadata practices Fluency in database technologies (both relational and NoSQL) Experience with Linux environments and data visualisation tools (e.g. Tableau, QuickSight, Looker) Bonus points for: Familiarity with Hadoop, Spark, or MapReduce Exposure to data APIs …
to validate your technical skills. Automation Tools: Familiarity with automated testing and monitoring tools to ensure ongoing platform quality. Financial Services Knowledge: Experience in Salesforce implementations within Financial Services. Metadata & Tooling APIs: Experience using the Salesforce Metadata API and Tooling API for DevOps or analytics purposes. Inner-Source Development: Familiarity with best practices in concurrent development and branching strategies within an …
define requirements, refine solutions, and ensure successful handover to internal teams. Design and implement ETL/ELT pipelines for cloud data warehouse solutions. Build and maintain data dictionaries and metadata management systems. Analyse and cleanse data using a range of tools and techniques. Manage and process structured and semi-structured data formats such as JSON, XML, CSV, and Parquet …
ETL/ELT using Matillion or similar tools. • Strong Python and SQL skills, with a good understanding of data formats • Demonstrated experience in designing data lakes, data pipelines, and metadata management. • Solid understanding of CI/CD pipelines, infrastructure automation, and agile delivery. • Background in consulting, including stakeholder engagement and client-facing delivery. • Proven success in responding to RFI/…
a related field Minimum of 10 years in a commercially led sales consulting role Strong understanding of data governance frameworks, best practices and trends Experience with data quality management, metadata management, and data lineage Excellent communication and interpersonal skills Ability to work independently and as part of a team Knowledge of Varonis or BigID platforms Relevant certifications (e.g., CDMC …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
million-pound projects within consulting or enterprise-level engagements. • Strong stakeholder engagement at CxO or Director level. • Deep experience in cloud data lake architectures, ETL/ELT patterns, and metadata/data quality management. • Expertise in Matillion, Redshift, Glue, Lambda, DynamoDB, and data pipeline automation. • Familiarity with data visualisation platforms such as QuickSight, Tableau, or Looker. • Knowledge of CI/…
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Recruitment Revolution
build of high-performance/high-volume database architectures Highly proficient in T-SQL, including Stored Procedures, Views, Triggers, and UDFs Have in-depth knowledge of SQL Server internal architecture (metadata, indexes, statistics, etc.) Skilful in performance tuning (using all available tools/techniques), refactoring existing SQL, monitoring high-availability clusters and patching live systems Experienced in SQL CLRs, SSRS, and …