household names like 21 Savage, Ludacris, Arizona Zervas, Will Smith, Tom Waits, and more. DistroKid's expanding array of services includes music distribution, monetization, metadata customization, storage, and promotion. Its small staff, coupled with a largely automated backend, has enabled DistroKid to process over 10 million songs while simultaneously …
empower teams across the organization to make informed, data-driven decisions. This is a hands-on role that blends engineering excellence with product thinking, metadata management, and a passion for enabling self-service analytics and data literacy across the business.
Key Responsibilities:
• Build & Maintain Scalable Data Warehousing Solutions: Design, build … around versioning, SLAs, data contracts, and quality validation
• Own the end-to-end lifecycle of key datasets, including documentation, testing, monitoring, and maintenance
• Use metadata, lineage, and usage metrics to ensure data products are trustworthy, discoverable, and valuable
• Enable Data Democratization & Self-Service: Build intuitive, well-structured data models to …
… data processing and automation
• Experience working with data modelling tools and practices (e.g., dimensional, star/snowflake schema, dbt)
• Solid understanding of data governance, metadata, and quality frameworks
• Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment
• Exposure to data product management principles …
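The dimensional modelling this listing asks for (star/snowflake schemas) can be sketched minimally: one fact table joined to dimension tables for slicing. Table and column names below are invented for illustration, not taken from the posting.

```python
import sqlite3

# Minimal star schema: a fact table keyed to two dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_orders  (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    amount      REAL
);
INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
INSERT INTO dim_date     VALUES (10, 2024), (11, 2025);
INSERT INTO fact_orders  VALUES (100, 1, 10, 50.0), (101, 2, 11, 70.0);
""")

# Typical analytical query: aggregate facts sliced by dimension attributes.
rows = cur.execute("""
    SELECT c.region, d.year, SUM(f.amount)
    FROM fact_orders f
    JOIN dim_customer c ON f.customer_id = c.customer_id
    JOIN dim_date d     ON f.date_id = d.date_id
    GROUP BY c.region, d.year
    ORDER BY c.region
""").fetchall()
```

In a dbt project the same fact/dimension split would typically be expressed as separate models with tested join keys.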
tactical solutions to achieve long-term objectives and an overall data management roadmap. Establish processes for governing the identification, collection, and use of corporate metadata; take steps to ensure metadata accuracy and validity. Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement. Conduct data capacity planning, life …
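The quality-tracking duties above (completeness, redundancy) can be made concrete with simple metrics: completeness as the share of non-null values in a field, redundancy as the share of duplicate records. The field names and records are hypothetical.

```python
# Hypothetical sample records; one value missing, one exact duplicate row.
records = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "a@x.com"},
    {"id": 3, "email": "a@x.com"},  # exact duplicate
]

def completeness(rows, field):
    """Fraction of rows with a non-null value for `field`."""
    filled = sum(1 for r in rows if r[field] is not None)
    return filled / len(rows)

def redundancy(rows):
    """Fraction of rows that are exact duplicates of an earlier row."""
    unique = {tuple(sorted(r.items())) for r in rows}
    return 1 - len(unique) / len(rows)
```

Tracked over time, these two numbers give an objective trend line for the "methods and procedures" the listing describes.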
design an interconnected data model supporting analysis across multiple datasets.
• Translate business and product requirements into clear, maintainable data modelling artifacts.
• Define and document metadata standards, entity relationships, and model schemas to support semantic alignment and discovery.
• Create tools and processes to monitor and maintain metadata inventories.
• Communicate data modelling … requirements to stakeholders, and drive alignment across metadata/modelling functions to ensure practices are well understood and followed.
• Perform data profiling and root cause analysis to guide objective, data-driven modelling decisions.
• Promote FAIR data principles across the modelling lifecycle.
You'll need to have (please note we use years … consider applications from all candidates who can demonstrate the skills necessary for the role):
• 4+ years of experience working with data modelling, metadata design, or semantic data structures
• Proven ability to work with messy, heterogeneous data sources and convert them into harmonized, queryable formats
• Strong communication skills, with …
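"Monitor and maintain metadata inventories" can be sketched as an automated check that flags datasets whose inventory entries are missing required descriptive fields — a small step toward the FAIR (findable, accessible, interoperable, reusable) principles the listing names. The required-field list and inventory entries are invented for illustration.

```python
# Required descriptive fields every inventory entry should carry (assumed).
REQUIRED = {"owner", "description", "schema_version"}

inventory = {
    "orders":    {"owner": "data-team", "description": "Order facts",
                  "schema_version": "2"},
    "customers": {"owner": "data-team"},  # incomplete entry
}

def missing_metadata(inv):
    """Map each non-compliant dataset to its missing required fields."""
    return {name: sorted(REQUIRED - entry.keys())
            for name, entry in inv.items()
            if REQUIRED - entry.keys()}
```

Running such a check in CI keeps the inventory from drifting as new datasets are added.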
Familiarity with data compliance, security, and privacy standards in relation to DAM systems. Technical background with experience in digital content management, taxonomy standardisation, and metadata organisation. Ability to foster innovation by optimising content reuse, data collection, and reporting. A strong eye for design, with experience in graphic design and a …
We are currently building out a next-generation data platform to provide seamless access to the BBC's large archive of programme information, metadata, subtitles, transcripts, production data and restrictions. Until now, this metadata has been stored in a large number of separate, siloed systems, tightly coupled to … transform our platform and operations.
Main Responsibilities — together with the Lead Data Manager, you will:
• Design the Archive's core data model to support search and discovery, metadata management and data-driven decision making
• Design efficient data processes and work with architects, engineers and analysts to implement them
• Implement best practices in data …
Data Domain. In addition, the Data Lead will provide thought leadership for one or more of the Data Management Specialisms (Metadata Management, Data Quality, Data Access Management, etc.). The Data Lead will be part of the global Data Governance Team in the CAO office of … Capital Markets.
KEY RESPONSIBILITIES
Data Governance Program Delivery Stream: Lead one of the 9 delivery streams and bring priority datasets under governance by completing Metadata, Data Quality and Data Access & Sharing Management activities, working together with BAs, key business SMEs and Technology Leads. Metadata Management includes: defining the in …
… RDAR) and jurisdictional regulations (Dodd-Frank, EMIR, MiFID, CAT NMS, IIROC, etc.)
• Experience with data management principles and practices, including but not limited to: Metadata Management, Data Quality Assessment, Data Access Management and Data Analytics.
• Strong understanding of data management techniques and technology.
• Good understanding of data structures and data …
Delta Lake or Databricks workflows, jobs, etc.
• Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks
• Data Governance experience: Metadata, Data Quality, Lineage, Data Access Models
• Good understanding of Data Modelling concepts, Data Products and Data Domains
• Unity Catalog experience is a key differentiator; if not, then experience with a similar Catalog/Data Governance Management component
• MS Purview (Metadata and Data Quality tool) experience is a bonus; experience in similar tools is valuable (Collibra, Informatica Data Quality/MDM/Axon, etc.)
• Data Architecture experience is a bonus
• Python, Scala, Databricks Spark and PySpark …
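The lineage capability mentioned above (which catalog tools such as Unity Catalog or Purview provide out of the box) boils down to recording which datasets each table is derived from, then walking that graph. This is a toy pure-Python sketch, not any vendor's API; the dataset names are made up.

```python
from collections import defaultdict

# dataset -> set of direct upstream sources
lineage = defaultdict(set)

def register(dataset, sources):
    """Record that `dataset` is derived from `sources`."""
    lineage[dataset].update(sources)

def upstream(dataset):
    """All transitive sources feeding a dataset (for impact analysis)."""
    seen = set()
    stack = list(lineage[dataset])
    while stack:
        src = stack.pop()
        if src not in seen:
            seen.add(src)
            stack.extend(lineage[src])
    return seen

# Hypothetical medallion-style pipeline.
register("silver.orders", ["bronze.raw_orders"])
register("gold.revenue", ["silver.orders", "silver.fx_rates"])
```

A question like "what breaks if `bronze.raw_orders` changes?" is answered by inverting this walk.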
City of London, London, United Kingdom Hybrid / WFH Options
83zero Limited
more predictive and powerful ways. We are a full-stack shop consisting of product and portfolio leadership, data engineering, infrastructure and DevOps, data/metadata/knowledge platforms, and AI/ML and analysis platforms, all geared toward building a next-generation, metadata- and automation-driven data experience for GSK … required to deliver high-quality products and user experiences (e.g. git/GitHub, Docker, DevOps tools, metrics/monitoring, etc.). Strong knowledge of metadata management frameworks and data governance practices, with an emphasis on scalability and compliance in research environments. Enterprise exposure to data engineering tools and products (Spark …
bbchr@bbc.co.uk.
Job Introduction
We are building a next-generation data platform for seamless access to the BBC’s large archive of programme information, metadata, subtitles, transcripts, production data, and restrictions. This role offers an exciting opportunity to be part of transforming our platform and operations.
Main Responsibilities
• Design core data models to support search, discovery, metadata management, and data-driven decision-making.
• Design efficient data processes and collaborate with architects, engineers, and analysts to implement them.
• Implement best practices in data governance and management across the Archives.
• Lay foundations for further development of the Archive data ecosystem and new … for data
Desirable Skills
• Data governance experience
• Experience with other database types
• Ability to present data solutions to non-technical audiences
• Experience with media metadata
• Experience working in multidisciplinary teams and Agile environments
About The BBC
The BBC values diversity and is committed to equal opportunity employment. We prioritize redeployment …
instruments, etc.)
• Very experienced with data modelling, creating logical/conceptual data models, with a good understanding of how data models interact with metamodels/metadata for data platforms.
• Expertise in Python programming.
• Vast experience working with AWS solutions, e.g. Glue and S3.
• Solid grasp of data governance/data … management concepts, including metadata management, master data management and data quality.
• Ideally, experience with the Data Lakehouse toolset (Iceberg).
What you'll get in return:
• Hybrid working (4 days per month in London HQ + as and when required)
• Access to market-leading technologies
What you need to do now …
across various lines of business.
Responsibilities:
• Work with stakeholders to understand key data concepts, relationships, and information needs.
• Develop conceptual and logical models and metadata solutions using UML and MagicDraw.
• Document model applications using Specification-By-Example for various use cases.
• Design and maintain the information architecture blueprint, data integrations …
… experience is a plus.
• Deep understanding of enterprise data architecture and problem-solving.
• Hands-on modelling with MagicDraw (preferred), Erwin, or Enterprise Architect, including metadata management and forward-engineering.
• Knowledge of the financial industry, Capital Markets, or Banking is advantageous.
• Understanding of Data Management methodologies, architecture, storage, and security.
• Experience with …
enterprise-wide MDM initiatives.
• Develop end-to-end data lifecycle event flows, mapped to the logical model and physical systems.
• Document and maintain data dictionaries, metadata, and entity relationships.
• Translate requirements into optimal data structures.
• Define and enforce data governance policies and standards.
• Design data models for structured and unstructured data …
… based databases (Snowflake, Amazon Redshift, BigQuery, etc.).
• Experience working in financial services, capital markets, or consulting environments.
• Deep understanding of data governance, metadata management, and regulatory compliance.
• Proficiency with ETL frameworks, APIs, and real-time data integration.
Why Lab49? Lab49 is an established partner for most financial institutions …
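The data-dictionary maintenance mentioned above is often partly automatable: a first-draft dictionary can be inferred from sample records, one entry per column with its type and an example value. This is a hedged sketch with invented field names, not any particular firm's tooling.

```python
def data_dictionary(rows):
    """Infer a draft data dictionary from sample records.

    Returns {column: {"type": ..., "example": ...}} using the first
    non-null value seen for each column.
    """
    dictionary = {}
    for row in rows:
        for col, val in row.items():
            if col not in dictionary and val is not None:
                dictionary[col] = {"type": type(val).__name__, "example": val}
    return dictionary

# Hypothetical capital-markets sample record.
sample = [{"trade_id": 7, "notional": 1.5, "ccy": "USD"}]
```

A human then enriches the draft with business definitions and entity relationships, which cannot be inferred from data alone.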
code in various languages.
• Development Lifecycle: Understanding of SDLC, CI/CD pipelines, and version control.
• Data Governance & Security: Knowledge of data security, governance, metadata management, and master data principles.
Key Responsibilities: As a Senior Data Engineer, you'll bridge the gap between client needs and technical solutions, creating data …
… scalability, and monitoring.
• Data Security: Apply best practices for information security, including encryption and data anonymity for sensitive data assets.
• Data Governance & Quality: Manage metadata, data lineage, and data quality standards.
If you're passionate about using data engineering and AI to solve complex problems in the Defence and Security …
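One common form of the "data anonymity for sensitive data assets" practice named above is pseudonymization: replacing direct identifiers with a salted one-way hash so records remain joinable across datasets without exposing the identity. This is a simplified sketch; in practice the salt is a managed secret and regulatory requirements may demand stronger techniques.

```python
import hashlib

# ASSUMPTION: hard-coded salt for illustration only — in production this
# would come from a secrets manager, never from source code.
SALT = b"example-salt"

def pseudonymize(value: str) -> str:
    """Deterministic, non-reversible token for a sensitive identifier."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

record = {"email": "alice@example.com", "amount": 42}
safe = {**record, "email": pseudonymize(record["email"])}
```

Because the mapping is deterministic for a given salt, the same person hashes to the same token everywhere, preserving joins and aggregates.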
is optimised for modelling and analysis.
• Supporting data scientists with clean, structured datasets tailored for model training and validation.
• Overseeing data QA processes and metadata tracking for deployed machine learning models.
• Leading the creation and upkeep of data documentation, including data dictionaries and version-tracking protocols.
• Contributing to the team …
… in cloud-based environments, with exposure to tools such as AWS Redshift, Glue, S3, Athena, or Google BigQuery.
• Hands-on experience managing data versioning, metadata, and documentation for large-scale projects.
• A track record of collaborating with cross-functional teams, particularly in data science and product development contexts.
• Familiarity with …
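The dataset version tracking described above is often anchored on a deterministic content hash: it identifies exactly which data a model was trained on, so a retrained model can be tied to its inputs. A minimal sketch (toy records, stdlib only; real pipelines typically use tools such as DVC or Delta Lake table versions):

```python
import hashlib
import json

def dataset_version(rows):
    """Deterministic content hash of a dataset for version tracking."""
    payload = json.dumps(rows, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:12]

v1 = dataset_version([{"x": 1}, {"x": 2}])
v2 = dataset_version([{"x": 1}, {"x": 2}])  # same data -> same version
v3 = dataset_version([{"x": 1}, {"x": 3}])  # changed data -> new version
```

Logging this version alongside model metadata makes training runs reproducible and auditable.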