SR2 | Socially Responsible Recruitment | Certified B Corporation™
across the ecosystem. Key Responsibilities: Define and document conceptual, logical, and physical data models for geospatial and land data domains using EA Sparx Establish data standards, attribute schemas, and metadata frameworks to drive consistency across a common spatial framework Design data management solutions that support both batch and streaming processes, with clear handling of current and historical data Conduct gap … structures Experience Required: Proven experience as a Data Architect within geospatial or land data environments (public sector preferred) Expert in EA Sparx for modelling and documentation Strong knowledge of metadata standards (ISO 19115, INSPIRE, GEMINI) and data governance frameworks Understanding of coordinate reference systems, topology, and geometry validation Experience conducting data quality, completeness, and consistency assessments Strong communication and collaboration More ❯
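As an illustration of the metadata frameworks this posting refers to, the sketch below shows a minimal discovery-metadata record for a land dataset. Field names loosely mirror common GEMINI/ISO 19115 elements and all values are hypothetical, not taken from the advert.

```python
# Minimal sketch of a discovery-metadata record for a geospatial/land dataset.
# Field names loosely mirror GEMINI / ISO 19115 elements; values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    title: str
    abstract: str
    lineage: str
    spatial_reference_system: str          # e.g. an EPSG code
    bounding_box: tuple                    # (west, south, east, north) in decimal degrees
    keywords: list = field(default_factory=list)
    responsible_organisation: str = ""

record = MetadataRecord(
    title="Registered land parcels (illustrative)",
    abstract="Polygon dataset of registered land parcels, updated monthly.",
    lineage="Derived from survey submissions; validated against topology rules.",
    spatial_reference_system="EPSG:27700",  # British National Grid
    bounding_box=(-7.6, 49.9, 1.8, 60.9),
    keywords=["land", "parcels", "INSPIRE", "GEMINI"],
    responsible_organisation="Example Land Registry",
)
print(record.title, record.spatial_reference_system)
```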
Technical Business Analyst - Derived Pricing London - Hybrid (3 days on-site) 6 Months UMBRELLA only Role overview We are currently seeking an ambitious individual to join our client's team as Technical Business Analyst for Derived Pricing, working together with colleagues More ❯
Job Role Architecture & Solution Design Define end-to-end MAM architecture, including ingest, storage, metadata, workflows, search and distribution. Design integration patterns between MAM and DAM/CMS, OTT platforms, storage, CDN, and analytics systems. Define live video workflows from contribution → encoding → packaging → CDN → playback. Define the end-to-end search architecture across the solution. Translate business requirements into technical … blueprints and implementation plans. Drive performance, scalability, and security improvements for deployments. Development & Implementation Build custom workflows, plugins, and APIs. Develop automation solutions for ingest, transcoding, metadata enrichment, QC, and archiving. Develop and maintain live video pipelines: ingest → encoding → packaging → delivery. Build and consume APIs for live media services (AWS MediaLive/MediaConnect, Azure Media Services, Wowza, etc.). Implement and … search engines (Elasticsearch, Solr, OpenSearch, or vendor-native). Implement API-driven integrations with third-party systems (e.g., Adobe, Avid, broadcast systems, DAMs, cloud storage). Configure user access, metadata schemas, and distribution workflows. Contribute to CI/CD pipelines, containerized deployments, and monitoring setup. Leadership & Collaboration Provide technical leadership to developers and operations teams. Partner with product managers, media More ❯
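To make the "metadata schemas plus search engine" part of this role concrete, here is a hedged sketch that indexes one media-asset metadata document into an Elasticsearch/OpenSearch cluster through the documented REST API and runs a keyword query. The endpoint URL, index name and asset schema are assumptions for illustration only.

```python
# Illustrative only: index a media-asset metadata document into an
# Elasticsearch/OpenSearch cluster via its REST API, then search it.
# The cluster URL, index name and schema are hypothetical.
import requests

ES_URL = "http://localhost:9200"          # assumed local cluster
asset = {
    "asset_id": "clip-0001",
    "title": "Evening bulletin - headline package",
    "media_type": "video",
    "codec": "h264",
    "duration_seconds": 95,
    "tags": ["news", "bulletin"],
    "workflow_state": "qc_passed",
}

# PUT the document under its asset_id so re-ingest updates rather than duplicates.
resp = requests.put(f"{ES_URL}/media-assets/_doc/{asset['asset_id']}", json=asset, timeout=10)
resp.raise_for_status()

# A simple keyword search over the indexed metadata.
query = {"query": {"match": {"tags": "news"}}}
hits = requests.post(f"{ES_URL}/media-assets/_search", json=query, timeout=10).json()
print(hits["hits"]["total"])
```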
and modular ETL components and frameworks. Conduct code reviews and enforce best practices in ETL development. Troubleshoot and resolve production issues related to Ab Initio jobs. Maintain and manage metadata using EME. Required Skills: 5+ years of Ab Initio development experience. Strong understanding of ETL concepts, data warehousing, and data modeling. Hands-on experience with Ab Initio GDE, Co>Operating … System, EME, Conduct>It, Continuous Flows, Express>It, and Metadata Hub. Proficiency in SQL, Unix/Linux shell scripting, and performance tuning. Familiarity with job schedulers like Control-M or similar. Experience working with RDBMS (e.g., Oracle, Teradata, DB2, PostgreSQL). Strong problem-solving and debugging skills. More ❯
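Ab Initio graphs are built visually in the GDE rather than written as code, so the snippet below is only a language-agnostic sketch of the "reusable, modular ETL components" idea from this posting. The step names, rules and data are invented; it is not Ab Initio code.

```python
# Generic sketch of a modular ETL pipeline (not Ab Initio code).
# Each step is a small reusable function; the runner composes them in order.
def extract(rows):
    # Stand-in for reading from a source table or flat file.
    return list(rows)

def transform(rows):
    # Example rule: normalise names and drop records missing the key.
    return [
        {**r, "name": r["name"].strip().upper()}
        for r in rows
        if r.get("customer_id") is not None
    ]

def load(rows):
    # Stand-in for a database bulk load.
    print(f"loaded {len(rows)} rows")

def run_pipeline(data, steps):
    for step in steps:
        data = step(data) or data      # keep previous data if a step returns nothing
    return data

rows = [{"customer_id": 1, "name": " alice "}, {"customer_id": None, "name": "bob"}]
run_pipeline(extract(rows), [transform, load])
```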
City of London, London, United Kingdom Hybrid/Remote Options
Crimson
and implementation of robust design guardrails, standards, and policies that balance both functional and non-functional requirements while managing relevant risks in system delivery. Demonstrate expertise in data models, metadata management, and data dictionaries. Possess in-depth knowledge of data systems and architectures, with an understanding of best practices for data management and maintenance. Utilize multiple data modelling and design … tools and methodologies. Stay abreast of advancements in digital information technology and their potential applications. Apply advanced analytics practices and methodologies. Proven experience in designing data models and metadata systems. Skilled at interpreting organizational needs and translating them into effective data solutions. Adept at providing oversight and expert guidance to data architects involved in the design and production of data More ❯
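The "data dictionaries" expertise mentioned above can be pictured with the minimal sketch below: a dictionary entry structure plus a completeness check that every modelled attribute has a definition and an owner. The entity, attribute and owner names are assumptions.

```python
# Minimal sketch of a data-dictionary entry and a completeness check.
# Structure and example values are assumed for illustration.
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    entity: str
    attribute: str
    definition: str
    data_type: str
    owner: str

entries = [
    DictionaryEntry("Customer", "customer_id", "Unique customer identifier", "INTEGER", "CRM team"),
    DictionaryEntry("Customer", "postcode", "", "VARCHAR(10)", "CRM team"),
]

# Flag entries that would fail a basic governance standard.
for e in entries:
    if not e.definition or not e.owner:
        print(f"Incomplete dictionary entry: {e.entity}.{e.attribute}")
```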
and Responsibilities Data & Retrieval Build robust ingestion pipelines for PDFs/Word/Excel/Audio/JSON and semi-structured sources. Design RAG systems: chunking strategies, document schemas, metadata, hybrid/dense retrieval, re-ranking, and grounding. Manage vector/keyword indexes (e.g., Azure AI Search, pgvector, Pinecone/Weaviate). Develop and deploy advanced NLP, information retrieval, and … that enhance Chambers and Partners’ research and product offerings, including document understanding, automatic summarisation, topic modelling, semantic search, entity recognition, and relationship extraction. Design and implement intelligent tagging and metadata enrichment frameworks to categorize and organize legal and market data, improving search, discoverability, and insight accuracy. LLM & Machine Learning Application Engineering Design, build, and maintain traditional ML and LLM models More ❯
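Two of the RAG building blocks named in this posting, chunking and retrieval, are sketched below. The chunker uses fixed-size overlapping character windows, and the retriever is a simple lexical overlap score standing in for the keyword side of hybrid retrieval; chunk sizes, the scoring rule and the sample document are all assumptions, and a real system would add dense embeddings and re-ranking.

```python
# Sketch of two RAG building blocks: overlapping chunking and a toy lexical retriever.
import re
from collections import Counter

def chunk(text, size=200, overlap=50):
    """Split text into overlapping character windows with simple positional metadata."""
    chunks = []
    step = size - overlap
    for i, start in enumerate(range(0, max(len(text) - overlap, 1), step)):
        chunks.append({"chunk_id": i, "start": start, "text": text[start:start + size]})
    return chunks

def score(query, chunk_text):
    """Bag-of-words overlap; stands in for the keyword leg of hybrid retrieval."""
    q = Counter(re.findall(r"\w+", query.lower()))
    d = Counter(re.findall(r"\w+", chunk_text.lower()))
    return sum(min(q[t], d[t]) for t in q)

document = ("Chambers ranks law firms and lawyers. "
            "Rankings are based on research interviews and submissions.")
chunks = chunk(document, size=60, overlap=20)
best = max(chunks, key=lambda c: score("how are rankings researched", c["text"]))
print(best["chunk_id"], best["text"])
```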
London, England, United Kingdom Hybrid/Remote Options
Client Server
Senior Data Engineer (SQL BigQuery GCP) London/WFH to £110,000 Are you a data technologist with Media Streaming experience? You could be progressing your career in a senior, hands-on role at one of Europe's most successful More ❯
Data Management Analyst Location: Remote Salary: £50,000 - £55,000 per annum Closing Date: 04 December 2025 At Stonewater, data isn’t just numbers on a screen – it’s the heartbeat of how we drive impact, improve lives, and achieve More ❯
impact, and data classification for all clients' Power Apps/Flows/Agents/Pages. Reassign ownership for orphaned Power Apps/Flows/Agents/Pages and apply metadata tags. Outcome: Complete ownership registry and metadata tagging for all active Power Apps/Flows/Copilots/Pages Policy Compliance Assessment Evaluate Power Apps/Flows/Agents/ More ❯
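The "ownership registry and metadata tagging" outcome above could start from an inventory export like the hedged sketch below. The column names, orphan rule and tags are assumptions for illustration; this is not the Power Platform admin API.

```python
# Sketch of building an ownership registry from a hypothetical inventory export
# of Power Apps/Flows. Columns, the orphan rule and tags are assumed.
import pandas as pd

inventory = pd.DataFrame([
    {"resource": "Expenses App", "type": "Power App", "owner": "a.smith", "owner_active": True},
    {"resource": "Leaver Flow", "type": "Flow", "owner": "j.doe", "owner_active": False},
])

# Flag resources whose owner has left so they can be reassigned.
inventory["status"] = inventory["owner_active"].map({True: "owned", False: "orphaned - reassign"})
inventory["tags"] = inventory["type"].apply(lambda t: ["governance-reviewed", t.lower().replace(" ", "-")])
print(inventory[["resource", "owner", "status", "tags"]])
```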
Contact: (share updated CV) Responsibilities Design and implement a federated Data Mesh architecture for Research Informatics Architect the end-to-end data mesh platform using AWS, Snowflake, Palantir, and metadata tools Define CDEs and data product contracts for domains (Biology, Chemistry, Omics) Establish federated governance and metadata lineage frameworks Partner with domain experts to design FAIR and reusable data products … Drive adoption among scientists through integrations with GraphPad, ELN, and other lab tools Required Skills Strong knowledge of AWS S3, Snowflake, Palantir Foundry, Databricks Proven experience in data architecture, metadata modeling, and data product design Familiarity with FAIR data principles, Life Sciences workflows, and ontologies Excellent stakeholder and communication skills across scientific and IT functions Other Details Seniority level: Mid More ❯
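The "data product contracts for domains" responsibility can be pictured with the sketch below: a contract declaring the product's critical data elements, schema and quality SLOs, plus a consumer-side check. The domain, field names and thresholds are hypothetical, not taken from the posting.

```python
# Hypothetical data product contract for a data mesh domain, plus a check that
# every declared critical data element (CDE) appears in the published schema.
contract = {
    "data_product": "assay_results",
    "domain": "Biology",
    "owner": "biology-data-stewards@example.com",
    "critical_data_elements": ["compound_id", "assay_id", "ic50_nm"],
    "schema": {
        "compound_id": "string",
        "assay_id": "string",
        "ic50_nm": "float",
        "measured_at": "timestamp",
    },
    "quality_slos": {"completeness_pct": 99.0, "freshness_hours": 24},
    "fair": {"findable_via": "catalog", "license": "internal-use"},
}

missing = [c for c in contract["critical_data_elements"] if c not in contract["schema"]]
assert not missing, f"Contract missing CDEs: {missing}"
print("contract ok:", contract["data_product"])
```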
Luton, Bedfordshire, United Kingdom Hybrid/Remote Options
Experis
pipelines using Python and pandas * Writing SQL queries to extract and manipulate relational data * Implementing data validation and quality assurance processes * Working with JSON, XML, and CSV formats * Supporting metadata cataloging and reference data management * Learning and applying RDF and semantic web concepts * Collaborating with subject matter experts on data requirements * Clean, validated datasets ready for semantic processing * Python scripts … for transformation and validation * SQL queries and database integration code * Data quality reports and documentation * Contributions to metadata catalogs and RDF processing * Pipeline documentation and ad hoc data analysis People Source Consulting Ltd is acting as an Employment Agency in relation to this vacancy. People Source specialise in technology recruitment across niche markets including Information Technology, Digital TV, Digital Marketing More ❯
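A minimal pandas validation step of the kind this posting describes is sketched below; the columns, rules and sample rows are assumptions for illustration, and a real pipeline would pull from the actual source systems.

```python
# Sketch of a pandas data-validation step producing clean rows and a quality report.
import json
import pandas as pd

df = pd.DataFrame([
    {"record_id": 1, "name": "Sensor A", "value": 12.5},
    {"record_id": 2, "name": None, "value": -3.0},
])

issues = []
if df["record_id"].duplicated().any():
    issues.append("duplicate record_id values")
issues += [f"row {i}: missing name" for i in df.index[df["name"].isna()]]
issues += [f"row {i}: negative value" for i in df.index[df["value"] < 0]]

# Keep only rows that pass both rules; report what was dropped and why.
clean = df.dropna(subset=["name"]).query("value >= 0")
report = {"rows_in": len(df), "rows_out": len(clean), "issues": issues}
print(json.dumps(report, indent=2))
```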
Birmingham, West Midlands, United Kingdom Hybrid/Remote Options
VANLOQ LIMITED
technology teams to embed sustainable, scalable, and compliant data practices. Key Responsibilities Define, implement, and maintain data governance policies, processes, and reporting structures. Facilitate governance initiatives, including metadata management, stewardship, and ownership models. Design and deliver reporting frameworks that track data quality, risk, and control metrics. Collaborate with client stakeholders to promote data literacy and embed governance … the selection, design, and rollout of governance, quality, and reporting tooling solutions. About You Strong track record in data governance and management within financial services. Deep understanding of metadata, lineage, stewardship, and data ownership frameworks. Experienced in developing and deploying governance and quality reporting frameworks. Excellent communicator capable of explaining complex governance concepts to senior stakeholders. More ❯
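A quality-metrics feed for the reporting frameworks mentioned above might look like the hedged sketch below: per-column completeness and validity rates with a simple threshold breach check. The dataset, rules and 95% threshold are assumptions.

```python
# Sketch of simple data-quality metrics for a governance report.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, None],
    "email": ["a@example.com", "not-an-email", None, "d@example.com"],
})

metrics = {
    "customer_id_completeness_pct": 100 * customers["customer_id"].notna().mean(),
    "email_completeness_pct": 100 * customers["email"].notna().mean(),
    "email_validity_pct": 100 * customers["email"].str.contains("@", na=False).mean(),
}
# Flag any metric below an assumed 95% control threshold.
breaches = {k: round(v, 1) for k, v in metrics.items() if v < 95}
print(metrics)
print("breaches:", breaches)
```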
Skills and Experience: Proven experience designing geospatial data architectures in large enterprise or public sector environments. Proficiency with EA Sparx for data modelling and architecture documentation. Deep knowledge of metadata and geospatial data standards (ISO 19115, INSPIRE and GEMINI). Understanding of coordinate reference systems, topology, geometry validation and data lineage. Experience conducting data quality and completeness assessments and implementing … governance model. Role and Responsibilities: Develop conceptual, logical, and physical data models for geospatial and land data using EA Sparx for design and documentation. Define data standards, schemas and metadata frameworks to ensure consistency and interoperability across the organisation. Design solutions that distinguish between live, validated and historical datasets while supporting both batch and streaming-based processing. Conduct gap analyses More ❯
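For the geometry-validation and coordinate-reference-system knowledge described above, here is a short illustrative check using shapely for validity/topology and pyproj for a CRS transform. The self-intersecting "bow-tie" polygon and the London coordinates are made-up examples.

```python
# Illustrative geometry validation and CRS transform (shapely + pyproj).
from shapely.geometry import Polygon
from shapely.validation import explain_validity
from pyproj import Transformer

bow_tie = Polygon([(0, 0), (1, 1), (1, 0), (0, 1)])   # self-intersecting ring
if not bow_tie.is_valid:
    print("invalid geometry:", explain_validity(bow_tie))

# Transform a point from WGS84 lon/lat to British National Grid (EPSG:27700).
to_bng = Transformer.from_crs("EPSG:4326", "EPSG:27700", always_xy=True)
easting, northing = to_bng.transform(-0.1276, 51.5072)   # approx. central London
print(round(easting), round(northing))
```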
and log benefits in the approved Ariba workflow. Data stewardship in Ariba (single source of truth): Champion data quality across sourcing projects, contracts and savings forms, owning taxonomy alignment, metadata completeness and timeliness (e.g., renewal dates, values, clauses). Use the latest upload templates and required fields; fix gaps rapidly to keep reporting dependable. Produce category MI from SpendViz and … pipeline covering new events and all renewals 6-12 months ahead. Data Quality excellence within Ariba: 100% of in-scope contracts loaded in a timely manner with complete, correct metadata; sourcing projects and savings forms kept current; reporting is "board ready". Value & risk: Achieve agreed savings/avoidance targets to support overall Sourcing team savings targets; all material suppliers … Ideally to have hands-on experience with Ariba Sourcing & Contracts (or equivalent S2P), or a commitment to learn and work within Ariba, with a clear data stewardship mindset, comfortable owning metadata, templates, and reporting to drive decisions. Solid understanding of supplier risk workflows and partnering with Legal, InfoSec, Privacy and BCM. Strategic and analytical thinker who converts insight into pragmatic commercial More ❯
is at the core of modern business, yet many teams struggle with its overwhelming volume and complexity. At Atlan, we're changing that. As the world's first active metadata platform, we help organisations transform data chaos into clarity and seamless collaboration. From Fortune 500 leaders to hyper-growth startups, from automotive innovators redefining mobility to healthcare organisations saving lives … data community. If you're a strategic thinker with deep expertise in data and a knack for delivering results, we want you on our team! Impact & Purpose Lead the Metadata Charge: Guide data teams toward better collaboration and governance in the Active Metadata Management revolution. Partner with Industry Titans: Become a trusted advisor to Atlan's customers, including Fortune … companies and innovative startups. Deliver Measurable Results: Directly impact customer ROI by helping them achieve data-driven success through effective metadata management. Shape the Future: Make your voice heard! Influence the evolution of Atlan's solutions with your insights and expertise. Your mission at Atlan Trusted Advisor: Uncover your customers' unique needs and propose strategic solutions to maximise value from More ❯
programme. Key Responsibilities Lead technical architecture and solution design for data products and open data services Provide design leadership across a multi-disciplinary team Apply expertise in data cataloguing, metadata management, and open/licensed data portals Leverage understanding of AI developments to shape and enhance data services Work with open source tools and cloud infrastructure (AWS, Terraform) Guide technical … of 7-8 and a wider programme of 50 Essential Skills & Experience Proven experience with data products and open/licensed data portals Knowledge of data cataloguing systems and metadata management Familiarity with CKAN or similar platforms Strong solution architecture and technical design capabilities Experience leading multi-disciplinary teams Cloud and infrastructure experience (AWS, Terraform) Programming skills in Python or More ❯
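Since this posting mentions CKAN-style open data portals, the sketch below queries a CKAN catalogue through its documented Action API (`package_search`). The portal URL and search term are placeholders, not details from the advert.

```python
# Hedged sketch: query a CKAN-based open data portal via its Action API.
import requests

PORTAL = "https://data.example.gov.uk"   # hypothetical CKAN instance
resp = requests.get(
    f"{PORTAL}/api/3/action/package_search",
    params={"q": "land use", "rows": 5},
    timeout=10,
)
resp.raise_for_status()
result = resp.json()["result"]

for dataset in result["results"]:
    # Each package carries catalogue metadata such as title, licence and resources.
    print(dataset["title"], "-", dataset.get("license_title"),
          "-", len(dataset.get("resources", [])), "resources")
```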