…e.g. broad understanding of Informatica, Collibra, Ab Initio, Snowflake, Databricks). Appreciation of, and interest in attaining, end-to-end data skills, e.g. data quality, metadata, data mesh, data security, privacy & compliance. Experience with enterprise/platform/application (e.g. cloud/SAP)/data architecture. Understanding of public and private …
…is essential. 3+ years of experience managing resources and delivering projects. Skills & Expertise: business data modelling & gap analysis; IT testing & database reporting; workflow management & metadata management; strong analytical and problem-solving skills; expertise in JD Edwards (advanced level). This is an excellent opportunity for a finance professional with systems expertise …
…stability. Version Control: Proficiency with GitHub for committing code, managing repositories, and maintaining a clean codebase. Data Management: Experience working with large datasets and metadata, preferably in a catalog or records management context. REQUIREMENTS: An ACTIVE and MAINTAINED TS/SCI with Polygraph; 5-7 years of experience in software …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Anson McCade
…ensure successful handover to internal teams. Design and implement ETL/ELT pipelines for cloud data warehouse solutions. Build and maintain data dictionaries and metadata management systems. Analyse and cleanse data using a range of tools and techniques. Manage and process structured and semi-structured data formats such as …
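The data-dictionary and metadata-management duties mentioned above can be sketched minimally. This is an illustrative sketch only, not any employer's actual system; the table and column names are hypothetical:

```python
# Minimal data-dictionary builder: derive per-column metadata from a dataset.
# All table and column names here are hypothetical illustrations.
import csv
import io

def build_data_dictionary(table_name, rows):
    """Return one metadata record per column: name, inferred type, null count."""
    dictionary = []
    columns = rows[0].keys() if rows else []
    for col in columns:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v not in ("", None)]
        inferred = "integer" if all(v.lstrip("-").isdigit() for v in non_null) else "string"
        dictionary.append({
            "table": table_name,
            "column": col,
            "inferred_type": inferred,
            "null_count": len(values) - len(non_null),
        })
    return dictionary

# Tiny sample dataset standing in for a real extract.
sample = list(csv.DictReader(io.StringIO("id,name\n1,alice\n2,\n")))
for entry in build_data_dictionary("customers", sample):
    print(entry)
```

In practice such records would be loaded into a catalog tool rather than printed, but the shape of the work — profiling columns and recording the results as metadata — is the same.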
…e.g., Lustre, IBM Spectrum Scale, NFS, GPFS) and storage solutions used in HPC environments to ensure efficient performance, scalability, and reliability. Implement and manage metadata-driven systems for data labeling/tagging. This includes the development of strategies for classifying, indexing, and organizing datasets to enhance data discoverability, access control …
…or similar tools. • Strong Python and SQL skills, with a good understanding of data formats. • Demonstrated experience in designing data lakes, data pipelines, and metadata management. • Solid understanding of CI/CD pipelines, infrastructure automation, and agile delivery. • Background in consulting, including stakeholder engagement and client-facing delivery. • Proven success …
…specific business questions and identify opportunities for improvement. Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. A successful history of manipulating, processing, and extracting value from large disconnected datasets. Working knowledge of message queuing, stream processing, and …
…and provide software solution recommendations. Demonstrated experience in handling relational databases, and in methodologies & processes to manage large-scale databases and datasets. Understanding of addressing and metadata standards. Experience with other data tools such as Power BI or Tableau. Advanced level of English is a must.
…enhancing operational efficiency and reliability. Data Governance and Quality: Embed governance, security, and quality practices. Define access control, data lineage, and compliance standards. Promote metadata management and enforce standards to ensure data trustworthiness. Stakeholder Engagement: Collaborate with leadership and product owners to align data priorities and maximize value from data …
…governance, security, and data quality practices into engineering workflows. Define guardrails and reference implementations for data access control, data lineage, and compliance. Promote consistent metadata management and enforce technical standards to ensure trust in data assets. Stakeholder Engagement: Collaborate with PN D&A leadership, PN product owners, and segment D …
…sources. Implement data structures using best practices in data modeling, ETL/ELT processes, SQL, AWS Redshift, and OLAP technologies. Model data and metadata for ad hoc and pre-built reporting. Work with product tech teams to build robust and scalable data integration (ETL) pipelines using SQL, Python and …
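As a rough illustration of the SQL-and-Python ETL work described above, here is a minimal extract-transform-load step. It uses sqlite3 purely as a stand-in for a warehouse such as Redshift, and the table names are hypothetical:

```python
# Minimal ETL sketch: load raw rows, transform with SQL, read back results.
# sqlite3 stands in for a real warehouse (e.g. Redshift); tables are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract/load: land raw data in a staging table.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_pence INTEGER)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 1250), (2, 300), (3, None)])

# Transform: drop incomplete rows and convert pence to pounds.
conn.execute("""
    CREATE TABLE orders AS
    SELECT id, amount_pence / 100.0 AS amount_gbp
    FROM raw_orders
    WHERE amount_pence IS NOT NULL
""")

rows = conn.execute("SELECT id, amount_gbp FROM orders ORDER BY id").fetchall()
print(rows)  # [(1, 12.5), (2, 3.0)]
```

A production pipeline would add incremental loads, error handling, and orchestration, but the extract-transform-load shape is the same.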
…to create and read the associated diagrams. Experience in creating enterprise models incorporating master and reference data. Experience in creating data dictionaries and defining metadata management approaches and governance processes. Experience in developing and implementing IT architecture plans, enterprise information architecture standards and guidelines, software development methodologies, and strategic plans.
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Michael Page
…will be responsible for designing and managing GPA's enterprise data models to support the design and deployment of business systems: design data models and metadata systems; help Chief Data Architects to interpret an organisation's needs; provide oversight and advice to other data architects who are designing and producing data …
…management of extremely large datasets. From Day 1, you will be challenged with a variety of tasks, ranging from creating datasets, reports, and dashboards to metadata modeling and pipeline monitoring. You will interact with internal program and product owners and technical teams to gather requirements, structure scalable and performant data solutions, and …
…GenAI may solve business challenges; chain-of-thought reasoning to build up LLM-based solutions; Retrieval-Augmented Generation (RAG), including optimisation of chunking and metadata; follow Responsible AI principles; rapidly develop customer demonstrations to show the art of the possible with GenAI; contribute to Ascent's Gen AI methodology and …
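The chunking-and-metadata optimisation mentioned for RAG can be sketched as a simple preprocessing step. This is a generic illustration, not any particular product's pipeline; the chunk sizes and document id are hypothetical:

```python
# Hedged sketch of RAG preprocessing: fixed-size chunking with overlap,
# attaching per-chunk metadata for later retrieval and citation.
# chunk_size/overlap values are illustrative tuning knobs.
def chunk_with_metadata(text, source_id, chunk_size=100, overlap=20):
    chunks = []
    step = chunk_size - overlap
    for i, start in enumerate(range(0, len(text), step)):
        piece = text[start:start + chunk_size]
        if not piece:
            break
        chunks.append({
            "text": piece,
            "metadata": {"source": source_id, "chunk_index": i,
                         "char_start": start, "char_end": start + len(piece)},
        })
    return chunks

doc = "x" * 250  # stand-in for real document text
pieces = chunk_with_metadata(doc, source_id="doc-001")
print(len(pieces))  # 4 overlapping chunks
```

Tuning chunk size and overlap, and enriching the metadata (titles, sections, timestamps), is precisely the kind of optimisation the role alludes to, since both directly affect retrieval quality.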
…or other mechanisms that guide analytic efforts. Read, extract, and plot coordinates using multiple maps and chart scales, projections, and datums. Provide and create metadata to support structured observations in support of the ODNI's OBP initiatives. Provide imagery analysis services on orders of battle, battle damage assessment, and other …
…the financial community. Our News Platform aggregates content from Reuters and thousands of other news providers, augmenting our content with a rich set of metadata and analytics to enable our customers to find the news most relevant to their needs. The news we deliver informs investors, moves markets, and enables companies …
…platforms. Maintain and optimize systems like the Army Intelligence Data Platform and Distributed Common Ground System-Army (DCGS-A) for real-time access. Implement metadata tagging, indexing, and search optimization for data discoverability. Collaboration and Workflow Optimization: Facilitate collaboration among analysts, planners, and commanders using tools like SharePoint, Microsoft Teams …
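The metadata tagging and indexing for discoverability described above boils down to mapping tags to the records that carry them. A minimal sketch, with entirely hypothetical document ids and tags:

```python
# Minimal inverted index over document tags: one simple way to make tagged
# data discoverable. Document ids and tags are hypothetical illustrations.
from collections import defaultdict

def build_tag_index(docs):
    """Map each (case-normalised) tag to the set of document ids carrying it."""
    index = defaultdict(set)
    for doc_id, tags in docs.items():
        for tag in tags:
            index[tag.lower()].add(doc_id)
    return index

docs = {
    "report-001": ["SIGINT", "2024", "released"],
    "report-002": ["GEOINT", "2024"],
}
index = build_tag_index(docs)
print(sorted(index["2024"]))  # ['report-001', 'report-002']
```

Real systems delegate this to a search engine, but the underlying structure — an inverted index from tag to documents — is what makes tag-based search fast.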
…relationships with IC/DoD partners on a daily basis, ensuring effective and efficient progress on projects. • Conduct operational research from various sources/metadata to identify, develop, and find mission-related targets. • Develop tailored access and exploitation strategies against targets of interest. • Create weaponeering solutions with known capabilities (existing …
…and traceability. AI & Advanced Analytics Integration: Collaborate with AI/ML teams to enable model training pipelines with robust and reliable data access. Leverage metadata and structured data modeling to support AI model explainability and audit trails. Guide engineering teams on best practices for cloud-based data handling, Terraform, and …
…between (SFDC) Sandboxes and Production Orgs using Change Sets. Experience with Release Management, Source Control, and Deployment concepts, and technologies such as ANT, the SFDC Metadata API, GitHub, and Azure DevOps CI/CD Pipelines. Ability to lead design sessions and communicate out-of-the-box and custom design options to Government customers …
…to gather use-case requirements, advise on how to unlock more value, and prioritise and translate them into technical data requests. Identify user data, content metadata, and campaign data to ingest into the CDP, and how this data delivers audience targeting, campaign optimisation, and post-campaign measurement. Maintain and build new …