Ensure compliance with EA frameworks (e.g., TOGAF, DoDAF) and data governance policies. • Participate in architecture reviews, assessments, and documentation of current and target state architectures. • Assist with data integration, metadata management, and system interoperability initiatives. • Develop visualizations and reports to communicate architecture concepts to technical and non-technical audiences. Required Qualifications: • Bachelor's degree in Computer Science, Information Systems, Engineering … on experience with EA modeling tools (e.g., Sparx EA, System Architect, MEGA). • Knowledge of architecture frameworks (e.g., TOGAF, DoDAF, FEAF). • Proficiency in data modeling techniques, standards, and metadata management. • Strong analytical, communication, and documentation skills. Preferred Qualifications: • TOGAF or similar Enterprise Architecture certification. • Experience in federal or large enterprise environments. • Familiarity with cloud architectures, data lakes, or integration More ❯
Monitor industry trends, competitor performance, and audience preferences to support media product innovation Workflow Optimization: Identify and propose improvements to workflows that handle rights data, ensuring accurate and timely metadata entry, validation, and reporting Data Analysis: Analyze large data sets related to content usage and rights to identify trends, gaps, and opportunities for automation or efficiency Stakeholder Collaboration: Bridge communication … into specific enterprise systems like financial platforms and products, royalties, and contract systems for reporting and compliance Ensure seamless integration between MAM/DAM systems, scheduling tools, AI models, metadata pipelines, and streaming infrastructure Familiarity with media platforms (YouTube, TikTok, streaming services) and digital advertising metrics Proficiency with Agile tools like JIRA, Confluence, Figma, Airtable, and Lucidchart Understanding of cloud … infrastructure, media file formats, and metadata standards is a plus Change management and transformation experience Preferred Qualifications Bachelor's degree in media, Computer Science, Business, or a related field. Advanced degrees or certifications in AI or media product development are a bonus Background in media law, intellectual property, or digital asset management (DAM) is a plus Certification in Product Management More ❯
is onsite at our offices in Seattle, WA. Key Responsibilities Design and implement backend services (microservices, REST/GraphQL APIs, event-driven systems) to power core features: content ingestion, metadata, user profiles, personalization, video metadata, recommendations, etc. Build scalable, highly available systems that handle high throughput and low latency requirements (e.g. for live events, real-time analytics). Work with … media/video teams to integrate backend systems for streaming, transcoding, content delivery (CDN), metadata synchronization, DRM, ad insertion. Define data models, database schemas, and caching strategies to support queries at scale. Instrument services with monitoring, metrics, tracing, alerting, and health checks; own reliability and observability for your services. Collaborate with mobile/front-end/hybrid app engineers to … skills Excellent collaboration skills - you'll be working across domains (mobile, video, data) in a small, cross-functional team Preferred Qualifications Experience with media streaming back ends (e.g. video metadata, packaging, CDN integration, manifest generation, DRM, ad insertion) Familiarity with cloud infrastructure (AWS, GCP, Azure), serverless patterns, microservices and containerization (Docker, Kubernetes) Experience with large-scale data pipelines, ETL, analytics More ❯
As a Data Architect, you will: Design, support and guide the upgrade, management, and archiving of data in line with data policies. Define and maintain data technology architecture, covering metadata, integration, BI, and data warehousing. Support and contribute to data dictionaries and metadata repositories. Drive alignment between data design and business needs, ensuring compliance with governance standards. Essential Skills & Experience … engineer data models Align data architecture to business problems and enterprise-wide standards 🔸 Data Governance & Standards Ensure compliance with data policies Develop, assess and enforce data standards 🔸 Data Integration & Metadata Management Perform impact analysis for data and system integration Maintain accurate metadata repositories 🔸 Data Analysis & Communication Perform data profiling and analysis of source systems Present insights clearly to technical and More ❯
We are seeking a Dataflow Engineer to architect, implement, and manage data movement and transformation pipelines across enterprise-level environments. This role requires a deep understanding of data modeling, metadata management, and data governance principles within a secure and compliant environment. The successful candidate will ensure data flows adhere to Data Management Requirements (DMRs), utilize Enterprise Data Headers, and enforce … ingestion, transformation, and storage. • Ensure adherence to Data Management Requirements (DMRs) across all data engineering processes, supporting auditability and governance. • Implement and integrate Enterprise Data Headers (EDH) for enhanced metadata tagging, traceability, and cross-system interoperability. • Apply and maintain Attribute-Based Access Control (ABAC) mechanisms to enforce data protection and dissemination rules based on user roles, attributes, and mission need. … Computer Science, Data Science, Information Systems, or a related field. • Expertise in data modeling, relational and non-relational databases (SQL, NoSQL), and schema design. • Demonstrated experience with DMRs and metadata-driven data governance processes. • Hands-on knowledge of Enterprise Data Headers (EDH) implementation and tagging mechanisms. • Proven understanding and implementation of ABAC using policy enforcement points and decision engines (e.g. More ❯
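The Attribute-Based Access Control (ABAC) enforcement described in the listing above can be illustrated in a few lines. This is a purely hypothetical sketch — the attribute names, clearance levels, and policy rule are invented for the example and are not drawn from the listing or any real system:

```python
# Minimal ABAC sketch: a policy is a predicate over subject and
# resource attributes. All attribute names here are hypothetical.

def can_disseminate(subject: dict, resource: dict) -> bool:
    """Return True if the subject's attributes satisfy the resource's
    dissemination rules (clearance level plus need-to-know markings)."""
    levels = ["UNCLASSIFIED", "SECRET", "TOP SECRET"]
    has_clearance = (levels.index(subject["clearance"])
                     >= levels.index(resource["classification"]))
    # Every control marking on the resource must appear among the
    # subject's authorizations (need-to-know).
    has_need_to_know = set(resource["markings"]) <= set(subject["authorizations"])
    return has_clearance and has_need_to_know

analyst = {"clearance": "SECRET", "authorizations": {"PROJ-A"}}
record = {"classification": "SECRET", "markings": {"PROJ-A"}}
print(can_disseminate(analyst, record))  # True for this subject/resource pair
```

In a production deployment this predicate would live behind a policy decision point (e.g., an OPA or XACML engine, as the listing hints), with the calling service acting as the policy enforcement point.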
provide guidance for the upgrade, management, decommission and archive of data in compliance with data policy Provide input into data dictionaries Define and maintain the data technology architecture, including metadata, integration and business intelligence or data warehouse architecture Skills at Working Level Communicate across disciplines: Effectively engage with both technical and non-technical stakeholders, manage team dynamics, and represent the … compliance measures. Model data: Explain data modelling principles, create and maintain models, and reverse-engineer from live systems. Implement data standards: Develop standards, assess compliance, and analyze breaches. Manage metadata: Use repositories for complex tasks and maintain accurate metadata. Solve problems: Investigate issues, consult experts, implement remedies, and suggest preventive actions. Design data architecture: Translate business problems into data designs More ❯
insights from structured and unstructured data, supporting decision-making across DoD and corporate domains. • Develop and integrate AI/ML solutions to enhance records management practices, including classification review, metadata tagging, digitization workflows, and compliance with DoD and Federal records requirements. • Ensure AI/ML solutions align with and implement relevant DoD policies, rules, and regulations (e.g., RMF, DoDI 5015.02 … development and data analytics. • Proficiency in Python and AI/ML frameworks (TensorFlow, PyTorch, scikit-learn, Hugging Face, etc.). • Familiarity with DoD and Federal records management policies, including metadata standards. • Proven experience developing AI solutions in compliance with DoD rules, regulations, and security frameworks. • Strong communication skills with the ability to translate technical outputs into actionable insights for senior More ❯
level conceptual and logical models that facilitate a cross-system/cross functional view of data requirements and establishing processes for governing the identification, collection, and use of corporate metadata and takes steps to assure metadata accuracy, validity, and quality. The Data Architect (Senior) will design and build relational and non-relational databases and translate business needs into long-term … architecture solutions. The Data Architect (Senior) reviews object and data models and the metadata repository to structure the data for better management and quicker access. Responsibilities: Designs and develops solutions to complex data sets. Leads the development of enterprise data architectures, strategies, and solution recommendations in support of business strategies. Leads design of high-level conceptual and logical models that … facilitate a cross-system/cross functional view of data requirements. Establishes processes for governing the identification, collection, and use of corporate metadata and takes steps to assure metadata accuracy, validity, and quality. Provides leadership to system and local work groups. Designs and builds relational and non-relational databases. Translates business needs into long-term architecture solutions. Reviews object and More ❯
Arlington, Virginia, United States Hybrid / WFH Options
CGI
information assets. • Cross-functional collaboration: Partner with data engineers, data scientists, product managers, and subject matter experts to understand business needs and translate them into effective data classification schemes. • Metadata management: Establish and enforce metadata standards and tagging best practices to ensure consistent and accurate data application by content creators. • Governance: Define and lead taxonomy and ontology governance processes, including … ontology functionality into new and existing systems, including content management systems (CMS), digital asset management (DAM), and knowledge graphs. • Training and support: Educate and train staff on taxonomy principles, metadata standards, and best practices to promote a culture of effective data organization. • Maintenance and auditing: Conduct regular audits and analysis to identify inconsistencies and refine existing taxonomies and ontologies for More ❯
Python would be preferred. Extensive experience with Teradata or Hadoop as the database, using Ab Initio as the ETL tool for large-scale data integration. Good understanding of data warehouse and metadata management concepts and tools. Good knowledge of establishing data lineage in Ab Initio Metadata Hub. Ability to lead and manage a small team of Ab Initio developers, both onshore More ❯
solving Collaborate, test, and assist others in development and use of geospatial standards. Responsible for progressing and maintaining official description of enterprise datasets Support the capture and management of metadata and information sources from across the enterprise, including data sets, business intelligence reports, visualizations, and conversations. Work with GEOINT standards Data Modeling languages to support the life cycle management of … evolving data standards supporting AGENCY and NSG initiatives and acquisition programs. Applying naming standards, and metadata standards, checks models in and out of Model Repository, and documents data model translation decisions Support updating and managing MySQL, MS-ACCESS structures (Table, Columns) and content Experience in the use of existing OGC standards and industry best practices related visualization, features, spatiotemporal data … metadata catalogs, and coordinate reference systems. Working knowledge of the following software language, GitLab or GitHub, HTTP REST API's, XML and JSON. Working knowledge of a logical query expression such as SQL, XML or JSON filters Skills and Experience: Required Skills and Tasks: Active TS/SCI clearance Experienced technical writer Working knowledge using MySQL or other relational databases More ❯
adapting the ETL process to handle new data types, and transferring data between systems or networks. Experience in organizing and cataloging datasets, suggesting a need for data governance and metadata management skills. Ability to communicate effectively with diverse stakeholders, both technical and non-technical, across different seniority levels. Desired Skills & Certifications: Experience with Hadoop technologies and integrating them into ETL … Extract, Transform, Load) data pipelines. Familiarity with Apache Tika for metadata and text extraction from various document types. Experience as a Data Layer Architect, focusing on fusing hybrid data sources into a common model. A Master's degree or equivalent experience in Computer Science is preferred Proven ability to collaborate effectively within a team and work independently to deliver results. More ❯
Overview Please note that this position is contingent upon the successful award of a contract currently under bid. Goldbelt Nighthawk offers sound solutions in software development and both defensive and proactive cybersecurity. Nighthawk offers an integrated, holistic cybersecurity workforce that More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
Senior Data Engineer (SQL BigQuery GCP) London/WFH to £110,000 Are you a data technologist with Media Streaming experience? You could be progressing your career in a senior, hands-on role at one of Europe's most successful More ❯
to meet requirements. Essential Functions: • Key understanding of Data Lineage/Cataloging/Reconciliation to be the Data Management SME. • Data storage approaches (database, object, blockchain, etc.) • Data cataloging (metadata management, discovery, linage, governance) • Data validation (apply field level policies to form the latest data object as new data is received) • Data reconciliation (apply field level policies to form the … approaches, including relational databases (e.g., SQL), object storage (e.g., AWS S3), and blockchain technology. • Practical experience in implementing and managing different types of data storage solutions. • Expertise in managing metadata, including creation, maintenance, and usage. • Ability to track and document the flow of data from source to destination. • Knowledge of data governance practices and frameworks to ensure data quality and More ❯
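The "field-level policies" the listing above mentions for data validation can be sketched as a table of per-field predicates applied to each incoming record. The field names and rules below are invented for illustration and are not the contractor's actual implementation:

```python
# Sketch of field-level validation: each field carries a policy
# (a predicate); a record passes only if every policy holds.
# Field names and rules are hypothetical examples.

FIELD_POLICIES = {
    "id":        lambda v: isinstance(v, str) and len(v) > 0,
    "timestamp": lambda v: isinstance(v, int) and v >= 0,
    "source":    lambda v: v in {"feed-a", "feed-b"},
}

def validate(record: dict) -> list:
    """Return the names of fields that are missing or violate their policy."""
    return [f for f, ok in FIELD_POLICIES.items()
            if f not in record or not ok(record[f])]

good = {"id": "r-1", "timestamp": 1700000000, "source": "feed-a"}
bad = {"id": "", "timestamp": -5, "source": "feed-z"}
print(validate(good))  # []
print(validate(bad))   # ['id', 'timestamp', 'source']
```

Reconciliation would apply the same policy table when merging a newly received record into the latest stored data object, keeping only field values that pass their policies.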
vendors to ensure seamless operations and user satisfaction. Creates and maintains documentation, user guides, and knowledge base articles to support the Exchange environment. Experience with data modeling, data dictionaries, metadata management, and data governance processes. Knowledge of IT architecture principles, standards, and guidelines. Experience in leading and participating in information/data management projects. Identifies opportunities for improvement in the … Exchange environment and implements solutions to enhance performance and reliability. Required Qualifications Clearance: TS/SCI Working knowledge of SharePoint metadata, application data fields, Active Directory, Exchange, Databases, and other systems and services Understanding of both physical and virtual infrastructure hosting one or more Operating Systems: Windows, Linux, Solaris, UNIX, and/or AIX systems Must have a current DoD More ❯
and log benefits in the approved Ariba workflow. Data stewardship in Ariba (single source of truth): Champion data quality across sourcing projects, contracts and savings forms—owning taxonomy alignment, metadata completeness and timeliness (e.g., renewal dates, values, clauses). Use the latest upload templates and required fields; fix gaps rapidly to keep reporting dependable. Produce category MI from SpendViz and … pipeline covering new events and all renewals 6–12 months ahead. Data Quality excellence within Ariba: 100% of in-scope contracts loaded in a timely manner with complete, correct metadata; sourcing projects and savings forms kept current; reporting is "board-ready". Value & risk: Achieve agreed savings/avoidance targets to support overall Sourcing team savings targets; all material suppliers … Ideally, you have hands-on experience with Ariba Sourcing & Contracts (or equivalent S2P), or a commitment to learn and work within Ariba, with a clear data stewardship mindset—comfortable owning metadata, templates, and reporting to drive decisions. Solid understanding of supplier risk workflows and partnering with Legal, InfoSec, Privacy and BCM. Strategic and analytical thinker who converts insight into pragmatic commercial More ❯
MuleSoft), platform events, and change data capture, while adhering to Salesforce integration best practices. Create and maintain secure, scalable data models, applying Salesforce data modeling principles-object relationships, custom metadata, external objects, and field-level security. Establish governance around security architecture, including SSO, OAuth flows, API security, sharing & visibility models, and platform encryption. Provide hands-on development expertise, writing efficient … depth knowledge of security architecture, including OAuth flows, SSO, role hierarchy, sharing rules, FLS, and Shield Platform Encryption. Platform Tools & DevOps Familiarity with tools like Salesforce DX, Unlocked Packages, Metadata API, and CI/CD pipelines using GitHub, Jenkins, or Copado. CRM Analytics/Einstein Analytics/Reports & Dashboards Experience in building advanced analytics, KPIs, and AI-powered insights within More ❯
Washington, Washington DC, United States Hybrid / WFH Options
TekSynap
align with and support organizational objectives. Create and execute strategies for data acquisition, storage, recovery, and overall database implementation. Work within data warehouse environments, focusing on data modeling, architecture, metadata, and repository development. Convert business requirements into scalable, long-term data architecture solutions. Define, design, and construct dimensional databases while producing detailed data warehousing blueprints. Assess the reusability of existing … data assets and identify opportunities for expanded analytical applications. Review and refine object models, data models, and metadata repositories to enhance data structures for efficient access and management. REQUIRED QUALIFICATIONS Minimum of five (5) years of relevant experience in data architecture, design, and implementation. Experience working with modeling tools such as ER/Win, Power Designer, ERStudio, or similar. Relevant More ❯
an executive and enterprise audience. Monitors and reports on usage metrics of the data catalog and information sharing requests to inform improvements Identifies gaps in data documentation and catalog metadata quality and proposes actions to close them Contributes to knowledge management deliverables for enterprise data projects Creates and maintains repositories for solutions and best practices to share with the client … absence of years of experience, certifications or past work may be used to show the level of experience needed to perform at this level. Familiarity with data catalogs and metadata management, proficiency with knowledge management tools (Confluence, SharePoint, wikis, etc.), excellent communication, facilitation, and stakeholder engagement skills, strong organizational and documentation skills, and experience supporting cross-functional teams in a More ❯
media; execute parsing, cataloging, ingestion, and archival processes in accordance with SOPs and mission timelines. Track, validate, and report data delivery status from receipt through dissemination and archival, ensuring metadata completeness and policy compliance. Apply and enforce MDA-approved Standard Operating Procedures (SOPs), Work Instructions, and metadata standards; identify gaps and recommend updates. Verify data format and metadata compliance from … related field is desired. 5-10 years of experience in data management or test data operations; experience with MDA or DoD programs strongly preferred. Familiarity with mission test environments, metadata standards, and secure data handling. Experience in multi-domain data environments (classified/unclassified). Active DoD Secret clearance required; must maintain eligibility throughout employment. Must reside within 50 miles … to relocate. U.S. Citizenship is required. Knowledge/Skills Strong understanding of end-to-end data lifecycle management (collection, validation, ingestion, dissemination, archival). Familiarity with data governance principles, metadata standards, and retention policy enforcement. Hands-on experience with custom or mission-specific data systems; adaptability to non-COTS environments. Working knowledge of SQL or similar query/reporting tools More ❯
years; Manage data ingestion frameworks for Master Data Management (MDM), deep learning pipelines, and predictive analytics using tools such as Airflow, Step Functions, S3, and Redshift. Architect and manage metadata tracking systems integrated with event-driven data pipelines, leveraging Amazon DynamoDB, Amazon RDS, AWS Lambda, and EventBridge to capture real-time execution metadata and operational metrics. Implement and manage secure More ❯
secure, agentic AI framework. Some specific tasks will include: Design and implement model ingestion pipelines to onboard new models and datasets Automate extraction, transformation, and loading (ETL) of model metadata (hyperparameters, training datasets, evaluation metrics, licenses) into standardized registries for validation and governance Integrate metadata into unified data ontology structures to enable downstream discoverability and auditability Develop automated model evaluation … models (especially computer vision, geospatial analytics, or time-series) Proficiency in Python, PyTorch/TensorFlow, and ML lifecycle tools (MLflow, Kubeflow, Airflow) Experience building data/model ingestion pipelines, metadata registries, and automated scoring/benchmarking frameworks Familiarity with containerized deployment (Docker/Kubernetes), REST APIs, and distributed architectures (e.g., AWS EventBridge). Working knowledge of model governance, explainable AI More ❯
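The standardized registry entry described in the listing above — model metadata covering hyperparameters, training datasets, evaluation metrics, and license — can be sketched as a small schema. The field set below is an assumption chosen for illustration, not the program's actual registry format:

```python
# Sketch of a standardized model-metadata registry entry for
# validation and governance. The schema is a hypothetical example.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class ModelRecord:
    name: str
    version: str
    license: str
    hyperparameters: dict = field(default_factory=dict)
    training_datasets: list = field(default_factory=list)
    metrics: dict = field(default_factory=dict)

    def to_json(self) -> str:
        # Serialized form a registry or ontology loader could ingest.
        return json.dumps(asdict(self), sort_keys=True)

record = ModelRecord(
    name="segmenter", version="1.2.0", license="Apache-2.0",
    hyperparameters={"lr": 3e-4}, training_datasets=["scenes-v1"],
    metrics={"mIoU": 0.71},
)
print(record.to_json())
```

An ingestion pipeline would emit one such record per onboarded model, letting downstream governance checks validate licenses and benchmark scores against a single, uniform structure.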