of these investments to ensure they deliver expected outcomes and benefits. Deliver a cohesive Group Data Model to ensure consistent master data, reference data, metadata, and optimized data pipelines across processes and systems. Collaborate with market analysts, vendors, and partners to identify technological opportunities, and provide thought leadership on Data …
data processes, managing compliance requirements, and conducting quality assurance tasks. Ability to troubleshoot and resolve data quality issues and ensure system integrity. Familiarity with metadata management, ensuring compliance with data governance frameworks, including HRPP/Privacy/HIPAA/SOP. CompTIA Security+ certification, demonstrating a commitment to security and compliance …
data governance artifacts and support audits and compliance requirements. Proficient in the ESRI suite of tools, geodatabase technologies, and web capabilities. Understanding of geospatial data metadata, standards, and implementation. ESRI Model Builder and/or Python/ArcPy experience preferred. Company EEO Statement Accessibility/Accommodation: If because of a medical …
validation and pipeline testing frameworks to ensure data reliability and accuracy. Design and execute test cases for ETL/ELT workflows, data transformations, and metadata-driven data pipelines. Validate platform engineering components, including infrastructure automation, CI/CD pipelines, and cloud-based data services. Collaborate closely with data …
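The test-case responsibility above can be made concrete with a minimal sketch. All names here (the transform, its fields) are illustrative assumptions, not taken from the listing; the point is the shape of a unit test around an ETL transformation.

```python
# Minimal sketch of a data-validation test for an ETL transformation.
# The transform and field names are hypothetical examples.

def normalize_record(raw: dict) -> dict:
    """Hypothetical ETL step: trim fields and standardize country codes."""
    return {
        "customer_id": raw["customer_id"].strip(),
        "country": raw["country"].strip().upper(),
    }

def test_normalize_record_trims_and_uppercases():
    raw = {"customer_id": " C-001 ", "country": " gb "}
    out = normalize_record(raw)
    # Assert the transform's contract, not its implementation details.
    assert out == {"customer_id": "C-001", "country": "GB"}

test_normalize_record_trims_and_uppercases()
print("ok")
```

In practice such tests would run under a framework like pytest against each ETL/ELT workflow rather than being invoked by hand.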
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
and high throughput. Execute practices such as continuous integration and test-driven development to enable the rapid delivery of working code. Design and build metadata-driven data pipelines using Python and SQL in accordance with guidelines set by the Data Architect. Ship medium-to-large features independently using industry-standard …
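A "metadata-driven" pipeline means table-level configuration decides what runs, rather than hard-coded per-table code. A hedged sketch, with invented table names and an in-memory stand-in for a real source:

```python
# Sketch of a metadata-driven pipeline: the metadata list, not the code,
# determines which tables are processed and how. Names are illustrative.

PIPELINE_METADATA = [
    {"table": "orders",    "key": "order_id",    "transform": "dedupe"},
    {"table": "customers", "key": "customer_id", "transform": "dedupe"},
]

def dedupe(rows, key):
    """Keep the last row seen for each key value."""
    return list({row[key]: row for row in rows}.values())

TRANSFORMS = {"dedupe": dedupe}

def run_pipeline(source: dict) -> dict:
    """Apply the configured transform to every table named in the metadata."""
    return {
        meta["table"]: TRANSFORMS[meta["transform"]](source[meta["table"]], meta["key"])
        for meta in PIPELINE_METADATA
    }

db = {
    "orders": [{"order_id": 1, "total": 10}, {"order_id": 1, "total": 12}],
    "customers": [{"customer_id": "a"}],
}
result = run_pipeline(db)
print(len(result["orders"]))  # 1 after dedupe
```

Adding a new table becomes a metadata entry rather than new pipeline code, which is the property this style of design is chosen for.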
analysis, and risk analysis. Java, Python, AWS, and the AWS tool suite for data management optimization. Knowledge of enterprise data strategy, data governance, data architecture, metadata management, and performance measurement techniques. Experience in full life-cycle software development projects. Desired Qualifications: Ability to effectively communicate technical information to non-technical audiences. …
Agile Delivery, AI and Innovation, Business Process Mapping, Change Management, Content Syndication, Data Collection and Analysis, Digital Shelf Optimization, Growth Mindset, IT Data Management, Metadata and Taxonomy Design, Platform Strategy, Product Development, Review and Reporting, Stakeholder Engagement, Verbal and Written Communication. Competencies: Cultivates Innovation, Customer Focus, Decision Quality, Drives Results …
team using CI/CD tools. Experience in Snowflake/AWS Hive and ability to write/analyze SQL queries. Industry experience in master data, metadata, data architecture, data governance, data quality, and data modeling. Preferred: Experience in Business Intelligence tools such as Power BI and Tableau in the pharma industry is a …
internal and external communities. Advise and support the design for the configuration of the EDGC, including but not limited to: data catalog, data inventory, metadata standards, data lineage, and impact analysis of complex data-to-business and data-to-technology linkages that expose risks that would otherwise be unknown (secondary …
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
dimensional models in the gold layer built for analytical use. Strong understanding and/or use of Unity Catalog alongside core Databricks functionality to drive metadata management. Strong understanding of cloud economics, including cost management strategies and optimising solutions for customer needs. Experience with infrastructure as code; proficiency using tools such …
in BI/Digital Reporting. Expert knowledge of BI solutions and their component tiers: Data Integration/ETL, Data Warehouses/Marts, Business/Metadata layers, Dashboards/Reports, etc. A diversity of skills and knowledge of federal financial and procurement management best practices, solutions, innovations, standards, and configurations across …
discussions. Bachelor's degree in Computer Science, Information Technology, or a related field. Preferred qualifications: Expertise in configuring/customizing Collibra, data governance frameworks, metadata management, and data quality. Proficiency in SQL, databases (e.g., Oracle, SQL Server), and familiarity with data integration tools. Ability to translate business requirements into technical …
creating data models and data flows in collaboration with the business and technical teams. Data Catalog: Assist in developing and maintaining data dictionaries and metadata repositories. Data Analysis: Analyze business data to identify opportunities for improvement. Stakeholder Communication: Serve as a point of contact between the data project team and …
time - ETL vs Spark run time, etc. Security controls around authentication, authorization, encryption, and certificates. Recommend migration tools and design the data migration strategy. Metadata and data catalogs. Excellent communication skills, preparing PowerPoint presentations and executive readouts. Key Experience: Successfully built and defended solution architecture with different levels of technical stakeholders …
and Pandas. Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Experience supporting and working with cross-functional teams in a dynamic environment. Strong communication skills to collaborate with remote teams (US …
London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
and Pandas. Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Experience supporting and working with cross-functional teams in a dynamic environment. Strong communication skills to collaborate with remote teams (US …
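Dagster, Celery, and Airflow all model work as a dependency graph of tasks. That shared idea can be sketched with the standard library alone; the task names below are invented, and a real orchestrator would add scheduling, retries, and state on top of this ordering.

```python
# The core DAG idea behind workflow tools, sketched with stdlib graphlib.
# Each task maps to the set of tasks it depends on.
from graphlib import TopologicalSorter

tasks = {
    "extract": set(),           # no dependencies
    "transform": {"extract"},   # transform runs after extract
    "load": {"transform"},      # load runs after transform
}

# static_order() yields tasks in a dependency-respecting execution order.
order = list(TopologicalSorter(tasks).static_order())
print(order)  # ['extract', 'transform', 'load']
```

An orchestrator executes tasks in such an order (parallelizing where dependencies allow), which is what distinguishes these tools from plain cron jobs.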
on data. Improve the quality of data use and usability by driving an understanding of, and adherence to, the principles of data quality management, including metadata, lineage, and business definitions. Work collaboratively with Intelligence and Data Analysis teams to produce qualitative and quantitative data that support Intelligence products. Other duties as …
Newcastle-upon-Tyne, Tyne and Wear, North East England, United Kingdom
Oliver Bernard
quality monitoring metrics and data quality rule creation. Operation and incremental build-out of the data platform using in-house and vendor data tools. Metadata analysis and data catalogue curation to support data discoverability. Day-to-day data subject matter expert support for the front office across various data sets. …
Expertise in data architecture concepts such as dimensional modelling, data vault, and data mesh. Understanding of data management and governance concepts such as data quality, metadata management, etc. Experience in designing data engineering solutions using open-source and proprietary cloud data pipeline tools. Ability to implement data processing and transformation pipelines …
bio-pharmaceutical industry. Working knowledge of the drug discovery and development process. Familiarity with life science resources (e.g., public or commercial databases) and descriptive metadata for small molecule and biologics research (e.g., omics). Experience designing and developing taxonomies, reference data, or controlled vocabularies. Additional Information: Applicable only to applicants applying …
Knowledge of GCP Kubernetes would be an advantage. Knowledge of data warehousing concepts with a good understanding of dimensional models. Experience in implementing a metadata framework for data ingestion, data quality, and ETL. Good communication skills and ability to manage IT stakeholders.
SQL. • Ability to interpret complex data requirements and architect solutions. • Experience with SAP PowerDesigner or a similar data modeling tool. • Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required. • Experience with Microsoft Azure Synapse is a plus. • Excellent presentation, communication (oral …
and implement data management strategies, policies, and procedures to ensure data integrity, quality, and security. Design and optimize data governance frameworks, including data standards, metadata management, data classification, and data lineage. Assess existing data management processes and systems, identify areas for improvement, and develop strategies for data process optimization. Lead …