Hadoop, Spark) or cloud ETL tools (AWS Glue, Azure Data Factory, etc.). Knowledge of BI tools (e.g., Pentaho BA Server, Tableau, Power BI). Familiarity with data governance, metadata management, and data quality frameworks. Experience with Python or Java scripting for ETL customization. Bachelor's degree in Computer Science, Information Systems, Engineering, or related field. If you are …
optimize performance, scalability, and compliance. Key Skills & Experience: Proven experience (8+ years) as a Data Architect, Data Strategy Consultant, or similar role. Strong understanding of data governance, data management, metadata management, and data quality principles. Demonstrated experience delivering data strategies for M&A or corporate restructuring (merger/demerger) projects. Expertise in data discovery, data lineage, and data mapping activities.
of Work basis; we would ideally prefer candidates with experience of working under those arrangements. Essential skills: AWS Data Engineering, Snowflake, Databricks. Responsibilities: Develop and manage data models, schemas, and metadata to support analytics and reporting needs. Collaborate with Data Analysts, Scientists, and Business Analysts to ensure data availability and usability. Implement data quality checks, validation routines, and monitoring to ensure …
integration patterns, ETL/ELT processes, and data pipeline orchestration. Experience with AWS data platforms and their respective data services. Solid understanding of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies (e.g., Spark, Hadoop) and distributed computing. Proficiency in SQL and at least one programming language (e.g., Python, Java). 6 Month Contract …
in designing and delivering modern data solutions to support advanced analytics and AI-driven initiatives. Key Responsibilities & Experience: Proven expertise in data architecture, including data modelling, analysis, transformation, migration, metadata, and master data management. Track record of delivering data analytics and AI/ML-enabling solutions across complex environments. Hands-on experience with cloud data platforms, ideally AWS (S3 …
profiling skills, with experience in tools such as SQL, Excel, or Power BI. Familiarity with enterprise data platforms (e.g., Azure, AWS, Snowflake, or GCP). Understanding of data modelling, metadata, and data quality principles. Ability to interpret technical and business data requirements and translate them into actionable tasks. Strong attention to detail and organizational skills. Excellent communication and documentation abilities …
Wokingham, Berkshire, South East, United Kingdom Hybrid / WFH Options
Stackstudio Digital Ltd
engagement. Data Management & Quality Assurance: Establish data governance protocols for spatial data accuracy, completeness, and consistency. Implement ETL workflows using ArcGIS Data Interoperability tools or third-party solutions. Manage metadata standards and spatial data cataloging. Security & Access Control: Configure role-based access controls and user permissions. Ensure compliance with data privacy and security standards. Implement secure data sharing protocols with …
and governance frameworks that enable AI and ML systems. Partner with Data Scientists, ML Engineers, and Software Engineers to deliver AI. Lead data platform modernisation efforts. Define data standards, metadata, lineage, and quality frameworks. Collaborate with stakeholders to align data strategy. Data Architect, financial services, AI. Reasonable Adjustments: Respect and equality are core values to us. We are proud of …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
Technical components and Data Engineering related activities for supporting the key capabilities: Technical Data Architecture; Data Ingestion framework (batch/micro-batch processing); Data Contract & Data Management (including DQ, metadata & lineage); Delivery, Scaling & Op Model; Discovery/assessment along with the Data platform requirements. Identify and document the relevant data sources to be used for collection & ingestion, cataloging, defining Data …
data governance across Risk and Finance domains. The rate is £500 inside IR35 with 3 days in the office (London). Key Requirements: Strong experience in data governance, data quality, and metadata management. Solid understanding of BCBS 239 and regulatory reporting (e.g. LCR, COREP, PRA110). Exposure to Risk and Finance data domains. Hands-on experience with tools like Collibra, SQL, Python, or VBA …
using tools such as ERwin, PowerDesigner, or Sparx EA. Uphold data modelling standards and governance practices. Work closely with architecture and process improvement teams to ensure alignment. Contribute to metadata management, data lineage, and master/reference data initiatives. Support data quality and remediation efforts. Facilitate stakeholder workshops and build strong relationships with end users. Communicate effectively across technical and …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
data structures. Desirable: Experience working in a cloud-based or hybrid data environment. Background in regulated sectors or large-scale enterprise data integration. Familiarity with data governance frameworks and metadata management.
medium to another, where appropriate. Understands the relevant organisational policies and procedures and operates to ensure the data is presented effectively. Influences and maintains the data technology architecture, including metadata, integration, and business intelligence or data warehouse architecture. Has defined authority and accountability for actions and decisions within a significant area of work, including technical, financial, and quality aspects. Establishes …
workshops to boost data literacy and enable self-service DQ health checks. Lead with Impact: Contribute to the Data Quality Centre of Excellence (DQ CoE), shaping governance policies and metadata strategies. What You Bring: Proven experience designing and deploying enterprise-grade Data Quality Frameworks. Deep knowledge of data quality dimensions: accuracy, completeness, consistency, timeliness, uniqueness, and conformity. Hands-on experience …
Contract): Transition service ownership to internal teams across business units. Lead the rollout of data literacy initiatives, including training and certification. Embed supporting processes such as governance, triage, and metadata practices. Track and optimise adoption through KPIs and feedback loops. Sustain momentum via champion networks, office hours, and regular showcases. Skills & Competencies: Proven experience in delivering self-service analytics at …
process — from data intake and engineering through to manufacturing and delivery. What you’ll be doing: Reviewing the current SharePoint setup and documenting architecture and workflows. Improving site structure, metadata and data capture processes. Optimising lists and libraries for better performance and usability. Refining Power Platform workflows for efficiency and scalability. Adding validation rules to strengthen data integrity and traceability …
architects, analysts, and subject matter experts to ensure semantic consistency
* Validate mapping outputs and resolve data discrepancies
* Ensure compliance with data quality, privacy, and security standards
* Maintain documentation and metadata for traceability and governance
* Support knowledge sharing and contribute to best practices within the team
Essential Requirements:
* UK national with at least five years of residency (required for security clearance …
and procurement teams to identify duplicate parts and promote standardization. Interact with customer SMEs to ensure a good understanding of their application landscapes, requirements, and current challenges. Familiarity with capabilities around metadata management, classification systems, part taxonomy, Should Cost assessments, maintaining Product Score Cards, and workflow integrations that route similar parts to the right engineer for review and corrective action; streamlined lead …
impactful solutions, we want to hear from you. Key Responsibilities: Design and implement scalable, secure, and efficient data architectures. Define and enforce data governance standards, including data quality and metadata management. Collaborate with stakeholders to align data solutions with business objectives. Oversee data integration, ETL/ELT processes, and pipeline orchestration. Optimize data storage solutions across relational, NoSQL, and data … lakes). Hands-on experience with data integration patterns and ETL/ELT processes. Proficiency in AWS data platforms and services. Solid understanding of data governance principles (data quality, metadata, access control). Familiarity with big data technologies (Spark, Hadoop) and distributed computing. Advanced SQL skills and proficiency in at least one programming language (Python, Java). Additional Requirements: Immediate …
agreed ITSM tool (e.g., ServiceNow), maintaining clear documentation. Core VIM Modules and Technical Scope: OpenText VIM Core - Invoice processing, approvals, and exception workflows. Document Processing (DP) Indexing - Indexing invoice metadata and validation. Exception Handling Framework (EHF) - Management and resolution of standard and custom exceptions. VIM Analytics - Monitoring throughput, backlog, and exception trends. Business Rules Framework Plus (BRF+) - Rule-based invoice …