… into data integrations and model designs. Develop models on the Oracle EPBCS platform including prototype builds, testing and business integration. Drive development of model metadata, calculations/rules, forms, dashboards, and analytical solutions. Build data integrations to move data into, out of, and between models. Drive training, knowledge transfer and …
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Michael Page (UK)
… Extensive knowledge of working with data protection and GDPR compliance. Comprehensive understanding of data management and governance practices including data quality, data security, metadata, master data management. Understanding of technical tools to support data governance practices, e.g. MS Purview. Leading and managing a team including work prioritisation and task …
… role requires strong communication skills to collaborate effectively with both technical and non-technical stakeholders. You will be responsible for data governance, data modelling, metadata management, and ensuring compliance with data standards. Key Responsibilities: 1. Data Architecture & Engineering: Design and implement scalable data architectures that align with business objectives. Work … 2. Governance & Compliance: Ensure adherence to data governance policies and regulatory requirements. Monitor and maintain compliance with data standards and security measures. Develop and manage metadata repositories for accurate data tracking and integration. Support data assurance processes and provide recommendations for compliance improvements. 3. Data Analysis & Communication: Conduct data profiling and … Strong proficiency in Java, SQL, Python, SparkSQL, and PySpark. Experience with Microsoft Power Platform (PowerApps, Power Automate, etc.). Good understanding of data governance, metadata management, and compliance frameworks. Ability to communicate effectively with both technical and non-technical stakeholders. Experience in data modelling, data profiling, and source system analysis. …
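The profiling and governance duties in this listing are tool-agnostic, but since it names PySpark, a minimal column-profiling sketch is shown below. The "customers" table and its columns are hypothetical placeholders, not taken from the role.

```python
# Minimal data-profiling sketch in PySpark. The "customers" table is a
# hypothetical placeholder; adapt column handling to the real source system.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profiling-sketch").getOrCreate()

df = spark.read.table("customers")  # hypothetical source table
total = df.count()

rows = []
for column in df.columns:
    nulls = df.filter(F.col(column).isNull()).count()
    distinct = df.select(column).distinct().count()
    null_pct = round(100.0 * nulls / total, 2) if total else 0.0
    rows.append((column, total, nulls, null_pct, distinct))

profile = spark.createDataFrame(
    rows, ["column", "row_count", "null_count", "null_pct", "distinct_count"]
)
profile.show(truncate=False)
```

Profiles like this are typically persisted alongside the kind of metadata repository the listing describes, so compliance checks can track data-quality drift over time.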
Perform comprehensive data analysis, validation, and integrity checks. Data Governance & Documentation: Ensure compliance with data governance standards and regulatory requirements. Maintain accurate data dictionaries, metadata, and workflow documentation for transparency and team collaboration. Optimization & Best Practices: Optimize dashboard and data pipeline performance through best practices in data visualization and coding … environments. Familiarity with internal audit processes, risk management, or compliance frameworks. Understanding of data lineage, data governance tools, and best practices in metadata management. Why Join Us? Work on impactful projects that support critical audit and risk operations. Be part of a collaborative, expert-driven environment. Access …
… (e.g. broad understanding of Informatica, Collibra, Ab Initio, Snowflake, Databricks). Appreciation of and interest in attaining end-to-end data skills, e.g. data quality, metadata, data-mesh, data security, privacy & compliance. Experience with Enterprise/platform/application (e.g. cloud/SAP)/data architecture. Understanding of public and private …
… ensure successful handover to internal teams. Design and implement ETL/ELT pipelines for cloud data warehouse solutions. Build and maintain data dictionaries and metadata management systems. Analyse and cleanse data using a range of tools and techniques. Manage and process structured and semi-structured data formats such as …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Anson McCade
… ensure successful handover to internal teams. Design and implement ETL/ELT pipelines for cloud data warehouse solutions. Build and maintain data dictionaries and metadata management systems. Analyze and cleanse data using a range of tools and techniques. Manage and process structured and semi-structured data formats such as …
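Both versions of this listing call for building data dictionaries and metadata management systems. As a rough illustration only (not the employer's actual tooling), the PySpark sketch below derives basic dictionary entries from a table's schema; all table names are hypothetical.

```python
# Illustrative sketch: derive basic data-dictionary entries from a DataFrame
# schema and append them to a metadata table. Table names are hypothetical.
from datetime import datetime, timezone
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("data-dictionary-sketch").getOrCreate()

source_table = "sales.orders"                    # hypothetical source
dictionary_table = "governance.data_dictionary"  # hypothetical target

df = spark.read.table(source_table)
captured_at = datetime.now(timezone.utc).isoformat()

entries = [
    (source_table, f.name, f.dataType.simpleString(), f.nullable, captured_at)
    for f in df.schema.fields
]

dictionary = spark.createDataFrame(
    entries, ["table_name", "column_name", "data_type", "is_nullable", "captured_at"]
)
dictionary.write.mode("append").saveAsTable(dictionary_table)
```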
… or similar tools. • Strong Python and SQL skills, with a good understanding of data formats. • Demonstrated experience in designing data lakes, data pipelines, and metadata management. • Solid understanding of CI/CD pipelines, infrastructure automation, and agile delivery. • Background in consulting, including stakeholder engagement and client-facing delivery. • Proven success …
… specific business questions and identify opportunities for improvement. Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. A successful history of manipulating, processing, and extracting value from large disconnected datasets. Working knowledge of message queuing, stream processing, and …
… enhancing operational efficiency and reliability. Data Governance and Quality: Embed governance, security, and quality practices. Define access control, data lineage, and compliance standards. Promote metadata management and enforce standards to ensure data trustworthiness. Stakeholder Engagement: Collaborate with leadership and product owners to align data priorities and maximize value from data …
… governance, security, and data quality practices into engineering workflows. Define guardrails and reference implementations for data access control, data lineage, and compliance. Promote consistent metadata management and enforce technical standards to ensure trust in data assets. Stakeholder Engagement: Collaborate with PN D&A leadership, PN product owners, and segment D…
… sources. Implement data structures using best practices in data modeling, ETL/ELT processes, and SQL, AWS Redshift, and OLAP technologies. Model data and metadata for ad hoc and pre-built reporting. Work with product tech teams and build robust and scalable data integration (ETL) pipelines using SQL, Python and …
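This listing asks for data and metadata modelling for ad hoc and pre-built reporting on Redshift/OLAP stacks. The simplified star-schema sketch below shows the general idea in PySpark; the staging and mart table names, columns, and grain are invented for illustration, not drawn from the role.

```python
# Illustrative star-schema sketch: derive a date dimension and a fact table
# from a raw orders feed. All names here are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

orders = spark.read.table("staging.orders")  # hypothetical staging table

# Date dimension keyed on yyyyMMdd, convenient for pre-built reporting joins.
dim_date = (
    orders.select(F.to_date("order_ts").alias("date"))
    .distinct()
    .withColumn("date_key", F.date_format("date", "yyyyMMdd").cast("int"))
    .withColumn("year", F.year("date"))
    .withColumn("month", F.month("date"))
)

# Fact table carries the surrogate date key plus additive measures only.
fact_orders = (
    orders.withColumn("date_key", F.date_format(F.to_date("order_ts"), "yyyyMMdd").cast("int"))
    .select("order_id", "customer_id", "date_key", "quantity", "net_amount")
)

dim_date.write.mode("overwrite").saveAsTable("mart.dim_date")
fact_orders.write.mode("overwrite").saveAsTable("mart.fact_orders")
```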
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Michael Page
… will be responsible for designing and managing GPA's enterprise data models to support design and deployment of business systems: design data models and metadata systems; help Chief Data Architects to interpret an organisation's needs; provide oversight and advice to other data architects who are designing and producing data …
… management of extremely large datasets. From Day 1, you will be challenged with a variety of tasks, ranging from creating datasets, reports, and dashboards to metadata modeling and pipeline monitoring. You will interact with internal program and product owners, and technical teams to gather requirements, structure scalable and performant data solutions, and …
… the financial community. Our News Platform aggregates content from Reuters and thousands of other news providers, augmenting our content with a rich set of metadata and analytics to help our customers find the News most relevant to their needs. The News we deliver informs investors, moves markets, and enables companies …
… and traceability. AI & Advanced Analytics Integration: Collaborate with AI/ML teams to enable model training pipelines with robust and reliable data access. Leverage metadata and structured data modeling to support AI model explainability and audit trails. Guide engineering teams on best practices for cloud-based data handling, Terraform, and …
… to gather use case requirements, advise on how to unlock more value, and prioritise and translate them into technical data requests. Identify user data, content metadata, and campaign data to ingest into the CDP, and how this data delivers audience targeting, campaign optimization, and post-campaign measurement. Maintain and build new …
Design and Architecture: Design, develop, and maintain a high-performing, secure, and scalable data platform, leveraging Databricks Corporate Lakehouse and Medallion Architectures. Utilise our metadata-driven data platform framework combined with advanced cluster management techniques to create and optimise scalable, robust, and efficient data solutions. Implement comprehensive logging, monitoring, and … Azure data platform services, including Storage, ADLS Gen2, Azure Functions, Kubernetes. Background in cloud platforms and data architectures, such as Corporate Data Lake, Medallion Architecture, Metadata-Driven Platform, Event-driven architecture. Proven experience of ETL/ELT, including Lakehouse, Pipeline Design, Batch/Stream processing. Strong working knowledge of programming languages … Python, SQL, PowerShell, PySpark, Spark SQL. Good working knowledge of data warehouse and data mart architectures. Good experience in Data Governance, including Unity Catalog, Metadata Management, Data Lineage, Quality Checks, Master Data Management. Experience using Azure DevOps to manage tasks and CI/CD deployments within an Agile framework, including …
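As a loose sketch of what a metadata-driven Medallion (bronze to silver) step like the one described above can look like in PySpark: the config dictionary, table names, and quality rules here are hypothetical placeholders, not the employer's actual platform framework.

```python
# Minimal sketch of a metadata-driven bronze -> silver step in the Medallion
# pattern. Config values, table names, and rules are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Pipeline behaviour is driven by metadata rather than hard-coded per table.
config = {
    "source_table": "bronze.transactions",
    "target_table": "silver.transactions",
    "dedupe_keys": ["transaction_id"],
    "not_null_columns": ["transaction_id", "amount"],
}

df = spark.read.table(config["source_table"])

# Basic quality gate: drop rows violating not-null rules, then de-duplicate.
for column in config["not_null_columns"]:
    df = df.filter(F.col(column).isNotNull())
df = df.dropDuplicates(config["dedupe_keys"])

# Stamp an audit column before promoting the data to the silver layer.
df = df.withColumn("_ingested_at", F.current_timestamp())
df.write.mode("overwrite").saveAsTable(config["target_table"])
```

Driving each step from a config object rather than per-table code is what lets a single pipeline definition scale across many bronze sources.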
… /EDMCS/ARCS/EPCM/PCMCS/TRCS/NR. Expertise in Oracle EPM cloud functionality such as data management, security, reporting, metadata, forms, task manager, task lists, smart lists, workflows, EPMAutomate, etc. Proficiency in designing and implementing EPM data integration solutions. Good understanding of financial statements and …
… ensure buy-in and adoption across the organisation. Support the selection or integration of data governance tools as needed. Drive data quality, lineage, and metadata practices that align with regulatory requirements (including regulatory reporting). Educate and support staff at all levels to build governance awareness and culture. Ensure solutions …
… environments. Strong SQL skills and experience with relational databases. Knowledge of CI/CD processes and infrastructure-as-code principles. Experience with data cleansing, metadata management, and data dictionaries. Familiar with modern data visualisation tools (e.g. QuickSight, Tableau, Looker, QlikSense). Desirable Skills: Exposure to large-scale data processing tools (Spark …
Salford, Manchester, United Kingdom Hybrid / WFH Options
Spyro Soft
… Senior Software Engineer who lives and breathes front-end development to join us working with our media client. You will be joining the Content Metadata engineering team to accelerate the development and delivery of the outcomes defined in the Passport Control evolution and the Contributors use case. You will have …