Manchester, North West, United Kingdom Hybrid / WFH Options
Anson Mccade
… discovery, evaluating source systems and APIs, and creating stories for implementation
• Collaborate with third parties to ensure interoperability and integration compatibility
• Champion best practices in warehousing techniques (e.g. Kimball), metadata management, and performance optimisation
• Support delivery through integration testing, reconciliation, and reporting visualisation (e.g. Power BI, Tableau)
About You:
• Proven experience designing data architectures for large-scale platforms and diverse …
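The Kimball warehousing and reconciliation duties above can be illustrated with a minimal sketch: a star schema (one fact table joined to dimension tables on surrogate keys) plus a reconciliation-style total check. All table and column names here are hypothetical, and sqlite3 stands in for the real warehouse platform.

```python
import sqlite3

# Hypothetical Kimball-style star schema: a sales fact table keyed to
# date and product dimension tables. Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(20240101, 1, 10.0), (20240101, 1, 5.0)])

# Reconciliation-style check: the warehouse total should equal the
# total known from the source system (assumed to be 15.0 here).
total = conn.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
print(total)
```

In a real delivery the reconciliation query would compare row counts and control totals between the source extract and the loaded fact table rather than against a literal.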
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Capgemini
… conceptual, logical and physical data models to provide a structured view of data domains, entities, and their relationships.
• Data Documentation: Create and update data dictionaries, entity-relationship diagrams (ERDs), and metadata to ensure clarity and consistency.
• Stakeholder Collaboration: Collaborate closely with business stakeholders to understand data requirements and translate them into structured data models that meet business needs.
• Data Governance Alignment: …
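The modelling and data-dictionary duties above can be sketched in code: a logical model for two hypothetical entities with a one-to-many relationship expressed via a foreign key, and a minimal data-dictionary entry derived from the model. Entity and attribute names are assumptions for illustration only.

```python
from dataclasses import dataclass, fields

# Illustrative logical model: Customer 1 --- * Order, the relationship
# carried by Order.customer_id. Names are hypothetical.
@dataclass
class Customer:
    customer_id: int   # primary key
    name: str

@dataclass
class Order:
    order_id: int      # primary key
    customer_id: int   # foreign key -> Customer.customer_id
    total: float

def data_dictionary(entity):
    """Derive a minimal data-dictionary entry (attribute -> type name)."""
    return {f.name: f.type.__name__ for f in fields(entity)}

print(data_dictionary(Order))
# {'order_id': 'int', 'customer_id': 'int', 'total': 'float'}
```

Generating the dictionary from the model, rather than maintaining it by hand, is one way to keep documentation and schema consistent as entities change.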
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
The Very Group
… in accordance with organisational policies around PII data treatment. You will be accountable for the data catalogue and master data model across TVG, defining the key roles needed to ensure metadata is regularly updated. You will also ensure data stewardship is defined and applied so that data is curated and profiled for search and discovery. You will define and measure data …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
… or Dimensional methodologies
• Use SQL and Python to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server.
• Design and implement metadata-driven pipelines to automate data processing tasks.
• Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions.
• Ensure data quality, integrity, and security throughout the data pipeline.
…
• … Azure SQL/SQL Server.
• Proficiency in SQL and Python.
• Hands-on experience designing and building data pipelines using Azure Data Factory (ADF).
• Familiarity with building metadata-driven pipelines.
• Knowledge of Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet.
• Strong understanding of IT concepts, including security, IAM, Key Vault, and …
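The metadata-driven pattern named in the bullets above can be sketched as follows: a metadata table drives a single generic loader, so adding a new source means adding a metadata row rather than new code. Source names, paths, and targets below are hypothetical, and plain Python stands in for ADF/Databricks.

```python
# Hypothetical pipeline metadata: each row fully parameterises one load.
PIPELINE_METADATA = [
    {"source": "orders.csv",    "format": "csv",  "target": "bronze.orders"},
    {"source": "products.json", "format": "json", "target": "bronze.products"},
]

def run_pipeline(metadata):
    """One generic step, driven entirely by metadata rows."""
    results = []
    for entry in metadata:
        # A real implementation would dispatch on entry["format"] to a
        # reader and write to entry["target"]; here we just record the plan.
        results.append(
            f"loaded {entry['source']} ({entry['format']}) -> {entry['target']}"
        )
    return results

for line in run_pipeline(PIPELINE_METADATA):
    print(line)
```

In ADF the same idea is typically a Lookup activity over a control table feeding a ForEach loop of parameterised Copy activities.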