managing large-scale data solutions on Microsoft Azure. Unity Catalog Mastery: In-depth knowledge of setting up, configuring, and utilizing Unity Catalog for robust data governance, access control, and metadata management in a Databricks environment. Databricks Proficiency: Demonstrated ability to optimize and tune Databricks notebooks and workflows to maximize performance and efficiency. Experience with performance troubleshooting and best practices for …
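For context on what that kind of Unity Catalog configuration involves, a minimal sketch is shown below. It assumes a Databricks notebook where a `spark` session is pre-defined and Unity Catalog is enabled; the catalog, schema, and group names (`finance`, `reporting`, `analysts`, `data_engineers`) are illustrative assumptions, not taken from the listing.

```python
# Minimal Unity Catalog governance sketch: create a catalog and schema, then
# grant read access to analysts and write access to engineers.
# Assumes a Databricks notebook (a `spark` session is pre-defined) with
# Unity Catalog enabled; all object and group names are hypothetical.

statements = [
    "CREATE CATALOG IF NOT EXISTS finance",
    "CREATE SCHEMA IF NOT EXISTS finance.reporting",
    "GRANT USE CATALOG ON CATALOG finance TO `analysts`",
    "GRANT USE SCHEMA, SELECT ON SCHEMA finance.reporting TO `analysts`",
    "GRANT CREATE TABLE, MODIFY ON SCHEMA finance.reporting TO `data_engineers`",
]

for statement in statements:
    spark.sql(statement)
```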
Models using Data Vault and Dimensional modelling methods. Implement automated, reusable and efficient batch data pipelines and streaming data pipelines. Work closely with Governance and Quality teams to ensure metadata, catalogue, lineage and known solutions to data issues are optimised. Work closely with other data specialists and technicians in the organisation, such as Actuarial and Finance, to help with their …
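As a rough illustration of the batch and streaming pipelines mentioned above, the PySpark sketch below pairs a daily batch load with a file-based streaming ingest. The paths, schema, database, and table names are hypothetical assumptions rather than anything specified by the listing.

```python
# Batch + streaming ingestion sketch in PySpark (illustrative only; source
# paths, database, and table names are hypothetical assumptions).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS staging")

# Batch: load a daily extract and append it to a staging table.
raw_batch = spark.read.parquet("/landing/policies/2024-01-01/")
batch_df = raw_batch.withColumn("load_ts", F.current_timestamp())
batch_df.write.mode("append").saveAsTable("staging.policies")

# Streaming: continuously ingest new files arriving in a landing zone.
stream_df = (
    spark.readStream.schema(raw_batch.schema)  # file streams need an explicit schema
    .parquet("/landing/policies_stream/")
    .withColumn("load_ts", F.current_timestamp())
)
query = (
    stream_df.writeStream.outputMode("append")
    .option("checkpointLocation", "/checkpoints/policies_stream/")
    .toTable("staging.policies_stream")
)
```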
Design and build robust data pipelines with Cloud Composer, Dataproc, Dataflow, Informatica, or IBM DataStage, supporting both batch and streaming data ingestion. Data Governance & Quality: Implement data governance frameworks, metadata management, and data quality controls using Unity Catalog, Profisee, Alation, DQ Pro, or similar platforms. Client Engagement & Advisory: Act as a trusted advisor to clients' senior leadership (CDOs, CIOs, Heads …
Lead the design and implementation of data models, pipelines, and integration frameworks. Support the implementation of Customer Data Platforms (CDPs), MDM, and CRM systems. Ensure compliance with data governance, metadata standards, and data quality frameworks. Collaborate with stakeholders to define data architecture vision and roadmap. Contribute to the development of reusable architecture assets and design patterns. Mentor junior team members …
on strategic, cross-functional data initiatives with C-level stakeholders. Familiarity with cloud data platforms (e.g., Azure, AWS, GCP, Snowflake). Knowledge of data governance standards, regulatory compliance, and metadata management. Experience with BI and visualization tools such as Power BI, Tableau, or Looker. Certification in data science, analytics, or cloud technologies (e.g., Microsoft, AWS, Google). Why join Genpact …
APIs, and machine learning models in coordination with DevOps teams. Define and embed best practices for data warehousing and Lakehouse architecture (e.g. Medallion). Standardise data development, modelling, and metadata management across the enterprise. Implement lineage tracking and orchestration tools to improve transparency and governance. Conduct code reviews, testing, and documentation to ensure quality and robustness of analytics outputs. Take …
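To make the Medallion reference concrete, below is a minimal bronze-to-silver promotion sketch in PySpark; the table names and cleansing rules are assumptions for illustration rather than a prescribed implementation.

```python
# Medallion (bronze -> silver) promotion sketch (illustrative; table names and
# cleansing rules are hypothetical assumptions).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw, append-only records as landed from source systems.
bronze = spark.table("bronze.customer_events")

# Silver: de-duplicated, typed, and lightly conformed records.
silver = (
    bronze.dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("customer_id").isNotNull())
)
silver.write.mode("overwrite").saveAsTable("silver.customer_events")
```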
/or Scala. * Expertise in ETL/ELT processes, data warehousing, and data mesh architectures. * Familiarity with AI/ML concepts and their application in data analytics. * Experience with metadata management, data lineage tracking, and data cataloguing. * Knowledge of serverless data processing, event-driven architectures, and modern data stacks. In accordance with the Employment Agencies and Employment Businesses Regulations …
data from SAP (e.g., S/4HANA, BW/4HANA) and non-SAP sources (e.g., cloud platforms, APIs, third-party systems). Establish and enforce data governance policies including metadata management, data lineage tracking, and access control to ensure data integrity and compliance. Optimize data pipelines and transformation logic using SAP Datasphere's capabilities to support real-time and batch …
Governance and Quality Assurance: Embed governance, security, and data quality practices into engineering workflows. Define guardrails and reference implementations for data access control, data lineage, and compliance. Promote consistent metadata management and enforce technical standards to ensure trust in data assets. Stakeholder Engagement: Collaborate with PN D&A leadership, PN product owners, and segment D&A leadership to synchronize and …
meetings. 2. Data Governance & Quality: Establish and enforce data governance policies, standards, and procedures. Define data quality rules and implement processes for data validation and cleansing. Design and implement metadata management strategies. Architect secure data storage, access controls, and data privacy measures. Develop and implement Master Data Management (MDM) strategies. Partner closely with the Enterprise Architect, Integration Architect, business stakeholders …
governance, security, and compliance strategies. Lead the integration of core data management capabilities, including: Master Data Management (MDM) solutions for single-source-of-truth data; data catalogue tools for metadata management and discovery; data governance frameworks for policy enforcement, compliance, and lineage tracking; and data quality solutions to ensure data accuracy, consistency, and reliability. Provide architectural leadership on hub-and-spoke …
into technical specifications and ensure timely delivery of data products. 4. Data Quality & Governance: Implement monitoring systems to ensure data accuracy, completeness, and timeliness. Support data governance initiatives, including metadata management, lineage tracking, and access controls. Ensure compliance with regulatory standards (e.g., REMIT, EMIR) and internal audit requirements. 5. Innovation & Continuous Improvement: Evaluate and adopt new technologies (e.g., streaming platforms …
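As an illustration of the monitoring called out in point 4, the sketch below runs two simple checks, completeness of business keys and freshness of the latest load; the table, columns, and one-hour threshold are hypothetical assumptions.

```python
# Data-quality monitoring sketch (illustrative; table name, key columns, and
# freshness threshold are hypothetical assumptions).
from datetime import datetime, timedelta

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-sketch").getOrCreate()
trades = spark.table("silver.trades")

# Completeness: business-key columns must not contain nulls.
null_keys = trades.filter(
    F.col("trade_id").isNull() | F.col("counterparty").isNull()
).count()

# Timeliness: the most recent record should be under an hour old.
latest_ts = trades.agg(F.max("ingested_at")).first()[0]
stale = latest_ts is None or latest_ts < datetime.now() - timedelta(hours=1)

if null_keys > 0 or stale:
    # In practice this would raise an alert rather than an exception.
    raise ValueError(f"DQ check failed: {null_keys} null keys, stale={stale}")
```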
and analytics. Data Modelling & Design: Design and implement logical and physical data models that support the analytical and modelling requirements of the platform. Define data dictionaries, data lineage, and metadata management processes. Ensure data consistency, integrity, and quality across the platform. Data Integration & Pipelines: Define data integration patterns and establish robust data pipelines for ingesting, transforming, and loading data from … financial markets is highly desirable. Excellent communication, interpersonal, and presentation skills. Strong analytical and problem-solving skills. Experience with data visualisation tools (e.g., Tableau, Power BI). Experience with metadata management tools, e.g. Purview. Knowledge of data science and machine learning concepts. Experience with API design and development.
Central London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
discovery, evaluating source systems and APIs, and creating stories for implementation. Collaborate with third parties to ensure interoperability and integration compatibility. Champion best practices in warehousing techniques (e.g. Kimball), metadata management, and performance optimisation. Support delivery through integration testing, reconciliation, and reporting visualisation (e.g. Power BI, Tableau). About You: Proven experience designing data architectures for large-scale platforms and diverse …
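For readers unfamiliar with the Kimball-style warehousing mentioned above, the sketch below declares a small star schema (one dimension, one fact); the database, table, and column names are hypothetical assumptions.

```python
# Kimball-style star-schema sketch (illustrative; database, table, and column
# names are hypothetical assumptions).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS dw")

# Dimension: one row per customer version (type-2 slowly changing dimension).
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.dim_customer (
        customer_key  BIGINT,
        customer_id   STRING,
        customer_name STRING,
        valid_from    DATE,
        valid_to      DATE
    )
""")

# Fact: one row per order, keyed to the dimension by customer_key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.fact_orders (
        order_key    BIGINT,
        customer_key BIGINT,
        order_date   DATE,
        amount       DECIMAL(18, 2)
    )
""")
```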
data from SAP (e.g., S/4HANA, BW/4HANA) and non-SAP sources (e.g., cloud platforms, APIs, third-party systems). * Establish and enforce data governance policies including metadata management, data lineage tracking, and access control to ensure data integrity and compliance. * Optimize data pipelines and transformation logic using SAP SAC's capabilities to support real-time and batch …
data from SAP (e.g., S/4HANA, BW/4HANA) and non-SAP sources (e.g., cloud platforms, APIs, third-party systems). * Establish and enforce data governance policies including metadata management, data lineage tracking, and access control to ensure data integrity and compliance. * Optimize data pipelines and transformation logic using SAP Datasphere's capabilities to support real-time and batch …
principles of data modelling. Re-engineer data pipelines to be scalable, robust, automatable, and repeatable. Navigate, explore and query large-scale datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Identify and resolve data issues including data quality, data mapping, database and application issues. Implement data flows to connect operational systems, data for analytics and business …
London, South East, England, United Kingdom Hybrid / WFH Options
Become
of BCBS 239, risk and finance data structures, and data governance best practices. Proven experience influencing and engaging senior stakeholders across complex organisations. Strong working knowledge of data quality, profiling, metadata management, and governance tools (Collibra preferred). Analytical mindset with the ability to challenge constructively and solve problems creatively. Experience working in Tier 1/Tier 2 banking, with a solid …
implementing cloud data migration and storage patterns on one or more of AWS, GCP and Microsoft Azure. Experience implementing and integrating data management platforms for data cataloguing, classification and metadata management. Experience designing and developing data privacy, security and entitlements frameworks for cloud provider ecosystems (AWS, Azure, GCP). Good understanding of cloud networking architecture, operations, automation and cost management.
etc. Conceptual and Logical design of the Data & Analytics Platform, together with experience of data repository consolidation and end user computing. Supporting Data Governance colleagues in areas such as metadata management, data lineage, data audits and general data quality improvement. Experience of implementing Master Data Management approaches and tooling. Selection of external systems/services with a data component - in …
conceptual, logical and physical data models to provide a structured view of data domains, entities, and their relationships. Data Documentation: Create and update data dictionaries, entity-relationship diagrams (ERDs), and metadata to ensure clarity and consistency. Stakeholder Collaboration: Collaborate closely with business stakeholders to understand data requirements and translate them into structured data models that meet business needs. Data Governance Alignment …
policies, underwriting. Effective stakeholder communication and delivery focus. Strong project and time management in fast-paced environments. Desirable Tools and Tech: Power BI, Tableau, Looker, Domo; data governance and metadata management tools; DevOps for data workflows, hybrid/multi-cloud architectures; monitoring and cost optimisation tools (Azure Monitor, GCP cost tools). Culture and Conduct: Act as a role model for …
skills. Preferred Qualifications: Cloud certification (Azure Data Engineer, AWS Certified Database Specialty, etc.). Familiarity with NoSQL or NewSQL databases (MongoDB, Cassandra). Experience with data governance, MDM, and metadata management tools. Working knowledge of DevOps tools (Terraform, Git, Jenkins).
Islington, London, United Kingdom Hybrid / WFH Options
National Centre for Social Research
and AI developments. Design for multi-tenancy and workload isolation between BI and research environments. BI and operational reporting: Establish data architecture patterns for bronze/silver/gold layers, metadata management, and schema evolution. Build and maintain CI/CD workflows, job orchestration (e.g. Databricks Workflows or Airflow), and cluster management policies. Lead performance tuning, cost optimisation, and observability across …
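By way of illustration of the job orchestration mentioned above, the sketch below wires a daily bronze-to-gold refresh as an Airflow DAG; it assumes Airflow 2.4+, and the DAG id, schedule, and task bodies are hypothetical placeholders rather than this organisation's actual jobs.

```python
# Airflow orchestration sketch for a daily bronze -> silver -> gold refresh
# (illustrative; assumes Airflow 2.4+; DAG id, schedule, and task bodies are
# hypothetical placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def refresh_layer(layer: str) -> None:
    # Placeholder: in practice this would trigger a Databricks job or notebook run.
    print(f"refreshing {layer} layer")


with DAG(
    dag_id="medallion_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    bronze = PythonOperator(
        task_id="refresh_bronze", python_callable=refresh_layer, op_args=["bronze"]
    )
    silver = PythonOperator(
        task_id="refresh_silver", python_callable=refresh_layer, op_args=["silver"]
    )
    gold = PythonOperator(
        task_id="refresh_gold", python_callable=refresh_layer, op_args=["gold"]
    )

    bronze >> silver >> gold  # enforce layer ordering
```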