best practices: Promote robust data management, including lineage, observability, access control, and compliance with ethical data use. Innovation & standards: Stay ahead of industry trends in data architecture, engineering, and metadata/semantic technologies, and bring them into practice where they add value. Enterprise data architecture: Collaborate with other architects to define and implement data architecture patterns across systems and domains.
you solve challenging problems together. Minimum Requirements: Experience across the UK Healthcare sector or Public Sector. Strong technical design expertise in core data architecture disciplines, including data modelling, data analysis, metadata management, data transformation, data migration, and master data. A track record of providing technical leadership within data projects, including assurance, mentoring, and standards definition. Awareness of best-practice techniques and methodologies.
Proven expertise in designing, developing, and maintaining scalable data pipelines, ETL/ELT processes, and integrations to support advanced analytics. Experience with data governance frameworks, master data management (MDM), metadata management, and ensuring data compliance with global standards. Deep understanding of SQL, Python, Spark, or other relevant data processing technologies used for data transformation and analytics enablement. Familiarity with modern … initiatives, including the DDF program. Data Platform Optimization: Work closely with AOE data teams and PNE analytics teams to optimize the data infrastructure, ensuring performance, scalability, and cost efficiency. Metadata & Asset Management: Drive consistent metadata management and data asset governance, ensuring data reliability, accessibility, and standardization across PNE. Enablement & Best Practices: Educate and support the PNE teams in data stewardship
practices, ensuring high-quality, reliable data is available for critical reporting, AI model training, and data transformation. Ensure that data stewardship practices are applied to new data initiatives, including metadata capture, quality controls, and data privacy measures. Lifecycle Integration: Ensure that AI and data governance standards are fully integrated into the development lifecycle, from initial design through to deployment and …
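Quality controls of the kind referenced above are often expressed as simple completeness rules run against each incoming batch before it is published. A minimal sketch, assuming illustrative field names and rules (nothing here is taken from any specific governance framework):

```python
def check_batch(rows, required_fields):
    """Return (row index, missing fields) pairs for rows failing a
    basic completeness rule - a minimal data-quality gate."""
    failures = []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            failures.append((i, missing))
    return failures

# Hypothetical batch: the second row is missing its customer_id.
batch = [
    {"customer_id": "C1", "country": "UK"},
    {"customer_id": "", "country": "UK"},
]
print(check_batch(batch, ["customer_id", "country"]))
# → [(1, ['customer_id'])]
```

In practice a gate like this would feed a quarantine table or alerting step rather than a print, but the shape — metadata-declared rules applied uniformly per batch — is the same.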
logic in compliance with regulations. Extending functionality and automation through serverless functions to address Webflow limitations and streamline workflows. CMS Architecture & Webflow Management: Evaluating CMS structure and optimizing fields, metadata, templates, and workflows to make content management intuitive for marketing. Technical SEO/AEO Implementation: Implementing changes recommended by our SEO team around schema markup, hreflang logic, advanced redirects, sitemap …
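Schema markup of the kind mentioned above is typically embedded in a page as a JSON-LD block. A minimal sketch, assuming a hypothetical organisation name and URL (neither appears in the listing):

```python
import json

def organization_jsonld(name: str, url: str) -> str:
    """Build a minimal schema.org Organization JSON-LD payload."""
    payload = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": name,
        "url": url,
    }
    # In a real page this string would be wrapped in
    # <script type="application/ld+json"> ... </script> by the template.
    return json.dumps(payload, indent=2)

print(organization_jsonld("Example Co", "https://example.com"))
```

Richer types (Article, FAQPage, BreadcrumbList) follow the same pattern: a dict matching a schema.org type, serialised into the page head.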
triage and manage the documentation lifecycle (e.g., maintain, archive, obsolete). Work closely with Document Controllers and service provider admins to implement disposition actions and track them to closure. Align metadata between internal and external DMS platforms to ensure lifecycle consistency. Drive improvements in external DMS functionality and design. Implement formal archiving processes for external documentation. Guide business users in accessing …
AI acceleration. The Rubrik Security Cloud platform is designed to deliver robust cyber resilience and recovery, including identity resilience to ensure continuous business operations, all on top of a secure metadata and data lake. Rubrik's offerings also include Predibase to help further secure and deploy GenAI while delivering exceptional accuracy and efficiency for agentic applications.
deployment approaches (e.g., cloud container services (preferred), Docker, or Kubernetes) and CI/CD practices. Experience working with public-facing data platforms or open data services, including data modelling, metadata, or data standards. The ability to pass security clearance, backed by the right to work in the UK. About The Team: Within PDS, the Data & Search Team enables data-driven … cloud container services (preferred), Docker or Kubernetes) and CI/CD practices. Criterion 6 (Desirable): Experience working with public-facing data platforms or open data services, including data modelling, metadata, or data standards. Criterion 7 (Desirable): Familiarity with Infrastructure as Code (IaC) tools (e.g., Terraform, Bicep, ARM, or CloudFormation) and data observability or monitoring practices.
for a highly experienced Senior Data Engineer with deep expertise in Databricks and Azure data services. The ideal candidate will have a strong background in designing and delivering scalable, metadata-driven data solutions, particularly in metadata ingestion and orchestration processes. The role requires a strategic thinker who can lead technical design discussions, drive best practices, and mentor junior engineers. The … comfortable managing multiple deliverables, working closely with stakeholders, and implementing robust data solutions to enable business insights and operational efficiencies. Responsibilities: The successful candidate will: Lead the development of metadata ingestion frameworks and orchestration processes. Design and implement scalable and reusable metadata-driven services to optimise data pipelines. Architect and maintain robust data solutions using Databricks and Azure services (such … with a strong focus on Databricks (SQL & PySpark). Extensive experience with Azure data services, including ADF, Synapse, ADLS, and Azure Functions. Proven track record of designing and implementing metadata-driven data pipelines. Deep expertise in orchestration and data workflow automation (e.g., Airflow, dbt). Strong understanding of CI/CD practices for data engineering. Experience with infrastructure as code (Terraform …
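The "metadata-driven" pattern this listing keeps returning to usually means pipelines are generated from a control table of source definitions rather than hand-written per source. A minimal sketch, assuming hypothetical source names, paths, and target tables (the `COPY INTO` statement shape follows Databricks SQL, but the config values are invented):

```python
from dataclasses import dataclass

@dataclass
class SourceConfig:
    name: str
    path: str
    file_format: str
    target_table: str

# Hypothetical metadata entries; in practice these might live in a control
# table read by an ADF pipeline or a Databricks job.
CONFIGS = [
    SourceConfig("sales", "/raw/sales", "parquet", "bronze.sales"),
    SourceConfig("customers", "/raw/customers", "csv", "bronze.customers"),
]

def build_ingestion_plan(configs):
    """Expand metadata rows into concrete ingestion statements, so adding
    a new source means adding a config row, not a new pipeline."""
    return [
        f"COPY INTO {c.target_table} FROM '{c.path}' "
        f"FILEFORMAT = {c.file_format.upper()}"
        for c in configs
    ]

for stmt in build_ingestion_plan(CONFIGS):
    print(stmt)
```

An orchestrator (ADF, Airflow, or a Databricks workflow) would then iterate the same config to schedule and monitor each load, which is what makes the services "reusable" across sources.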