London (1 Horse Guards Road), Greater London, England
Government Digital & Data
and engage with, support and lead on the finance data requirements and design for any future systems required for central financial reporting within Government and HM Treasury. Showcase excellent metadata quality management and embed effective processes and procedures in product lifecycles to drive high quality outputs and standards, utilising Project and Programme Management methodologies, best practices, and ways of working. … dictionary: https://assets.publishing.service.gov.uk/media/5b27cf2240f0b634b469fb1a/CS_Behaviours_2018.pdf This role will assess you at the Grade 7 level: Proven experience of establishing data standards, metadata management, data structures, and expertise in data analysis. Experience of Digital Solutions such as ERP (Enterprise Resource Planning) and EPM (Enterprise Performance Management) Technology, including the best practice application of …
promote the value of data across business and technology teams. Mentor data engineers, BI developers, and other stakeholders in data architecture and best practices. Maintain comprehensive architecture documentation for metadata, data lineage, and data governance. Essential Skills & Qualifications: Extensive experience architecting enterprise data solutions and designing scalable data platforms. Proven expertise in data modeling (conceptual, logical, physical), metadata management, and …
and data governance. Implement data quality monitoring and alerting processes. Work with data governance teams to ensure compliance with data governance policies and standards. Implement data lineage tracking and metadata management processes. Collaboration & Communication: Collaborate closely with data scientists, economists, and other technical teams to understand data requirements and translate them into technical solutions. Communicate technical concepts effectively to both …
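In practice, the data quality monitoring and alerting work described above often starts with simple rule-based checks. A minimal sketch, assuming pandas as the processing layer and a logging-based alert hook; the threshold and sample columns are illustrative assumptions, not taken from the listing:

```python
# Minimal data quality check with alerting, assuming a pandas DataFrame input.
# The null-rate threshold and the logging "alert" are placeholders; a real
# platform would raise an incident or notify an on-call channel instead.
import logging

import pandas as pd

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("dq_checks")


def check_null_rates(df: pd.DataFrame, max_null_rate: float = 0.05) -> dict:
    """Return columns whose null rate exceeds the threshold and log an alert."""
    null_rates = df.isna().mean()  # fraction of nulls per column
    breaches = {col: rate for col, rate in null_rates.items() if rate > max_null_rate}
    for col, rate in breaches.items():
        logger.warning("Data quality alert: %s null rate %.1f%% exceeds %.1f%%",
                       col, rate * 100, max_null_rate * 100)
    return breaches


if __name__ == "__main__":
    sample = pd.DataFrame({"customer_id": [1, 2, None, 4],
                           "amount": [10.0, None, None, 7.5]})
    print(check_null_rates(sample))
```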
managing large-scale data solutions on Microsoft Azure. Unity Catalog Mastery: In-depth knowledge of setting up, configuring, and utilizing Unity Catalog for robust data governance, access control, and metadata management in a Databricks environment. Databricks Proficiency: Demonstrated ability to optimize and tune Databricks notebooks and workflows to maximize performance and efficiency. Experience with performance troubleshooting and best practices for …
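For the Unity Catalog governance and access-control work referenced above, a minimal sketch of what the setup can look like, assuming a Databricks workspace with Unity Catalog enabled; the catalog, schema and group names are hypothetical:

```python
# Sketch of Unity Catalog governance setup. Assumes this runs on a Databricks
# cluster attached to a Unity Catalog-enabled workspace; outside Databricks
# these statements will not execute. Object and group names are invented.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks this returns the ambient session

governance_statements = [
    "CREATE CATALOG IF NOT EXISTS finance",
    "CREATE SCHEMA IF NOT EXISTS finance.reporting",
    # Read access for analysts, full control for the engineering group.
    "GRANT USE CATALOG ON CATALOG finance TO `data-analysts`",
    "GRANT USE SCHEMA, SELECT ON SCHEMA finance.reporting TO `data-analysts`",
    "GRANT ALL PRIVILEGES ON SCHEMA finance.reporting TO `data-engineers`",
    # Attach a description so the object carries business metadata in the catalog.
    "COMMENT ON SCHEMA finance.reporting IS 'Curated tables for financial reporting'",
]

for stmt in governance_statements:
    spark.sql(stmt)
```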
problems. Key responsibilities: Data Platform Design and Architecture Design, develop, and maintain a high-performing, secure, and scalable data platform, leveraging Databricks Corporate Lakehouse and Medallion Architectures. Utilise our metadata-driven data platform framework combined with advanced cluster management techniques to create and optimise scalable, robust, and efficient data solutions. Implement comprehensive logging, monitoring, and alerting tools to manage the … Spark. Proven experience working with Azure data platform services, including Storage, ADLS Gen2, Azure Functions, Kubernetes. Background in cloud platforms and data architectures, such as Corporate Data Lake, Medallion Architecture, Metadata Driven Platform, Event-driven architecture. Proven experience of ETL/ELT, including Lakehouse, Pipeline Design, Batch/Stream processing. Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark … Spark SQL. Good working knowledge of data warehouse and data mart architectures. Good experience in Data Governance, including Unity Catalog, Metadata Management, Data Lineage, Quality Checks, Master Data Management. Experience using Azure DevOps to manage tasks and CI/CD deployments within an Agile framework, including utilising Azure Pipelines (YAML), Terraform, and implementing effective release and branching strategies. Knowledge of …
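The metadata-driven platform pattern mentioned above is commonly implemented by driving ingestion from a control table rather than from hard-coded pipelines. A hedged PySpark sketch, assuming Delta Lake is available on the cluster; the control entries, paths and table names are hypothetical:

```python
# Metadata-driven ingestion into a Medallion-style bronze layer. Assumes a
# Spark environment with Delta Lake configured; feed definitions are invented.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata-driven-ingest").getOrCreate()

# In a real platform these entries would live in a control table; hard-coded here.
ingestion_metadata = [
    {"source_path": "/landing/sales/orders/", "format": "json",
     "bronze_table": "bronze.sales_orders", "load_mode": "append"},
    {"source_path": "/landing/finance/ledger/", "format": "parquet",
     "bronze_table": "bronze.finance_ledger", "load_mode": "overwrite"},
]

for entry in ingestion_metadata:
    # Each entry fully describes one feed, so onboarding a new source means
    # adding metadata, not writing new pipeline code.
    df = spark.read.format(entry["format"]).load(entry["source_path"])
    (df.write.format("delta")
       .mode(entry["load_mode"])
       .saveAsTable(entry["bronze_table"]))
```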
solution architects, and data product teams to gather requirements and design solutions. Provide guidance on data integration, transformation, and migration strategies. Establish and maintain enterprise data models, data dictionaries, metadata repositories, and data lineage documentation. Ensure data models comply with organizational policies and regulatory requirements. Optimize data products and their components for performance, scalability, and reliability. Evaluate and recommend data … tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and DBT. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global business empowering local teams with exciting work that is making a difference. Our portfolio includes consulting, applications, business …
City of London, London, United Kingdom Hybrid / WFH Options
83data
Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms (AWS, GCP, or Azure). Familiarity with version control, CI/CD, and Infrastructure-as-Code (Terraform or similar). Desirable: Background in fintech or …
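As an illustration of the Airflow-orchestrated, Python/SQL pipeline work these requirements point to, a minimal DAG sketch; it assumes Apache Airflow 2.4+ and the DAG id, schedule and task bodies are placeholders:

```python
# Minimal Apache Airflow DAG: two Python tasks with an explicit dependency.
# Task logic is stubbed; in a real pipeline these would call extract/transform code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source system.
    print("extracting source data")


def transform():
    # Placeholder: build dimensional-model tables from the extracted data.
    print("building dimension and fact tables")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # dependency management via explicit ordering
```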
workflows using technologies such as BigQuery, Dataflow, Informatica, or IBM DataStage, supporting both real-time and batch data processing. Governance & Data Quality Management: Establish comprehensive data governance frameworks, including metadata management and quality assurance, using platforms like Unity Catalog, Alation, Profisee, or DQ Pro. Strategic Advisory & Stakeholder Engagement: Serve as a strategic partner to executive stakeholders (e.g., CDOs, CIOs, Heads …
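A hedged example of the kind of batch processing step such BigQuery-based pipelines involve, using the google-cloud-bigquery client; the project, dataset, table and label values are assumptions for illustration only:

```python
# Batch transformation in BigQuery: run a query and write the result to a
# curated table, tagging the job with labels as lightweight governance metadata.
# Assumes the google-cloud-bigquery package and valid GCP credentials.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

sql = """
SELECT customer_id, SUM(amount) AS total_amount
FROM `example-analytics-project.raw.transactions`
GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(
    destination="example-analytics-project.curated.customer_totals",
    write_disposition="WRITE_TRUNCATE",
    labels={"owner": "data-platform", "classification": "internal"},
)

client.query(sql, job_config=job_config).result()  # blocks until the batch job finishes
```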
data models, and pipelines in Snowflake and cloud platforms (AWS, Azure, or GCP). Develop and optimize scalable ELT/ETL processes using SQL and Python. Define data governance, metadata management, and security best practices. Collaborate with data engineers, analysts, product managers, and stakeholders to understand data needs and translate them into robust architectural solutions. Oversee data quality, lineage, and …
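For the SQL-and-Python ELT work in Snowflake described above, a minimal sketch using snowflake-connector-python; the account, credentials, stage and table names are placeholders and would normally come from configuration or a secrets manager:

```python
# Python-driven ELT step in Snowflake: load staged files, then transform
# in-database. All connection details and object names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="ELT_SERVICE_USER",
    password="***",          # placeholder only; never hard-code real credentials
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load raw files already staged in Snowflake (the "EL" of ELT).
    cur.execute("COPY INTO STAGING.RAW_ORDERS FROM @landing_stage/orders/")
    # Transform inside the warehouse (the "T"), producing a curated table.
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.ORDERS_DAILY AS
        SELECT order_date, COUNT(*) AS order_count, SUM(order_value) AS total_value
        FROM STAGING.RAW_ORDERS
        GROUP BY order_date
    """)
finally:
    conn.close()
```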
Nottingham, Nottinghamshire, East Midlands, United Kingdom
In Technology Group Limited
data quality, integrity and compliance with governance policies and regulatory standards. Collaborate with enterprise architects, DBAs and infrastructure teams to optimize data performance and security. Develop and maintain data lineage, metadata and architecture documentation. Support reporting and analytics initiatives by ensuring data availability and consistency across systems. Assist in designing data integration using tools such as OIC (Oracle Integration Cloud), Oracle …
principles of data modelling. Re-engineer data pipelines to be scalable, robust, automatable, and repeatable. Navigate, explore and query large-scale datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Identify and resolve data issues including data quality, data mapping, database and application issues. Implement data flows to connect operational systems, data for analytics and business …
responsible for maintaining strategic product direction and assisting the development team in the successful delivery of initiatives to develop scalable, reusable data transformation logic assets (SQL scripts, DBT models, metadata sets) that integrate seamlessly within the data platform, all within a robust data governance framework. Key Responsibilities: Data Governance Strategy & Execution: Develop, own, and execute the data governance roadmap specifically …
conceptual, logical and physical data models to provide a structured view of data domains, entities, and their relationships. Data Documentation: Create and update data dictionaries, entity-relationship diagrams (ERDs), and metadata to ensure clarity and consistency. Stakeholder Collaboration: Collaborate closely with business stakeholders to understand data requirements and translate them into structured data models that meet business needs. Data Governance Alignment …
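A small, hedged sketch of how a data dictionary entry might be generated programmatically from model metadata, in the spirit of the documentation duties above; the entity and column definitions are invented examples:

```python
# Generate plain-text data dictionary rows from simple column metadata.
from dataclasses import dataclass


@dataclass
class ColumnDefinition:
    name: str
    data_type: str
    description: str
    nullable: bool = True


customer_entity = {
    "entity": "Customer",
    "columns": [
        ColumnDefinition("customer_id", "INTEGER", "Surrogate key for a customer", nullable=False),
        ColumnDefinition("customer_name", "VARCHAR(255)", "Registered legal or trading name"),
        ColumnDefinition("onboarded_date", "DATE", "Date the customer relationship began"),
    ],
}


def render_data_dictionary(entity: dict) -> str:
    """Render one entity's columns as data dictionary rows."""
    lines = [f"Entity: {entity['entity']}"]
    for col in entity["columns"]:
        nullability = "NULL" if col.nullable else "NOT NULL"
        lines.append(f"  {col.name} | {col.data_type} | {nullability} | {col.description}")
    return "\n".join(lines)


if __name__ == "__main__":
    print(render_data_dictionary(customer_entity))
```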