Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
or Dimensional methodologies • Use SQL and Python to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata-driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions. • Ensure data quality, integrity, and security throughout the data pipeline. … Azure SQL/SQL Server. • Proficiency in SQL and Python. • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF). • Familiarity with building metadata-driven pipelines. • Knowledge of Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and More ❯
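The metadata-driven pipeline pattern asked for above can be illustrated with a short, hypothetical sketch: a configuration table describes each source, and one generic loop ingests them all. The object names (config.pipeline_metadata, source_path, file_format, target_table, the bronze schema) are placeholders invented for the example, not details from the role.

```python
# Illustrative metadata-driven ingestion loop for Databricks (PySpark).
# All table, column and path names below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Each metadata row describes one source: where it lives, its format
# (json / csv / parquet) and the bronze table it should land in.
sources = spark.table("config.pipeline_metadata").collect()

for src in sources:
    reader = spark.read.format(src["file_format"])
    if src["file_format"] == "csv":
        reader = reader.option("header", "true")

    df = reader.load(src["source_path"])

    # Land the data in the bronze layer; onboarding a new feed means
    # adding a new metadata row, not writing new code.
    (df.write
       .format("delta")
       .mode("append")
       .saveAsTable(f"bronze.{src['target_table']}"))
```

Azure Data Factory can express the same pattern declaratively, typically with a Lookup activity feeding a ForEach loop in place of the Python loop above.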
Stockton-on-Tees, Cleveland, England, United Kingdom
Reed
organisational skills and attention to detail. Preferred Qualifications Microsoft Azure Data Engineer certification. Experience with Azure Synapse Analytics and Power BI. Familiarity with data governance, data quality frameworks, and metadata management. Key Behaviours & Competencies Personal Drive – Resilient and committed to achieving goals through persistent effort. Innovation – Generates new ideas and solutions through original thinking. Results Focus – Aligns team efforts with More ❯
optimise resource utilisation, and ensure scalable and budget-conscious data processing workflows. Provide architectural oversight for platform enhancements, integrations, and performance optimisation. Champion best practices in data architecture, including metadata management, data quality, and access control. Engage with stakeholders across business and technology to understand requirements and translate them into scalable platform capabilities. Support the onboarding of new data domains More ❯
principles of data modelling Re-engineer data pipelines to be scalable, robust, automatable, and repeatable Navigate, explore and query large scale datasets Build processes supporting data transformation, data structures, metadata, dependency and workload management Identify and resolve data issues including data quality, data mapping, database and application issues Implement data flows to connect operational systems, data for analytics and business More ❯
user needs when designing deliverables. Experience in designing technology solutions with complex end-to-end data flows. Experience in implementing data governance, including data cataloging, data lineage tracking, and metadata management to ensure data accuracy, accessibility, and compliance. Preferred: Experience with Databricks Understanding of how data platforms interact with marketing and customer engagement platforms. Knowledge of service-oriented architecture, including More ❯
less development, testing, bug/defect resolution, coding, design, customization and documentation of microservices, creating/developing new cloud service patterns, scripting, data management/enrichment, consolidation of metadata, user-interface development, container (e.g. K8s) integration & design, encrypting data at rest & in transit • Technically integrate/tailor/engineer/design and maintain cybersecurity-relevant features & capabilities using well More ❯
criteria of the effort Analyze data lineage documentation and support the creation of data and system flow documentation and process maps Support the updates for key data capabilities including metadata repositories, data dictionaries, data lineage, business process maps, and training materials with the data and system process flows Partner with business data owners required to ensure maintenance of key artifacts More ❯
Exceptional communication skills with the ability to translate complex technical concepts to non-technical stakeholders. Preferred Qualifications Experience with headless CMS architectures and decoupled publishing platforms. Knowledge of SEO, metadata strategies, and content syndication standards. Background working with or supporting editorial or newsroom environments. Technical Skills: Proficiency in programming languages (e.g., Python, Java, SQL, etc.) Familiarity with cloud-based technologies More ❯
customer-facing products. Key Responsibilities Lead the architecture and design of robust, scalable data solutions in a modern Azure environment Drive the development of logical and conceptual data models, metadata frameworks, and architecture roadmaps Support and influence the entire data lifecycle, from ingestion and transformation to analytics and product delivery Collaborate closely with the CDO and engineering team to define the … as a Senior Data Architect or Lead Data Engineer in a modern cloud environment Hands-on knowledge of Azure data services and strong SQL skills Extensive experience in designing metadata-driven pipelines and data warehouse architectures Comfortable defining as-is and to-be states across data ecosystems Strong communication skills with the ability to liaise across technical and business teams More ❯
business stakeholders to shape and deliver the long-term vision for enterprise-grade data architecture. Key Responsibilities Design and implement scalable database solutions, conceptual and logical data models, and metadata-driven pipelines Own the full data lifecycle from initial design and ingestion to governance and performance optimisation Lead architectural direction across the platform, helping the business move from legacy to … Databricks-based) Data modelling using Kimball and/or Data Vault methodology Experienced with both structured and unstructured data environments Experience with MDM, data lineage, data cataloguing, and enterprise metadata management Familiar with Agile, SCRUM, and project frameworks like PRINCE2 Requirements Proven experience as a Data Architect or Lead Data Engineer in enterprise or complex environments Strong ability to communicate More ❯
curated data layers efficiently. Build scalable ETL/ELT processes with Azure Data Factory and PySpark. Support data governance initiatives using tools like Azure Purview and Unity Catalog for metadata management, lineage, and access control. Ensure consistency, accuracy, and reliability across data pipelines. Collaborate with analysts to validate and refine datasets for reporting. Apply DevOps and CI/CD best More ❯
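As a rough illustration of the curated-layer work described above, the following PySpark sketch promotes a bronze table to a conformed silver table by enforcing types and deduplicating on a business key. The table and column names (bronze.orders, order_id, ingested_at, silver.orders) are assumptions made for the example only.

```python
# Hypothetical bronze-to-silver (curated) transformation in PySpark.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()
bronze = spark.table("bronze.orders")

# Most recent record per business key wins.
latest = Window.partitionBy("order_id").orderBy(F.col("ingested_at").desc())

silver = (
    bronze
    # Enforce types early so downstream consumers see a stable schema.
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    # Deduplicate on the business key, keeping the latest record.
    .withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```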
functional teams effectively. It would be great if you have: Azure Data Fundamentals DP-900 certification. Azure Fundamentals AZ-900 certification. Good knowledge of data governance, data quality, security, metadata cataloguing and Master Data Management. Machine Learning and AI development experience Benefits: Up to 10% bonus (based on company and personal performance). An employer pension scheme 25 days holiday More ❯
data management, covering one or more of the following areas: Data governance Data ethics Data modelling Data architecture Data quality Master data management Experience of data lineage, taxonomies and metadata management Strong problem-solving and analytical thinking ability, in order to adapt to different client projects and scenarios Excellent written and verbal communication skills that allow you to build credibility with More ❯
lead stakeholder on business initiatives that impact the data warehouse. Ensure data accuracy, consistency, and integrity across warehouse and source systems. Maintain and evolve the data dictionary and associated metadata for the warehouse and ETL systems. Mentor and support team members to build a high-performing, resilient data function. Keep up to date with industry developments and maintain relevant technical More ❯
based in London (Liverpool St) About Our Client This organisation is a large entity operating within the financial services sector. Job Description Strong expertise in data governance, data quality, metadata management, data profiling, analysis, and data management tools. Lead the implementation of data governance practices within the Risk and Finance domains, aligning with BCBS239 regulatory standards. Owns the end-to More ❯
Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business-wide metadata catalogues Collaborate with BI, Change, and Product teams to align data outputs to needs Support groundwork for future data science and machine learning initiatives The successful applicant will be proficient More ❯
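The validation, cleansing and standardisation rules mentioned above might look something like the sketch below, written in PySpark (which Fabric notebooks also support). The column names, regex and table names are illustrative assumptions, not details from the role.

```python
# Illustrative rule-based cleansing and validation in PySpark.
# Table and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
raw = spark.table("lakehouse.customers_raw")

cleaned = (
    raw
    # Standardisation: trim whitespace and normalise case on key text fields.
    .withColumn("email", F.lower(F.trim("email")))
    .withColumn("postcode", F.upper(F.trim("postcode")))
)

# Validation: flag failing rows rather than silently dropping them,
# so rejects can be reported back to the data owner.
validated = cleaned.withColumn(
    "is_valid",
    F.col("customer_id").isNotNull()
    & F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
)

validated.filter("is_valid").write.mode("overwrite").saveAsTable("lakehouse.customers_clean")
validated.filter("NOT is_valid").write.mode("overwrite").saveAsTable("lakehouse.customers_rejected")
```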
implementing cloud data migration and storage patterns on one or more of AWS, GCP and Microsoft Azure Experience implementing and integrating data management platforms for data cataloguing, classification and metadata management. Experience designing and developing data privacy, security and entitlements frameworks for cloud provider ecosystems (AWS, Azure, GCP). Good understanding of cloud networking architecture, operations, automation and cost management. More ❯
and product teams to drive strategic initiatives and promote shared data standards and best practices. YOUR SKILLS AND EXPERIENCE ESSENTIAL CRITERIA: Demonstrated expertise in data management, including governance, quality, metadata, and lifecycle practices, with experience integrating SAP BW/4HANA and AWS analytics for unified finance data solutions. Technically proficient in SQL, Python, and data modelling, with strong knowledge of More ❯
spotting opportunities to reduce complexity and cost. Help define and manage best practices for our Data Warehouse. This may include payload design of source data, logical data modelling, implementation, metadata and testing standards. Set standards and ways of working with data across Monzo, working collaboratively with others to make it happen. Take established best practices and standards defined by the More ❯
The Role: Sitting at the heart of our data transformation, this role supports the development, implementation, and operation of Data Management Frameworks. You’ll be working with metadata management, data harmonisation, quality scorecarding, data lineage, and profiling, all critical for building robust data governance. What You’ll Be Doing: ✅ Applying SAS and other analytical tools to extract insights … data challenges Who You Are: 🔥 Passionate about data quality and continuously improving processes 📈 Experienced in SAS, SQL, Power BI, and other analytics tools 🛠️ Skilled in data governance, lineage, and metadata management 📚 Familiar with regulatory frameworks like GDPR, BCBS, CCPA More ❯
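Quality scorecarding of the kind described above is tool-agnostic; the advert names SAS and SQL, but purely as an illustration, the same idea in Python/pandas might look like this, with invented rule and column names.

```python
# Illustrative data-quality scorecard (the role itself uses SAS/SQL;
# the logic is the same whatever the tool). Columns are hypothetical.
import pandas as pd

df = pd.read_csv("customers.csv")  # placeholder source

scores = {
    # Completeness: share of rows with an email present.
    "completeness_email": df["email"].notna().mean(),
    # Validity: share of dates of birth that parse as real dates.
    "validity_dob": pd.to_datetime(df["date_of_birth"], errors="coerce").notna().mean(),
    # Uniqueness: share of customer IDs that are not duplicated.
    "uniqueness_id": 1.0 - df["customer_id"].duplicated().mean(),
}

scorecard = pd.Series(scores)
scorecard["overall"] = scorecard.mean()
print(scorecard.round(3))
```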
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
data strategies and governance Deep knowledge of contextual, logical, and physical data modelling Hands-on experience with cloud data migration and architecture (AWS, Azure, or GCP) Strong grasp of metadata management, data privacy/security frameworks, and cloud ecosystem best practices Financial services project experience preferred, with stakeholder-facing communication skills Comfortable operating across complex environments, leading multiple workstreams Preferred … data, risk/compliance platforms, or financial crime data architectures An understanding of cloud cost management, automation, and network architecture Tech Domains Cloud Platforms: AWS, Azure, GCP Data Management: Metadata tools, privacy frameworks, security architecture Modelling: ER, Contextual, Semantic, Logical/Physical Orchestration & Integration: API design, DevOps support, cloud-native workflows Why Join? Work on high-impact transformation projects with More ❯
Employment Type: Full-Time
Salary: £110,000 - £118,000 per annum, Inc benefits
information governance, information security, etc) all to ensure successful outcomes. Ensure that all engineering and operational activity adheres to our IT service and operational policy, processes, and procedures. Metadata management Design, establish and operate an appropriate metadata repository Lead continual improvement changes to metadata repositories, and set up robust governance processes to keep repositories up to date Understand a range … of tools for storing and working with metadata Advise less experienced members of the team about metadata management Ownership You take accountability for issues that occur and are proactive in searching for potential problems. You know how to achieve excellent user outcomes and embed this into the work of your team and service. Problem resolution (data) Anticipate problems and defend More ❯
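As a very small sketch of what an entry in such a metadata repository might capture (the field names are illustrative, not any particular tool's schema):

```python
# Minimal, hypothetical shape for a metadata repository entry.
from dataclasses import dataclass, field
from datetime import date
from typing import Dict

@dataclass
class DatasetRecord:
    name: str                 # logical dataset name, e.g. "customer_orders"
    owner: str                # accountable data owner
    source_system: str        # where the data originates
    classification: str       # e.g. "official", "official-sensitive"
    refresh_frequency: str    # e.g. "daily", "monthly"
    last_reviewed: date
    columns: Dict[str, str] = field(default_factory=dict)  # column -> description

repository: Dict[str, DatasetRecord] = {}

def register(record: DatasetRecord) -> None:
    """Add or update an entry; the repository is keyed on dataset name."""
    repository[record.name] = record
```

A real repository would sit in a governed catalogue or database rather than in memory; the point is simply that each dataset carries an owner, a classification and a review date that governance processes can check and keep up to date.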