scalable, future-proof data architectures and provide technical leadership across transformation programmes: Architecture & Design: Design enterprise data architectures for Snowflake and Azure cloud ecosystems Create dimensional data models (star schema, snowflake schema, Data Vault) supporting analytics and BI Define data strategy and governance frameworks – lineage, cataloging, security, compliance Lead solution design for data warehouse, data lake … technical challenges Conduct architecture reviews and act as design authority for data solutions Mentor engineers on best practices in data modeling, performance optimization, and cloud architecture ESSENTIAL SKILLS: Data Modeling expertise - star schema, dimensional modeling, Data Vault, logical/physical design Snowflake Cloud Data Platform - architecture design, performance tuning, cost optimization, governance Azure Data Factory - pipeline architecture, orchestration patterns, best …
responsibilities: Partner with business teams to understand requirements and deliver meaningful, actionable insights. Design and implement data pipelines from APIs and relational sources. Model data effectively using Kimball/Star Schema methodologies. Develop dashboards, reports, and automated integrations in Power BI. Support the onboarding of data from newly acquired businesses. Contribute to data strategy, process improvement, and best … skills and experience: Advanced SQL proficiency (joins, CTEs, window functions). Strong Power BI skills, including semantic modelling, DAX, and report design. Experience building data warehouse solutions (Kimball/Star Schema). Excellent communication and stakeholder engagement skills. Proactive, organised, and adaptable with a genuine team spirit. Desirable: Knowledge of Medallion Architecture and Data Lakehouse concepts. Working knowledge …
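The "advanced SQL" requirement in the listing above (joins, CTEs, window functions) can be illustrated with a minimal, self-contained sketch; the table and column names here are invented for the example, and SQLite stands in for the relational source:

```python
import sqlite3

# In-memory database with a toy sales table (schema is illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, sale_date TEXT, amount REAL);
INSERT INTO sales VALUES
  ('North', '2024-01-01', 100.0),
  ('North', '2024-01-02', 150.0),
  ('South', '2024-01-01', 80.0),
  ('South', '2024-01-02', 120.0);
""")

# A CTE filters the data, then a window function computes a running
# total per region -- the style of query the listing has in mind.
query = """
WITH recent AS (
    SELECT region, sale_date, amount
    FROM sales
    WHERE sale_date >= '2024-01-01'
)
SELECT region, sale_date,
       SUM(amount) OVER (
           PARTITION BY region ORDER BY sale_date
       ) AS running_total
FROM recent
ORDER BY region, sale_date;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

Window functions require SQLite 3.25+, which ships with modern Python builds.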
loading (ETL) processes, leveraging tools such as Talend, Informatica, Matillion, Pentaho, MuleSoft, Boomi, or scripting languages (Python, PySpark, SQL). Solid understanding of data warehousing and data modelling techniques (Star Schema, Snowflake Schema). Familiarity with security frameworks (GDPR, HIPAA, ISO 27001, NIST, SOX, PII) and AWS security features (IAM, KMS, RBAC). Knowledge of Azure data …
with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. …
environments Implement CI/CD processes using Azure DevOps for secure, reliable deployment Technical Skills: Strong expertise in: Power BI and paginated reporting SQL and data architecture Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/…
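The "SCD handling" item above refers to slowly changing dimensions. A minimal Type 2 sketch in plain Python (the row shape, keys, and function name are all hypothetical, for illustration only):

```python
from datetime import date

def scd2_upsert(dim_rows, business_key, new_attrs, today):
    """SCD Type 2: instead of overwriting a changed attribute, close the
    current row for `business_key` and append a new versioned row."""
    for row in dim_rows:
        if row["key"] == business_key and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows  # nothing changed, keep history as-is
            row["is_current"] = False
            row["valid_to"] = today
    dim_rows.append({"key": business_key, **new_attrs,
                     "valid_from": today, "valid_to": None, "is_current": True})
    return dim_rows

# Illustrative dimension with one customer row; the city then changes.
dim = [{"key": "C001", "city": "Leeds",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_upsert(dim, "C001", {"city": "York"}, date(2024, 6, 1))
print(len(dim), dim[0]["is_current"], dim[1]["city"])
```

The old row is kept with a closed validity window, so point-in-time reporting still resolves to the historical attribute.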
data streaming Strong proficiency in SQL and Python Familiarity with Azure Data Services and CI/CD pipelines in a DevOps environment Solid understanding of data modelling techniques (e.g., Star Schema) Excellent problem-solving skills and a high attention to detail Desirable: Azure Data Engineer certification Experience working with unstructured data sources (e.g. voice) Exposure to Power BI …
SQL Server: Strong T-SQL skills with experience querying complex relational databases DAX: Proficiency in writing advanced DAX formulas for time intelligence, filtering, and aggregations Data Modelling: Experience with star schema design and data warehouse concepts ERP Systems: Previous experience working with ERP data structures, particularly food industry systems (e.g., Sage, SAP, Microsoft Dynamics, or similar) Key Responsibilities …
Modeling: Deep understanding of data warehousing concepts and best practices. Experience transforming raw transactional data into well-structured, analytics-ready datasets using schemas like the star schema (Kimball methodology) Data Quality & Governance: Build trust in data by implementing data validation checks, testing frameworks, and clear documentation within your pipelines Experience in the following areas …
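The Kimball-style transform the listing above describes can be sketched in a few lines: raw transactional records are split into a dimension table and a fact table, with a referential-integrity check of the kind its data-quality bullet asks for. All field names and the surrogate-key scheme are illustrative assumptions:

```python
# Raw transactional records (invented for the example).
raw = [
    {"order_id": 1, "customer": "Acme", "country": "UK", "amount": 250.0},
    {"order_id": 2, "customer": "Acme", "country": "UK", "amount": 100.0},
    {"order_id": 3, "customer": "Birch", "country": "DE", "amount": 75.0},
]

# Dimension: one row per distinct customer, keyed by a surrogate integer.
dim_customer = {}
for rec in raw:
    if rec["customer"] not in dim_customer:
        dim_customer[rec["customer"]] = {
            "customer_key": len(dim_customer) + 1,
            "customer": rec["customer"],
            "country": rec["country"],
        }

# Fact: one row per order, referencing the dimension by surrogate key.
fact_orders = [
    {"order_id": rec["order_id"],
     "customer_key": dim_customer[rec["customer"]]["customer_key"],
     "amount": rec["amount"]}
    for rec in raw
]

# Data-quality check: every fact row must resolve to a dimension row.
valid_keys = {d["customer_key"] for d in dim_customer.values()}
assert all(f["customer_key"] in valid_keys for f in fact_orders)
print(len(dim_customer), len(fact_orders))
```

In a real warehouse the same split is done in SQL or a pipeline tool, but the shape of the result — conformed dimensions referenced by narrow fact rows — is the same.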
London, South East, England, United Kingdom Hybrid/Remote Options
Davies Talent Solutions
Collaborate with stakeholders to understand requirements and deliver practical solutions. Ensure data quality and consistency across reporting pipelines. What We’re Looking For Power BI Expertise: Advanced data modelling (star schema, relationships, filter direction). Strong DAX skills for measures and calculations. Experience optimizing performance for large datasets. Excel Mastery: Advanced formulas (SUMIFS, COUNTIFS, text functions). Dashboard …
will: Digest data requirements, gather and analyse large scale structured data and validate by profiling in a data environment Understand data structures and data model (dimensional & relational) concepts like Star Schema or Fact & Dimension tables, to design and develop ETL patterns/mechanisms to ingest, analyse, validate, normalize and cleanse data Understand and produce ‘Source to Target mapping …
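A "Source to Target mapping" of the kind the listing above describes can be expressed as a declarative structure that drives the transform. The source and target column names and the cleansing rules here are hypothetical:

```python
# Declarative source-to-target mapping: source column -> (target column,
# cleansing/typing function). All names are invented for illustration.
MAPPING = {
    "CUST_NM": ("customer_name", str.strip),
    "CTRY_CD": ("country_code", str.upper),
    "ORD_AMT": ("order_amount", float),
}

def apply_mapping(source_row):
    """Rename and cleanse one source record according to MAPPING."""
    return {target: transform(source_row[src])
            for src, (target, transform) in MAPPING.items()}

row = apply_mapping({"CUST_NM": "  Acme Ltd ", "CTRY_CD": "uk", "ORD_AMT": "99.50"})
print(row)
```

Keeping the mapping as data rather than code makes it auditable and lets the same ingest mechanism serve many sources, which is the point of the "ETL patterns/mechanisms" the ad mentions.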
Implement ingestion patterns for files, APIs, SaaS platforms (e.g. subscription billing), SQL sources, SharePoint and SFTP using ADF + metadata-driven frameworks. Apply Lakeflow expectations for data quality, schema validation and operational reliability. Curated Data Layers & Modelling Build clean, conformed Silver/Gold models aligned to enterprise business domains (customers, subscriptions, deliveries, finance, credit, logistics, operations). Deliver … star schemas, harmonisation logic, SCDs and business marts to power high-performance Power BI datasets. Apply governance, lineage and fine-grained permissions via Unity Catalog. Orchestration & Observability Design and optimise orchestration using Lakeflow Workflows and Azure Data Factory. Implement monitoring, alerting, SLAs/SLIs, runbooks and cost-optimisation across the platform. DevOps & Platform Engineering Build CI …