the mobile/telecoms industry would be a bonus!

Key outputs for the role:
• Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball dimensional and One Big Table (OBT) methodologies.
• Translate business requirements from stakeholders into robust, well-documented and tested dbt models.
• Develop and own workflows within Google Cloud Platform environments, primarily … and implement robust, well-documented and performant dbt models that serve as a single source of truth for business reporting.
• Implement and champion data quality testing, documentation standards, and modelling best practices within dbt projects.
• Troubleshoot and resolve any issues or errors in the data pipelines.
• Stay updated with the latest cloud technologies and industry best practices to continuously … be expecting to see:
• Expert-level proficiency in SQL.
• Deep, practical experience of building and architecting data models with dbt.
• A strong understanding of data warehousing concepts and data modelling techniques (e.g. Kimball dimensional modelling, One Big Table).
• Solid, hands-on experience within the Google Cloud Platform, especially with BigQuery.
• Proven experience working directly with business …
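The listing above asks for both Kimball dimensional modelling and One Big Table (OBT). As an illustration only, a minimal sketch of the difference using Python's built-in sqlite3; all table and column names here are invented, not taken from the advert:

```python
import sqlite3

# Hypothetical schema for illustration only -- names are invented.
con = sqlite3.connect(":memory:")
con.executescript("""
    -- Kimball-style star schema: a fact table with keys into dimensions.
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    CREATE TABLE fct_orders (
        order_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
""")
con.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'EMEA')")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
con.execute("INSERT INTO fct_orders VALUES (100, 1, 20240101, 250.0)")

# One Big Table (OBT): the same data denormalised into one wide table,
# typically materialised downstream of the star schema for simple BI use.
con.execute("""
    CREATE TABLE obt_orders AS
    SELECT o.order_id, c.name, c.region, d.iso_date, d.year, o.amount
    FROM fct_orders o
    JOIN dim_customer c USING (customer_key)
    JOIN dim_date d USING (date_key)
""")
row = con.execute("SELECT name, region, year, amount FROM obt_orders").fetchone()
print(row)  # ('Acme', 'EMEA', 2024, 250.0)
```

In a dbt project the OBT would usually be a model in its own right, selecting from the fact and dimension models rather than being created by hand like this.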
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Gleeson Recruitment Group
processes and reduce reliance on spreadsheets and legacy tools. Collaborate with stakeholders to define reporting needs, translating complex requirements into actionable technical specifications. Provide expert guidance on data architecture, modelling, and analytics approaches. Contribute to policies on data governance and promote best practices across the organisation. Mentor junior BI developers and train stakeholders in the effective use of BI … skills and experience: Significant expertise in Power BI for data visualisation and reporting. Hands-on experience with Azure services (Azure Data Factory, Azure SQL Server). Strong understanding of dimensional modelling (e.g. Kimball methodology). Proficiency in sourcing, manipulating, and interpreting complex datasets. Strong analytical mindset with excellent attention to detail and accuracy. Experience managing stakeholders, gathering business …
Employment Type: Permanent
Salary: £55000 - £65000/annum Final salary pension
Manchester, North West, United Kingdom Hybrid / WFH Options
We Are Dcoded Limited
delivering enterprise-level solutions. Essential Skills: 5 years' experience in Data Engineering. Strong expertise in Databricks, Azure Data Factory, Azure SQL, and Azure Synapse/DW. Solid understanding of dimensional modelling (Kimball, Star Schema) and EDW solutions. Experience working with structured and unstructured data. Familiarity with cloud and DevOps practices, i.e. Azure, CI/CD pipelines, scaling, cost …
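One technique the "dimensional modelling (Kimball, Star Schema)" requirement usually implies is the Type 2 slowly changing dimension, which keeps history by versioning dimension rows. A minimal sketch in plain Python; the record shape and field names are invented for illustration, not taken from the advert:

```python
from datetime import date

# Minimal Type 2 slowly changing dimension (SCD2) sketch -- a core Kimball
# technique. All record and field names are invented for illustration.
def scd2_upsert(dim_rows, natural_key, new_row, today):
    """Expire the current version of a changed record and append the new one."""
    for row in dim_rows:
        if row[natural_key] == new_row[natural_key] and row["is_current"]:
            if all(row.get(k) == v for k, v in new_row.items() if k != natural_key):
                return dim_rows  # no attribute changed: nothing to do
            row["is_current"] = False   # close out the old version
            row["valid_to"] = today
    dim_rows.append({**new_row, "valid_from": today,
                     "valid_to": None, "is_current": True})
    return dim_rows

dim = []
scd2_upsert(dim, "customer_id", {"customer_id": 7, "segment": "Consumer"}, date(2024, 1, 1))
scd2_upsert(dim, "customer_id", {"customer_id": 7, "segment": "Business"}, date(2024, 6, 1))
current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["segment"])  # 2 Business
```

In a warehouse such as Synapse or Databricks the same pattern is normally expressed as a MERGE over the dimension table rather than row-by-row Python.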
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
valuable). Experience managing large and complex datasets, with a strong command of SQL for cloud-based environments (Fabric, Snowflake, BigQuery, Redshift, etc.). A solid understanding of data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools …
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
a team of data engineers, analytics engineers, data scientists and AI specialists to design and evolve scalable data platforms and modern data products that enable self-service analytics, advanced modelling, and AI-driven decision-making across our insurance business. What you’ll do: Design and manage scalable cloud data platforms (Databricks on AWS) across development, staging, and production environments … ensuring reliable performance and cost efficiency. Integrate and model data from diverse sources – including warehouses, APIs, marketing platforms, and operational systems – using dbt, Delta Live Tables, and dimensional modelling to deliver consistent, trusted analytics. Enable advanced AI and ML use cases by building pipelines for vector search, retrieval-augmented generation (RAG), feature engineering, and model deployment. Ensure security … to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement initiatives. Qualifications What we’d love you to bring: Proven, hands-on expertise in data modelling, with a strong track record of designing and implementing complex dimensional models, star and snowflake schemas, and enterprise-wide canonical data models. Proficiency in converting intricate insurance business …
into a management Very strong technical skills that will include: SQL, SSIS, SSRS, SAS; Power BI, Power Platform; Azure Data Factory, Azure Data Lake, Databricks. A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle. Ability to design hybrid data solutions across on-prem and cloud data sources. Expert with data engineering tools and automation …
team development etc.) Very strong technical skills that will include: SQL, SSIS, SSRS, SAS, Power BI, Power Platform, Azure Data Factory, Azure Data Lake, Databricks. A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle. Ability to design hybrid data solutions across on-prem and cloud data sources. Expert with data engineering tools and automation …
them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS: Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
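Several of the listings above name ETL (Extract, Transform, Load) as a core skill. A minimal extract-transform-load sketch using only the Python standard library; the sample data, column names, and target table are invented for illustration:

```python
import csv, io, sqlite3

# Minimal ETL sketch: extract from CSV, transform (clean and cast), load into a
# warehouse table. The data and column names are invented for illustration.
raw_csv = "id,amount,currency\n1, 10.50 ,gbp\n2,3.25,GBP\n"

# Extract: read raw records from the source
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: trim whitespace, cast types, normalise currency codes
clean = [{"id": int(r["id"]),
          "amount": float(r["amount"].strip()),
          "currency": r["currency"].strip().upper()} for r in rows]

# Load: write the cleaned records to the target table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
con.executemany("INSERT INTO payments VALUES (:id, :amount, :currency)", clean)
total = con.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 13.75
```

Production pipelines in Databricks, Spark, or Kafka follow the same three-stage shape, just with distributed or streaming inputs in place of the in-memory CSV.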
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita Consulting
Power BI/Tableau developer, with a solid track record in delivering complex solutions. They will be proficient in desktop and cloud, with expert-level experience in calculated metrics, data modelling, and optimising reports. You will: Lead the design and development of visualisation dashboards that meet complex data requirements and present information in a clear and concise manner. Drive data … design with business users. Ability to perform data blending, data aggregation and complex calculated fields when required. Hands-on experience with SQL and knowledge of data warehousing and dimensional modelling will be advantageous. Experience using large data platforms such as Snowflake, Databricks or similar. Exposure to other visualisation platforms is helpful, but not essential. Required Characteristics: Cares …
quality dimensions and integrate metrics with centralized tools that measure data products' quality and reliability in the organization. Qualifications: Understanding of data engineering (including SQL, Python, Data Warehousing, ETL, Dimensional Modelling, Analytics). Understanding of cloud data infrastructure elements, and ideally AWS (Redshift, Glue, Athena, S3), and understanding of existing governance frameworks of data quality and their dimensions (DAMA …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
data quality dimensions and integrate metrics with centralized tools that measure data products' quality and reliability in the organization. Qualifications: Understanding of data engineering (including SQL, Python, Data Warehousing, ETL, Dimensional Modelling, Analytics). Understanding of cloud data infrastructure elements, and ideally AWS (Redshift, Glue, Athena, S3), and understanding of existing governance frameworks of data quality and their dimensions (DAMA …
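The "data quality dimensions" this listing refers to (as catalogued in frameworks such as DAMA's) include measures like completeness and uniqueness. A minimal sketch of computing two of them as metrics; the dataset and field names are invented for illustration:

```python
# Sketch of two common data quality dimensions -- completeness and uniqueness --
# computed as simple metrics. Dataset and field names are invented.
def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(1 for r in rows if r.get(field) not in (None, "")) / len(rows)

def uniqueness(rows, field):
    """Share of populated values that are distinct."""
    values = [r[field] for r in rows if r.get(field) not in (None, "")]
    return len(set(values)) / len(values)

records = [
    {"email": "a@example.com"},
    {"email": "a@example.com"},   # duplicate value
    {"email": None},              # missing value
    {"email": "b@example.com"},
]
print(completeness(records, "email"))  # 0.75
```

Centralised quality tools compute essentially these ratios per column at scale and surface them as dashboards and alerts.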
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
at enterprise level or in large-scale delivery. Proven experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure, GCP or AWS. Data modelling using Kimball, 3NF or dimensional methodologies. Analytics engineering lean, with experience within BigQuery (GCP) with data modelling in dbt, and mobile/telecoms industry experience would be … Considerable experience designing and building operationally efficient pipelines, utilising core cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer, PySpark etc. Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use. Strong understanding and/or use of Unity Catalog alongside core …
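The medallion architecture this listing names stages data through bronze (raw), silver (validated), and gold (curated) layers. A minimal sketch of the idea in plain Python; the event shape and layer contents are invented for illustration, not taken from the advert:

```python
# Minimal medallion-architecture sketch: raw events land in bronze, are cleaned
# into silver, and curated into a gold aggregate for analytics.
bronze = [  # raw, as-ingested records (may be malformed)
    {"user": "u1", "spend": "12.0"},
    {"user": "u1", "spend": "8.0"},
    {"user": None, "spend": "5.0"},   # unusable: no user key
]

# Silver: validated and typed records only
silver = [{"user": r["user"], "spend": float(r["spend"])}
          for r in bronze if r["user"] is not None]

# Gold: curated, analysis-ready aggregate, e.g. spend rolled up per user --
# in practice this layer would hold the curated dimensional models
gold = {}
for r in silver:
    gold[r["user"]] = gold.get(r["user"], 0.0) + r["spend"]
print(gold)  # {'u1': 20.0}
```

On a real platform each layer is a set of governed tables (e.g. Delta tables registered in Unity Catalog) rather than in-memory Python structures, but the bronze-to-silver-to-gold flow is the same.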
London, South East, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment Ltd
to deliver scalable, modern data solutions. What You’ll Do: Lead the technical design and delivery of the organisation’s analytic data platform. Define and own data architecture principles, modelling standards, and governance practices. Collaborate with architects, product owners, and external partners to shape scalable data solutions. Manage and mentor a high-performing team of Data Engineers. Support integration … strategic planning. What You’ll Bring: Proven experience designing and architecting complex, high-volume data systems. Strong hands-on knowledge of cloud platforms (AWS preferred), Snowflake, dbt, and data modelling tools. Experience with data integration, ETL, and API-based data exchange. Familiarity with Data Vault, dimensional modelling, and master data management. Excellent communication skills and ability to …
engagement and communication skills. Essential technical skills: SQL, Power BI, Power Automate. Preferred experience: A/B testing, Python, Databricks (especially for the manager role), ETL/ELT pipelines, data modelling (marts, dimensional models), Git/version control. Experience in pricing analytics and the Mastercard Test & Learn platform is a bonus. Details: Start date: ASAP. Duration: approximately 3 months, with …
stakeholders throughout complex projects involving data collection, linkage, analysis, and process automation. Navigating and resolving technical and governance challenges to enable more effective and secure data sharing. Build dimensional data models and tools that empower analysts and decision-makers. Support analysts with software development practices using R, Python, SQL, and Git. Lead on data integration, quality assurance, and … Analysts from Base and Enterprise layers provided by other Data Professionals, linking across datasets while tackling data quality gaps. Proactive and responsive development and maintenance of flexible, quality-assured dimensional models for use by analysts, ensuring dimensions are conformed wherever possible. Developing tools to collect, store and present data, metadata, and documentation that enable people to gather and apply … or postgraduate qualification in a relevant field such as data analysis, data science, or computer science, or equivalent professional experience (3+ years). Experience Essential: Strong knowledge of SQL, data modelling, and the use of Denodo or similar data virtualisation and integration software to support Reproducible Analytical Pipelines and ETL/ELT processes. Experience in data governance, particularly within the …
engineer or in a similar role, handling large datasets and complex data pipelines. Previous experience managing a team. Experience with big data processing frameworks and technologies. Experience with data modelling and designing efficient data structures. Experience with data integration and ETL (Extract, Transform, Load) processes. Experience in data cleansing, validation, and enrichment processes. Strong programming skills in languages such … as Python, Java, or Scala. Knowledge of data warehousing concepts and dimensional modelling. Understanding of data security, privacy, and compliance requirements. Proficiency in data integration and ETL tools. Strong analytical skills and the ability to understand complex data structures. Capable of identifying data quality issues, troubleshooting problems, and implementing effective solutions. Disclosure and Barring Service Check: This post is …
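This last listing names cleansing, validation, and enrichment as distinct steps. A minimal sketch of that three-step flow in Python; all field names and the lookup table are invented for illustration, not taken from the advert:

```python
# Sketch of the cleansing -> validation -> enrichment steps named above.
# The field names and lookup table are invented for illustration.
REGION_BY_COUNTRY = {"GB": "EMEA", "US": "AMER"}  # hypothetical enrichment lookup

def cleanse(record):
    """Normalise a raw record, e.g. strip stray whitespace from strings."""
    return {k: (v.strip() if isinstance(v, str) else v) for k, v in record.items()}

def validate(record):
    """Return a list of rule violations; an empty list means the record is valid."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    if record.get("country") not in REGION_BY_COUNTRY:
        errors.append("unknown country")
    return errors

def enrich(record):
    """Add derived attributes from reference data."""
    return {**record, "region": REGION_BY_COUNTRY[record["country"]]}

raw = {"id": "42 ", "country": " GB "}
record = cleanse(raw)
assert validate(record) == []   # passes validation only after cleansing
record = enrich(record)
print(record)  # {'id': '42', 'country': 'GB', 'region': 'EMEA'}
```

The same cleanse/validate/enrich split scales up directly to pipeline stages in an ETL tool, with invalid records routed to a quarantine table instead of raising errors.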