London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
engineers and driving adoption of DevOps and CI/CD best practices within the data function. Contribute to the evolution of a modern event-sourcing architecture, enabling efficient data modelling, streaming, and transformation across platforms. Collaborate with cross-functional teams - including Business Analysts, Product Owners, and fellow Senior Data Engineers - to translate business needs into robust technical solutions. Champion … Lake and streaming. Strong programming skills in Python with experience in software engineering principles, version control, unit testing and CI/CD pipelines. Advanced knowledge of SQL and data modelling (dimensional modelling, fact/dimension structures, slowly changing dimensions). Managing and querying data lakes or Lakehouses. Excellent communication skills with the ability to explain complex technical …
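The dimensional-modelling vocabulary this listing uses (fact/dimension structures, star schemas) is concrete enough to sketch. Below is a minimal, self-contained illustration using Python's built-in sqlite3; the table and column names are hypothetical, not taken from any listing:

```python
import sqlite3

# A minimal star schema: one fact table keyed to two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        amount       REAL
    );
""")
con.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "UK"), (2, "Globex", "DE")])
con.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024)])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240101, 99.0), (2, 20240101, 45.5)])

# A typical dimensional query: slice the fact table by dimension attributes.
for row in con.execute("""
    SELECT c.region, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c USING (customer_key)
    JOIN dim_date d USING (date_key)
    GROUP BY c.region, d.year
"""):
    print(row)
```

The same shape scales up: facts hold measures at a declared grain, dimensions hold the descriptive attributes queries filter and group by.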
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
warehousing. Minimum 2 years in a hands-on development capacity. Expertise in one or more modern cloud data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar. Understanding of data modelling principles, dimensional modelling, and database design. Proficiency in SQL and query optimization. Comprehensive knowledge of ETL/ELT processes and data pipeline architecture. Excellent communication skills with …
Demonstrated experience in data engineering within the Azure cloud environment. Proficiency with Azure Data Factory, Synapse Analytics, Databricks, Azure SQL, and Azure Storage. Strong SQL skills and expertise in dimensional modelling (e.g., star/snowflake schemas). Familiarity with Power BI dataflows, DAX, and RLS setup. Hands-on experience with Python, PySpark, or T-SQL for data processing …
months), ensuring that near-term work supports broader programme and client objectives. About You. Professional knowledge and experience (essential): Proven experience in data engineering, data integration and data modelling. Expertise with cloud platforms (e.g. AWS, Azure, GCP). Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks). Expertise with multiple data analytics tools (e.g. Power BI). Deep understanding of … data warehousing concepts, ETL/ELT pipelines and dimensional modelling. Proficiency in advanced programming languages (Python/PySpark, SQL). Experience in data pipeline orchestration (e.g. Airflow, Data Factory). Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc). Ability to communicate technical concepts to both technical and non-technical audiences. Proven experience in delivery of …
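Several of these listings name Airflow or Data Factory for pipeline orchestration. As a point of reference, here is a minimal sketch of what that looks like in code, assuming Airflow 2.4+ and its TaskFlow API; the DAG name, task names and logic are illustrative only:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_elt():
    @task
    def extract() -> list:
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "amount": 99.0}, {"id": 2, "amount": -1.0}]

    @task
    def transform(rows: list) -> list:
        # Apply cleaning/business rules before loading.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list) -> None:
        # Write to the warehouse; replaced with a print for the sketch.
        print(f"loading {len(rows)} rows")

    # TaskFlow infers the extract -> transform -> load dependency chain.
    load(transform(extract()))

example_elt()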
Leeds, Yorkshire, United Kingdom Hybrid/Remote Options
Fruition Group
Senior Analytics Engineer/Data Modeller role offers the chance to make a significant impact within an innovative, tech-driven organisation. You'll work at the forefront of data modelling, analytics engineering, and AI-enablement, helping shape scalable data products that power enterprise reporting, machine learning, and self-service insights. Senior Analytics Engineer/Data Modeller Responsibilities: Design, build and maintain analytical and AI-ready data marts sourced from the enterprise data lakehouse. Develop semantic models and dimensional structures optimised for BI, dashboarding, and machine learning. Ensure documentation, governance, and data consistency across domains. Collaborate with data engineers to support robust ETL/ELT pipelines and maintain end-to-end data lineage. Deploy analytics engineering solutions using dbt … single source of truth for reporting. Work with business teams to translate complex requirements into scalable, high-impact models. Support agile delivery of data products and continuous improvement of modelling standards. Senior Analytics Engineer/Data Modeller Requirements: Proven experience as an Analytics Engineer, Data Modeller, or similar position in a modern cloud-based data environment. Strong SQL skills …
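dbt recurs across these roles. Most dbt models are SQL files, but to stay with one language here: dbt-core 1.3+ also supports Python models on adapters such as Databricks or Snowflake. A hedged sketch, with the model and column names invented for illustration:

```python
# models/marts/fct_daily_sales.py -- a dbt Python model (dbt-core >= 1.3 on an
# adapter with Python-model support, e.g. Databricks). Names are illustrative.
def model(dbt, session):
    # dbt.ref() resolves an upstream model to a DataFrame
    # (a Spark DataFrame on the Databricks adapter).
    orders = dbt.ref("stg_orders")

    # Aggregate to one row per order date -- the declared grain of this mart.
    return (
        orders
        .groupBy("order_date")
        .sum("amount")
        .withColumnRenamed("sum(amount)", "total_amount")
    )
```

dbt materialises the returned DataFrame as a table and tracks the `stg_orders` dependency in its lineage graph, just as it would for a SQL model.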
Spotfire Lead Consultant (Spotfire to Power BI Migration). Location: Central London & Remote. Type: Contract. Spotfire Architect with strong data modelling skills to lead and support the migration of complex Spotfire reports to Power BI. The role requires deep technical expertise in Spotfire's visual and scripting frameworks, alongside the ability to reverse-engineer, redesign and optimize data models for …/dashboards in Spotfire. 8+ years of experience in BI & analytics with at least 5 years in Spotfire (including architecture, scripting, and advanced configurations). Strong experience with data modelling for BI platforms; dimensional modelling, fact/dimension tables, normalisation/denormalisation. Hands-on expertise with IronPython scripting, custom expressions, data functions (TERR/Python/R) …
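For readers unfamiliar with the Spotfire scripting framework this listing references: Spotfire embeds IronPython, and scripts manipulate the live document model. A tiny hedged sketch; the property name is hypothetical, and `viz` is assumed to be configured as a script parameter of type Visualization:

```python
# IronPython inside Spotfire: 'Document' is provided by the Spotfire runtime;
# 'viz' is assumed to be a script parameter of type Visualization.

# Read/write a document property (often bound to a dropdown or input field).
Document.Properties["SelectedRegion"] = "EMEA"

# Retitle a visual based on that property.
viz.Title = "Sales - " + Document.Properties["SelectedRegion"]
```

Migrating reports like this to Power BI typically means re-expressing such script-driven behaviour as slicers, parameters and DAX measures.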
Guildford, Surrey, United Kingdom Hybrid/Remote Options
Sussex Police
Commissioner's Office. Skills & Experience: A degree in statistics, mathematics or a statistically-related discipline. Demonstrable work experience with Microsoft Fabric or Mulesoft, or as a BI Developer or Data Scientist. Strong data modelling background with experience working with multiple data sources, both cloud and on-premise. Industry experience in delivering Power BI/Microsoft Analytics solutions, with a good grounding in all … BI Report Server or Cloud based Service, or similar visualisation experience. Databases including MS-SQL, Oracle Database, data warehousing, SQL and XML. Microsoft Fabric experience. R or Python predictive modelling/classification. Experience in building dashboards, reports and cubes using SQL, MDX, DAX, Power Query M Code, Power BI or other visualisation tools. Detailed understanding or experience of how … requirements and understand the business on different levels. Experience in developing new or maintaining and improving existing ETL processes. Experience in establishing and maintaining collaborative relationships with stakeholders. Understanding of dimensional modelling, ideally for the Public Sector. Good communication, interpersonal and presentation skills in order to influence internal stakeholders. Ability to self-manage and work in a dynamic environment. Experience …
and Warehouse environments. Implement CI/CD processes using Azure DevOps for secure, reliable deployment. Technical Skills: Strong expertise in Power BI and paginated reporting; SQL and data architecture; dimensional modelling (star schema, snowflake, denormalised structures, SCD handling); DAX, Visual Studio and data transformation logic; Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies … for structured and unstructured data. Proficiency in PySpark, T-SQL, Notebooks and advanced data manipulation; performance monitoring and orchestration of Fabric solutions; Power BI semantic models and Fabric data modelling; DevOps deployment using ARM/Bicep templates; end-to-end delivery of enterprise BI/data warehouse solutions. Reasonable Adjustments: Respect and equality are core values to us. We …
own the design and scalability of our data infrastructure, and ensure the team delivers high-quality, impactful solutions. You'll work across the full data lifecycle from ingestion and modelling to orchestration, monitoring, and analytics enablement, while also mentoring engineers and shaping engineering best practices. As a lead, you'll partner with our leadership team to make sure our … platforms in a high-growth or start-up environment. Strong expertise in Python and SQL, with deep experience in orchestration frameworks (Dagster, Airflow, Prefect). Advanced knowledge of data modelling and architecture (Kimball dimensional modelling, Data Vault etc). Hands-on experience with dbt, modern data warehouses, and AWS. Demonstrated ability to mentor and develop engineers. Desirable …
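Of the orchestration frameworks this listing names, Dagster takes a different angle from Airflow: pipelines are declared as software-defined assets with dependencies inferred from function signatures. A minimal hedged sketch, all names hypothetical:

```python
from dagster import asset, materialize

@asset
def raw_orders() -> list:
    # Ingest step: stubbed source extract.
    return [{"id": 1, "amount": 99.0}, {"id": 2, "amount": -1.0}]

@asset
def clean_orders(raw_orders: list) -> list:
    # Dagster wires the dependency from the parameter name matching
    # the upstream asset.
    return [r for r in raw_orders if r["amount"] > 0]

if __name__ == "__main__":
    # Materialise both assets in dependency order.
    materialize([raw_orders, clean_orders])
```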
e.g. subscription billing), SQL sources, SharePoint and SFTP using ADF + metadata-driven frameworks. Apply Lakeflow expectations for data quality, schema validation and operational reliability. Curated Data Layers & Modelling: Build clean, conformed Silver/Gold models aligned to enterprise business domains (customers, subscriptions, deliveries, finance, credit, logistics, operations). Deliver star schemas, harmonisation logic, SCDs and business marts … delivering production workloads on Azure + Databricks. Strong PySpark/Spark SQL and distributed data processing expertise. Proven Medallion/Lakehouse delivery experience using Delta Lake. Solid dimensional modelling (Kimball) including surrogate keys, SCD types 1/2, and merge strategies. Operational experience: SLAs, observability, idempotent pipelines, reprocessing, backfills. Mindset: Strong grounding in secure Azure Landing … Clear communicator who can translate technical decisions into business outcomes. Nice to Have: Databricks Certified Data Engineer Associate; streaming ingestion experience (Auto Loader, structured streaming, watermarking); subscription/entitlement modelling experience; advanced Unity Catalog security (RLS, ABAC, PII governance); Terraform/Bicep for IaC; Fabric Semantic Model/Direct Lake optimisation.
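The "SCD types 1/2 and merge strategies" this listing asks for map directly onto Delta Lake's MERGE API. A hedged, simplified PySpark sketch of a Type 2 upsert; it assumes a Spark session with Delta Lake configured, and the table and column names are invented for illustration:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # assumes Delta Lake is configured

# Incoming batch of customer changes (illustrative schema).
updates = spark.createDataFrame(
    [(1, "Acme Ltd", "UK")], ["customer_id", "name", "region"]
)

dim = DeltaTable.forName(spark, "gold.dim_customer")

# SCD Type 2, step 1: close out current rows whose attributes changed.
(dim.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.name <> s.name OR t.region <> s.region",
        set={"is_current": "false", "valid_to": "current_timestamp()"},
    )
    .execute())

# Step 2: append the new current versions. (Simplified: a production job
# would first filter `updates` down to changed or genuinely new keys.)
(updates
    .withColumn("is_current", F.lit(True))
    .withColumn("valid_from", F.current_timestamp())
    .withColumn("valid_to", F.lit(None).cast("timestamp"))
    .write.format("delta").mode("append").saveAsTable("gold.dim_customer"))
```

A Type 1 merge is the degenerate case: overwrite in place with `whenMatchedUpdateAll` and skip the validity columns entirely.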
with Data Engineers in the design, build, and maintenance of scalable data pipelines using Azure Data Factory and Databricks to automate data ingestion, transformation, and processing workflows. Create and maintain dimensional data models and semantic layers that support business intelligence and analytics use cases. Build and optimise data transformation workflows using dbt, SQL, and Python to create clean, well-documented … and modern data transformation tools (dbt strongly preferred), with experience in cloud data platforms (Azure Databricks, Snowflake, or similar). Proven experience designing and implementing scalable data architectures, including dimensional modelling, data lakehouse/warehouse concepts, and modern data stack technologies. Strong software engineering practices including version control (Git), CI/CD pipelines, code testing, and infrastructure as …
complex queries, stored procedures, and CTEs, with experience in query optimization, performance tuning, and designing data models for relational databases. Good understanding of relational databases, data warehousing concepts, and dimensional modelling (Kimball or similar). Experience collaborating within agile scrum delivery teams throughout the SDLC and contributing to iterative development cycles and methodologies. Familiarity with best practices for credential management …
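As a small illustration of the CTE-style SQL this listing calls out, a self-contained sketch using Python's built-in sqlite3; the schema and data are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'Acme', 120.0), (2, 'Acme', 80.0), (3, 'Globex', 40.0);
""")

# A common table expression names the aggregation step, keeping the
# statement that consumes it readable.
query = """
WITH customer_totals AS (
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer, total
FROM customer_totals
WHERE total > 100
ORDER BY total DESC;
"""
print(con.execute(query).fetchall())  # [('Acme', 200.0)]
```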
using Azure + Databricks, with Power BI sitting on top of a fully curated Gold Layer. They develop everything using PBIP + Git + Tabular Editor 3, and semantic modelling is treated as a first-class engineering discipline. This is your chance to own the creation of high-quality datasets and dashboards used across Operations, Finance, Sales, Logistics and … into insights the business relies on every day. Why This Role Exists: To turn clean, curated Gold Lakehouse data into trusted, enterprise-grade Power BI insights. You'll own semantic modelling, dataset optimisation, governance and best-practice delivery across a modern BI ecosystem. What You'll Do: Semantic Modelling with PBIP + Git: Build and maintain enterprise PBIP datasets, fully version-controlled in Git. Use Tabular Editor 3 for DAX, metadata modelling, calc groups and object governance. Manage branching, pull requests and releases via Azure DevOps. Lakehouse-Aligned Reporting (Gold Layer Only): Develop semantic models exclusively on top of curated Gold Databricks tables. Work closely with Data Engineering on schema design and contract-first modelling. Maintain …
inflight activities using Microsoft information architecture. Contribute to ERP implementation projects and data migration activities. Ensure data governance and compliance standards are maintained throughout all initiatives. Core Requirements: Data Modelling Expertise: Strong experience in developing and maintaining logical and conceptual data models. Microsoft Information Architecture: Solid understanding of Microsoft Purview and related data governance tools. ERP Implementation: Hands-on … principles. ETL/ELT Processes: Experience designing and implementing data integration workflows. Business Intelligence: Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar). Data Warehousing: Experience with dimensional modelling and data warehouse architecture patterns. API Integration: Understanding of REST/SOAP APIs and data service architectures. Data Security: Knowledge of data privacy regulations (GDPR) and security …
in an Agile environment. Expert data reporting and visualisation using Power BI, with strong Power Query and DAX skills. Experience of working with large data sets in an enterprise environment. Dimensional model design and implementation. Experience in Microsoft data components including: Azure Analysis Services, Databricks, Azure SQL Data Warehouse (Synapse Analytics). Tools and techniques for ensuring data quality, security, validation …
London, England, United Kingdom Hybrid/Remote Options
Focus on SAP
into data solutions. Design, build, and maintain data pipelines and transformations that turn raw data into structured, analysis-ready datasets. Develop and optimise data models (including Data Vault and dimensional modelling) to support large-scale analytics. Implement and enforce data quality, governance, and validation frameworks. Write complex SQL to ensure accuracy, performance, and scalability. Collaborate with data … functional teams to deliver insights and recommendations that drive business outcomes. Key Skills: Strong proficiency in SQL for data transformation and analytics. Hands-on experience with Snowflake, dbt, Data Modelling, and Data Vault methodologies. Solid understanding of data warehousing concepts and version control (GitHub). Experience with cloud platforms (AWS or Azure) and scripting in Python. Familiarity with …
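For readers unfamiliar with the Data Vault methodology this listing pairs with Snowflake and dbt: hubs store business keys, links store relationships between hubs, and satellites store change-tracked attributes, all typically keyed by deterministic hashes. A hedged, stdlib-only sketch of that hash-key convention; the delimiter and normalisation rules vary between implementations:

```python
import hashlib

def hash_key(*business_keys: str, delimiter: str = "||") -> str:
    """Deterministic surrogate key from one or more business keys, as
    commonly used for Data Vault hub/link keys (conventions vary)."""
    normalised = delimiter.join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Hub key: derived from a single business key.
hub_customer_hk = hash_key("CUST-0042")

# Link key: derived from the combined business keys of the related hubs.
link_customer_order_hk = hash_key("CUST-0042", "ORD-9001")

print(hub_customer_hk, link_customer_order_hk)
```

Because the keys are computed rather than issued by a sequence, loads into hubs, links and satellites can run independently and in parallel, which is much of the methodology's appeal.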
Data Modelling Consultant. Location: London (3 days per week onsite). Salary: £50,000 - £65,000 + benefits. The Company: Join a global management consultancy known for delivering large-scale data and digital transformation projects across financial services, capital markets, and energy. You will be part of a collaborative, fast-growing Data Architecture practice that helps some of the world … with stakeholders to gather and interpret data requirements. Support metadata management, documentation, and governance activities. Contribute to ETL design and data movement between systems. Gain hands-on experience with modelling tools such as ERWin, Sparx, or RSA. Your Skills and Experience: 2-4 years of experience in data modelling or data architecture. Exposure to relational and dimensional modelling (3NF, star, snowflake). Strong SQL and data warehousing knowledge. Experience in financial services or consulting is highly desirable. Strong communication and stakeholder engagement skills.
Bradford, England, United Kingdom Hybrid/Remote Options
Initi8 Recruitment
and deliver robust reporting solutions. You'll bring: Strong SQL and ETL development skills. Experience with Azure Data Factory and Power BI. Solid understanding of data warehousing principles and dimensional modelling. A collaborative mindset — you enjoy sharing ideas and improving processes. Why Join: You'll join a growing BI function within a business that genuinely values its engineering …
Newbury, Berkshire, South East, United Kingdom Hybrid/Remote Options
Fdo Consulting Limited
team development etc). Very strong technical skills that will include SQL, SSIS, SSRS, SAS, Power BI, Power Platform, Azure Data Factory, Azure Data Lake, and Databricks. A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle. Ability to design hybrid data solutions across on-prem and cloud data sources. Expert with data engineering tools and automation …