London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
engineers and driving adoption of DevOps and CI/CD best practices within the data function Contribute to the evolution of a modern event-sourcing architecture, enabling efficient data modelling, streaming, and transformation across platforms Collaborate with cross-functional teams - including Business Analysts, Product Owners, and fellow Senior Data Engineers - to translate business needs into robust technical solutions Champion … Lake and streaming Strong programming skills in Python with experience in software engineering principles, version control, unit testing and CI/CD pipelines Advanced knowledge of SQL and data modelling (dimensional modelling, fact/dimension structures, slowly changing dimensions) Managing and querying data lakes or Lakehouses Excellent communication skills with the ability to explain complex technical …
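For context on the skills this advert names, the following sketch shows one common way to apply a Type 2 slowly changing dimension update in PySpark. It is illustrative only: the table names (dim_customer, stg_customer_updates), the tracked attribute (segment) and the validity columns (valid_from, valid_to, is_current) are assumptions, not details from the role.

```python
# Illustrative Type 2 SCD update in PySpark; all table and column names are
# hypothetical, and the dimension is assumed to carry valid_from/valid_to/is_current.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

dim = spark.table("dim_customer").alias("d")          # existing dimension rows
src = spark.table("stg_customer_updates").alias("s")  # latest source snapshot

# 1. Current dimension rows whose tracked attribute has changed.
changed = (dim.filter(F.col("d.is_current"))
              .join(src, F.col("d.customer_id") == F.col("s.customer_id"))
              .filter(F.col("d.segment") != F.col("s.segment")))

# 2. Expire those rows by closing their validity window.
expired = (changed.select("d.*")
                  .withColumn("valid_to", F.current_date())
                  .withColumn("is_current", F.lit(False)))

# 3. Open a new current row carrying the changed attribute values.
opened = (changed.select("s.*")
                 .withColumn("valid_from", F.current_date())
                 .withColumn("valid_to", F.lit(None).cast("date"))
                 .withColumn("is_current", F.lit(True)))

# 4. Rows that did not change pass through untouched; write the union back.
changed_keys = changed.select(F.col("d.customer_id").alias("changed_id"))
untouched = dim.join(changed_keys,
                     F.col("d.customer_id") == F.col("changed_id"), "left_anti")

result = (untouched.unionByName(expired)
                   .unionByName(opened, allowMissingColumns=True))
result.write.mode("overwrite").saveAsTable("dim_customer_scd2")
```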
City of London, London, United Kingdom Hybrid / WFH Options
Billigence
warehousing Minimum 2 years in hands-on development capacity Expertise in one or more modern cloud data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar Understanding of data modelling principles, dimensional modelling, and database design Proficiency in SQL and query optimization Comprehensive knowledge of ETL/ELT processes and data pipeline architecture Excellent communication skills with …
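For readers less familiar with the dimensional modelling terminology these adverts repeat, here is a minimal, self-contained sketch of a star schema: a fact table of measures joined to a descriptive dimension. The tables, columns and data are invented for illustration.

```python
# Tiny star schema: a fact table holding measures and foreign keys, joined to a
# dimension table holding descriptive attributes. Entirely illustrative data.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,   -- surrogate key
    product_name TEXT,
    category TEXT
);
CREATE TABLE fct_sales (
    sale_id INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    sale_date TEXT,
    amount REAL                        -- additive measure
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware"), (3, "Licence", "Software")])
cur.executemany("INSERT INTO fct_sales VALUES (?, ?, ?, ?)",
                [(1, 1, "2024-01-05", 120.0), (2, 2, "2024-01-06", 80.0), (3, 3, "2024-01-06", 95.0)])

# A typical dimensional query: slice the additive measure by a dimension attribute.
cur.execute("""
    SELECT d.category, SUM(f.amount) AS total_sales
    FROM fct_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
""")
print(cur.fetchall())   # e.g. [('Hardware', 200.0), ('Software', 95.0)]
conn.close()
```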
Demonstrated experience in data engineering within the Azure cloud environment. Proficiency with Azure Data Factory, Synapse Analytics, Databricks, Azure SQL, and Azure Storage. Strong SQL skills and expertise in dimensional modelling (e.g., star/snowflake schemas). Familiarity with Power BI dataflows, DAX, and RLS setup. Hands-on experience with Python, PySpark, or T-SQL for data processing …
across the organisation. What you'll be doing: Build and maintain scalable data pipelines using Azure Data Factory and Databricks Develop transformation workflows in dbt, SQL, and Python Design dimensional models and semantic layers to support analytics use cases Implement automated data quality checks, monitoring, and alerting systems - linking issues to business impact Create reusable data assets and documentation … related roles Strong proficiency in SQL, Python, and dbt (strongly preferred) Hands-on experience with Azure Databricks and cloud-based data platforms (Snowflake experience also valued) Solid understanding of dimensional modelling, lakehouse/warehouse design, and modern data stack Familiarity with Git, CI/CD, and software engineering best practices Experience with Power BI (or Tableau/Looker …
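On the automated data quality checks mentioned above, a minimal sketch of a pipeline gate follows, assuming a Databricks/PySpark environment; the table name, column and threshold are illustrative, not requirements of the role.

```python
# Minimal data-quality gate: fail the pipeline when a critical column's null
# rate exceeds a threshold. Table name, column and threshold are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def null_rate(table: str, column: str) -> float:
    """Fraction of rows in `table` where `column` is NULL."""
    df = spark.table(table)
    total = df.count()
    if total == 0:
        return 0.0
    return df.filter(F.col(column).isNull()).count() / total

rate = null_rate("analytics.fct_orders", "customer_id")
if rate > 0.01:
    # Stating the business impact alongside the metric makes the alert actionable.
    raise ValueError(
        f"customer_id is NULL on {rate:.2%} of orders; downstream revenue "
        "attribution will undercount until the upstream feed is fixed"
    )
```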
in effectively working in a collaborative manner in a large, fast paced environment within a multifunctional technical team Nice to have Languages: PySQL, C#, R Reporting: PowerBI or similar Modelling: SQL Server SSAS Tabular (DAX)/Dimensional Modelling/Data Warehousing Development Management: Azure DevOps SQL MI/DB ETL: Active Batch, API, Event Hub, Kafka AI …
Spotfire Lead Consultant (Spotfire to Power BI Migration) Location: Central London & Remote Type: Contract Spotfire Architect with strong data modelling skills to lead and support the migration of complex Spotfire reports to Power BI. The role requires deep technical expertise in Spotfire's visual and scripting frameworks, alongside the ability to reverse-engineer, redesign and optimize data models for …/dashboards in Spotfire 8+ years of experience in BI & analytics with at least 5 years in Spotfire (including architecture, scripting, and advanced configurations). Strong experience with data modelling for BI platforms; dimensional modelling, fact/dimension tables, normalisation/denormalisation. Hands-on expertise with IronPython scripting, custom expressions, data functions (TERR/Python/R …
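On the Python data functions mentioned in the Spotfire roles: in recent Spotfire versions these typically receive input parameters as pandas objects bound to named variables and return results through output variables. The sketch below imitates that shape outside Spotfire; sales_in and sales_out are hypothetical parameter names, and the stubbed DataFrame stands in for what Spotfire would inject.

```python
# Shape of a Spotfire-style Python data function: inputs arrive as pandas
# objects bound to named variables, outputs are plain variables mapped back to
# Spotfire tables/columns. `sales_in`/`sales_out` are hypothetical names; the
# stubbed DataFrame below stands in for the table Spotfire would pass in.
import pandas as pd

sales_in = pd.DataFrame({
    "region": ["EMEA", "EMEA", "APAC"],
    "revenue": [120.0, 80.0, 95.0],
})

# Pre-aggregate to the grain the target Power BI model expects post-migration.
sales_out = sales_in.groupby("region", as_index=False)["revenue"].sum()
print(sales_out)
```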
City of London, London, United Kingdom Hybrid / WFH Options
Staffworx
Spotfire Technical Lead/Hands-On Architect (Spotfire to Power BI Migration) Location: Central London & Remote Type: Contract Spotfire Architect with strong data modelling skills to lead and support the migration of complex Spotfire reports to Power BI. The role requires deep technical expertise in Spotfire's visual and scripting frameworks, alongside the ability to reverse-engineer, redesign and …/dashboards in Spotfire 8+ years of experience in BI & analytics with at least 5 years in Spotfire (including architecture, scripting, and advanced configurations). Strong experience with data modelling for BI platforms; dimensional modelling, fact/dimension tables, normalisation/denormalisation. Hands-on expertise with IronPython scripting, custom expressions, data functions (TERR/Python/R …
integration Integration & Interoperability Experience connecting Microsoft BI tools with Tableau, Amazon QuickSight, or similar platforms Understanding of REST APIs, Power BI Embedded, and programmatic data access patterns Data Engineering & Modelling Strong T-SQL skills for data retrieval and performance tuning Knowledge of dimensional modelling, star/snowflake schemas, and data warehouse best practices Preferred Qualifications Microsoft certifications …
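As one example of the programmatic data access patterns referenced, the hedged sketch below queues a Power BI dataset refresh over the REST API. The dataset ID and bearer token are placeholders, Azure AD token acquisition (e.g. via MSAL) is omitted, and the endpoint path should be checked against current Microsoft documentation before use.

```python
# Hedged sketch: trigger a Power BI dataset refresh via the REST API using the
# `requests` library. DATASET_ID and TOKEN are placeholders; acquiring the
# Azure AD access token is out of scope here, and the endpoint should be
# verified against the current Power BI REST API documentation.
import requests

DATASET_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
TOKEN = "<azure-ad-access-token>"                    # placeholder

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"notifyOption": "NoNotification"},
    timeout=30,
)
resp.raise_for_status()  # 202 Accepted indicates the refresh was queued
```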
and Warehouse environments Implement CI/CD processes using Azure DevOps for secure, reliable deployment Technical Skills: Strong expertise in: Power BI and paginated reporting SQL and data architecture Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies … for structured and unstructured data Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation Performance monitoring and orchestration of Fabric solutions Power BI semantic models and Fabric data modelling DevOps deployment using ARM/Bicep templates End-to-end delivery of enterprise BI/data warehouse solutions Reasonable Adjustments: Respect and equality are core values to us. We …
with Data Engineer in the design, build, and maintain scalable data pipelines using Azure Data Factory and Databricks to automate data ingestion, transformation, and processing workflows. Create and maintain dimensional data models and semantic layers that support business intelligence and analytics use cases. Build and optimise data transformation workflows using dbt, SQL, and Python to create clean, well-documented … and modern data transformation tools (dbt strongly preferred), with experience in cloud data platforms (Azure Databricks, Snowflake, or similar). Proven experience designing and implementing scalable data architectures, including dimensional modelling, data lakehouse/warehouse concepts, and modern data stack technologies. Strong software engineering practices including version control (Git), CI/CD pipelines, code testing, and infrastructure as …
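Dimensional models of the kind described usually include a conformed date dimension; as a small illustration, the pandas sketch below generates one. The date range and attribute columns are arbitrary choices for the example, not details from the advert.

```python
# Illustrative build of a small calendar/date dimension with pandas; the range
# and attributes are arbitrary choices for the sketch.
import pandas as pd

dates = pd.date_range("2024-01-01", "2024-12-31", freq="D")
dim_date = pd.DataFrame({
    "date_key": dates.strftime("%Y%m%d").astype(int),  # surrogate key as yyyymmdd
    "date": dates,
    "year": dates.year,
    "quarter": dates.quarter,
    "month": dates.month,
    "month_name": dates.strftime("%B"),
    "day_of_week": dates.day_name(),
    "is_weekend": dates.dayofweek >= 5,
})
print(dim_date.head())
```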
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Ignite Digital Search Ltd
ETL pipelines using Azure Data Factory, SSIS, and SQL Server Developing and optimising stored procedures and queries for data transformation and integration Building and maintaining data warehouse solutions and dimensional data models Supporting data integration projects and ensuring data quality, accuracy, and consistency Delivering insights through Power BI dashboards and reports Using Python and PowerShell for automation and data … We're Looking For Strong experience with SQL Server (T-SQL, stored procedures, optimisation) Hands-on expertise with Azure Data Factory (ADF) and SSIS Solid understanding of data warehousing and dimensional modelling Proven experience building ETL/data integration solutions Exposure to Power BI, Python, and PowerShell is highly desirable Financial services or insurance experience is desirable but not essential Excellent problem-solving skills with a proactive, can …
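On pairing SQL Server stored procedures with Python automation, a minimal sketch of that pattern follows using pyodbc; the server, database, procedure name and parameter are hypothetical.

```python
# Hedged sketch of automating a SQL Server ETL step from Python with pyodbc.
# Server, database, procedure name and parameter are all placeholders.
import pyodbc
from datetime import date

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.local;DATABASE=DataWarehouse;"
    "Trusted_Connection=yes;TrustServerCertificate=yes;"
)

with pyodbc.connect(conn_str) as conn:
    with conn.cursor() as cur:
        # Execute a (hypothetical) load procedure for today's batch.
        cur.execute("EXEC dbo.usp_LoadSalesFact @BatchDate = ?", date.today())
    conn.commit()
```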
inflight activities using Microsoft information architecture. Contribute to ERP implementation projects and data migration activities. Ensure data governance and compliance standards are maintained throughout all initiatives. Core Requirements Data Modelling Expertise: Strong experience in developing and maintaining logical and conceptual data models. Microsoft Information Architecture: Solid understanding of Microsoft Purview and related data governance tools ERP Implementation: Hands-on … principles. ETL/ELT Processes: Experience designing and implementing data integration workflows. Business Intelligence: Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar) Data Warehousing: Experience with dimensional modelling and data warehouse architecture patterns API Integration: Understanding of REST/SOAP APIs and data service architectures. Data Security: Knowledge of data privacy regulations (GDPR) and security …
in an Agile environment. Expert data reporting and visualisation using Power BI & strong Power Query and DAX skills. Experience of working with large data sets in an enterprise environment. Dimensional model design and implementation. Experience in Microsoft data components including: Azure Analysis Services Databricks Azure SQL Data Warehouse (Synapse Analytics) Tools and techniques for ensuring data quality, security, validation …
London, England, United Kingdom Hybrid / WFH Options
Focus on SAP
into data solutions. Design, build, and maintain data pipelines and transformations that turn raw data into structured, analysis-ready datasets. Develop and optimise data models (including Data Vault and dimensional modelling) to support large-scale analytics. Implement and enforce data quality, governance, and validation frameworks. Write complex SQL to ensure accuracy, performance, and scalability. Collaborate with data … functional teams to deliver insights and recommendations that drive business outcomes. Key Skills: Strong proficiency in SQL for data transformation and analytics. Hands-on experience with Snowflake, dbt, Data Modelling, and Data Vault methodologies. Solid understanding of data warehousing concepts and version control (GitHub). Experience with cloud platforms (AWS or Azure) and scripting in Python. Familiarity with …
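On the Data Vault methodology this role asks for: hubs and links are keyed on deterministic hashes of the business key. The sketch below derives an MD5-based hub hash key in the way many Snowflake/dbt implementations do; the hub and column choices are invented, and MD5 is a common convention rather than a requirement.

```python
# Illustrative derivation of a Data Vault hub hash key: normalise the business
# key, then hash it deterministically. Hub/column names are invented; MD5 is a
# common convention in Data Vault 2.0 implementations, not a requirement.
import hashlib

def hub_hash_key(*business_key_parts: str) -> str:
    """Upper-case, trim and concatenate the business key parts, then MD5 them."""
    normalised = "||".join(part.strip().upper() for part in business_key_parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest().upper()

# e.g. a customer hub keyed on source system plus customer number
print(hub_hash_key("crm", " 000123 "))  # deterministic across loads and systems
```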
Data Modelling Consultant Location: London (3 days per week onsite) Salary: £50,000 - £65,000 + benefits The Company Join a global management consultancy known for delivering large-scale data and digital transformation projects across financial services, capital markets, and energy. You will be part of a collaborative, fast-growing Data Architecture practice that helps some of the world … with stakeholders to gather and interpret data requirements Support metadata management, documentation, and governance activities Contribute to ETL design and data movement between systems Gain hands-on experience with modelling tools such as ERWin, Sparx, or RSA Your Skills and Experience 2-4 years of experience in data modelling or data architecture Exposure to relational and dimensional modelling (3NF, star, snowflake) Strong SQL and data warehousing knowledge Experience in financial services or consulting is highly desirable Strong communication and stakeholder engagement skills …
Newbury, Berkshire, South East, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
team development etc) Very strong technical skills that will include - SQL, SSIS, SSRS, SAS, Power BI, Power Platform, Azure Data Factory, Azure Data Lake, Databricks A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle Ability to design hybrid data solutions across on-prem and cloud data sources Expert with data engineering tools and automation …
Role – Technology Lead Technology – Snowflake, SQL Server, PostgreSQL, Data Warehousing, Data Lake, Medallion, Data Vault, Data Fabric, Semantic modelling Location – UK Business Unit – DNAINS Compensation – Competitive (including bonus) Job Description We are seeking an experienced Data Modeler who can analyse and understand the existing customer data and analytics models on SQL Server, and design efficient data models for Snowflake. … strong understanding of Insurance domain concepts—including Specialized Insurance, London Market, and Regulatory Data—is essential for success in this role. Your role In the role of a Data Modelling Lead, you will be responsible for leading the design and transformation of enterprise data models to support scalable, high-performance analytics and reporting solutions. You will analyse existing data … teams to translate complex reporting and analytics needs into efficient, well-governed data models. Your leadership will be key in identifying gaps, driving improvements, and implementing best practices in dimensional modelling, normalization, and metadata management. A strong understanding of the Insurance domain—including Specialized Insurance, London Market operations, and Regulatory Data—is essential. You will apply this domain …
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
a team of data engineers, analytics engineers, data scientists and AI specialists to design and evolve scalable data platforms and modern data products that enable self-service analytics, advanced modelling, and AI-driven decision-making across our insurance business. What you’ll do: Design and manage scalable cloud data platforms (Databricks on AWS) across development, staging, and production environments … ensuring reliable performance and cost efficiency. Integrate and model data from diverse sources – including warehouses, APIs, marketing platforms, and operational systems – using DBT, Delta Live Tables, and dimensional modelling to deliver consistent, trusted analytics. Enable advanced AI and ML use cases by building pipelines for vector search, retrieval-augmented generation (RAG), feature engineering, and model deployment. Ensure security … contributing to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement initiatives. What we’d love you to bring: Proven, hands-on expertise in data modelling, with a strong track record of designing and implementing complex dimensional models, star and snowflake schemas, and enterprise-wide canonical data models Proficiency in converting intricate insurance business …
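On the vector search and retrieval-augmented generation (RAG) use cases listed, the core retrieval step is a nearest-neighbour search over embeddings. The toy numpy sketch below shows cosine-similarity retrieval; the random vectors stand in for real embeddings, and a production pipeline would use an embedding model and a managed vector index.

```python
# Toy cosine-similarity retrieval of the kind that underpins vector search /
# RAG. Random vectors stand in for document and query embeddings; a production
# pipeline would use a real embedding model and a managed vector index.
import numpy as np

rng = np.random.default_rng(seed=0)
doc_embeddings = rng.normal(size=(1000, 384))   # 1,000 documents, 384-dim embeddings
query_embedding = rng.normal(size=384)

# Normalise so dot products equal cosine similarity.
doc_norm = doc_embeddings / np.linalg.norm(doc_embeddings, axis=1, keepdims=True)
query_norm = query_embedding / np.linalg.norm(query_embedding)

scores = doc_norm @ query_norm                  # cosine similarity per document
top_k = np.argsort(scores)[::-1][:5]            # indices of the 5 closest documents
print(top_k, scores[top_k])
```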
Preston, Lancashire, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
experience as a Senior or Lead data engineer Experience handling large datasets, complex data pipelines, big data processing frameworks and technologies AWS or Azure cloud experience Experience with data modelling, data integration, ETL processes and designing efficient data structures Strong programming skills in Python, Java, or Scala Data warehousing concepts and dimensional modelling experience Any data engineering … in the office). To apply, press apply now or send your CV to Keywords: Lead Data Engineer/Azure Databricks/AWS/ETL/Python/data modelling Flexible working - Preston - Blackpool - Manchester - Warrington - Liverpool - Bolton - Blackburn Circle Recruitment is acting as an Employment Agency in relation to this vacancy. Earn yourself a referral bonus if you …
City of London, London, United Kingdom Hybrid / WFH Options
Travelex
and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Experience designing scalable data models, warehouse/lakehouse architectures, and dimensional modelling for analytics. Confidence in coding, scripting, configuring, versioning, debugging, testing and deploying. Skilled in optimising queries, pipelines, and storage for cost and performance efficiency. Good understanding of …
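On consuming data via webhooks, as mentioned above: the minimal sketch below accepts a webhook POST, validates the payload shape and hands it off for downstream processing. Flask, the route path and the field names are illustrative choices, not details from the advert.

```python
# Minimal webhook receiver: accept a POST, validate the payload shape, and hand
# it off for downstream processing. Flask, the route and the field names are
# illustrative choices only.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhooks/orders", methods=["POST"])
def receive_order_event():
    event = request.get_json(silent=True) or {}
    if "order_id" not in event:
        return jsonify({"error": "missing order_id"}), 400
    # In a real pipeline this would enqueue the event (e.g. onto a stream) rather
    # than process it inline, keeping the endpoint fast and idempotent.
    print(f"received order event {event['order_id']}")
    return jsonify({"status": "accepted"}), 202

if __name__ == "__main__":
    app.run(port=8080)
```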