Role – Technology Lead Technology – Snowflake, SQL Server, PostgreSQL, Data Warehousing, Data Lake, Medallion, Data Vault, Data Fabric, Semantic modelling Location – UK Business Unit – DNAINS Compensation – Competitive (including bonus) Job Description We are seeking an experienced Data Modeler who can analyse and understand the existing customer data and analytics models on SQL Server, and design efficient data models for Snowflake. … strong understanding of Insurance domain concepts—including Specialized Insurance, London Market, and Regulatory Data—is essential for success in this role. Your role As a Data Modelling Lead, you will lead the design and transformation of enterprise data models to support scalable, high-performance analytics and reporting solutions. You will analyse existing data … teams to translate complex reporting and analytics needs into efficient, well-governed data models. Your leadership will be key in identifying gaps, driving improvements, and implementing best practices in dimensional modelling, normalization, and metadata management. A strong understanding of the Insurance domain—including Specialized Insurance, London Market operations, and Regulatory Data—is essential. You will apply this domain …
Reigate, England, United Kingdom Hybrid/Remote Options
esure Group
a team of data engineers, analytics engineers, data scientists and AI specialists to design and evolve scalable data platforms and modern data products that enable self-service analytics, advanced modelling, and AI-driven decision-making across our insurance business. What you’ll do: Design and manage scalable cloud data platforms (Databricks on AWS) across development, staging, and production environments … ensuring reliable performance and cost efficiency. Integrate and model data from diverse sources – including warehouses, APIs, marketing platforms, and operational systems – using dbt, Delta Live Tables, and dimensional modelling to deliver consistent, trusted analytics. Enable advanced AI and ML use cases by building pipelines for vector search, retrieval-augmented generation (RAG), feature engineering, and model deployment. Ensure security … contributing to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement initiatives. What we’d love you to bring: Proven, hands-on expertise in data modelling, with a strong track record of designing and implementing complex dimensional models, star and snowflake schemas, and enterprise-wide canonical data models Proficiency in converting intricate insurance business …
Preston, Lancashire, England, United Kingdom Hybrid/Remote Options
Circle Recruitment
experience as a Senior or Lead data engineer Experience handling large datasets, complex data pipelines, big data processing frameworks and technologies AWS or Azure cloud experience Experience with data modelling, data integration ETL processes and designing efficient data structures Strong programming skills in Python, Java, or Scala Data warehousing concepts and dimensional modelling experience Any data engineering … in the office). To apply, press apply now or send your CV to Keywords: Lead Data Engineer/Azure Databricks/AWS/ETL/Python/data modelling Flexible working - Preston - Blackpool - Manchester - Warrington - Liverpool - Bolton - Blackburn Circle Recruitment is acting as an Employment Agency in relation to this vacancy. Earn yourself a referral bonus if you …
particular focus on enhancing fan engagement through digital platforms. Key Responsibilities Design and develop ETL/ELT pipelines in Azure and Databricks, ensuring reliability and performance. Construct Kimball-style dimensional models to support analytics and reporting. Implement automated testing for data quality assurance and validation. Ensure compliance with data governance, legal, and regulatory standards. Collaborate with the wider …
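The "Kimball-style dimensional models" this listing asks for pair a central fact table (measures at a fixed grain) with descriptive dimension tables, forming a star schema. A minimal illustrative sketch using SQLite follows; the table and column names (`dim_fan`, `fact_engagement`, `page_views`) are hypothetical examples chosen to echo the fan-engagement context, not taken from the listing:

```python
import sqlite3

# In-memory database for illustration; a real warehouse would sit on a
# platform such as Databricks, Snowflake, or BigQuery.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per fan, holding descriptive attributes.
cur.execute("""
    CREATE TABLE dim_fan (
        fan_key  INTEGER PRIMARY KEY,
        fan_name TEXT,
        country  TEXT
    )
""")

# Fact table: one row per fan per day, holding numeric measures and a
# foreign key back to the dimension.
cur.execute("""
    CREATE TABLE fact_engagement (
        fan_key    INTEGER REFERENCES dim_fan(fan_key),
        event_date TEXT,
        page_views INTEGER
    )
""")

cur.executemany("INSERT INTO dim_fan VALUES (?, ?, ?)",
                [(1, "Alice", "UK"), (2, "Bob", "FR")])
cur.executemany("INSERT INTO fact_engagement VALUES (?, ?, ?)",
                [(1, "2024-01-01", 5), (1, "2024-01-02", 3), (2, "2024-01-01", 7)])

# The typical analytic pattern: aggregate facts, sliced by dimension attributes.
rows = cur.execute("""
    SELECT d.country, SUM(f.page_views)
    FROM fact_engagement f
    JOIN dim_fan d ON d.fan_key = f.fan_key
    GROUP BY d.country
    ORDER BY d.country
""").fetchall()
print(rows)  # [('FR', 7), ('UK', 8)]
```

The design choice is the one Kimball's methodology emphasises: keep facts narrow and additive, keep descriptive attributes in dimensions, and let reporting tools group and filter by dimension columns.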
City of London, London, United Kingdom Hybrid/Remote Options
Travelex
and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Experience designing scalable data models, warehouse/lakehouse architectures, and dimensional modelling for analytics. Confidence in coding, scripting, configuring, versioning, debugging, testing and deploying. Skilled in optimising queries, pipelines, and storage for cost and performance efficiency. Good understanding of …
Preston, Lancashire, England, United Kingdom Hybrid/Remote Options
Circle Recruitment
as a lead data engineer/Data Engineering Manager with some management experience Experience handling large datasets, complex data pipelines, big data processing frameworks and technologies Experience with data modelling, data integration ETL processes and designing efficient data structures Strong programming skills in Python, Java, or Scala Data warehousing concepts and dimensional modelling experience Any data engineering … week in the office). To apply, press apply now or send your CV to Keywords: Head of Data Engineering/Azure Databricks/ETL/Python/data modelling Flexible working - Preston - Blackpool - Manchester - Warrington - Liverpool - Bolton - Blackburn Circle Recruitment is acting as an Employment Agency in relation to this vacancy. Earn yourself a referral bonus if you …
. Deep expertise in the Microsoft Data Stack (Azure, Power BI, SQL) and Cloud Data Platforms (Fabric, Databricks, or Snowflake). High proficiency in SQL and Python. Expertise in Dimensional Modelling and defining robust Data Governance. Proven ability to lead strategy, mentor teams, and remain comfortable with hands-on technical work when required. What's On Offer …
City of London, London, United Kingdom Hybrid/Remote Options
Harnham
all data projects while engaging with senior stakeholders to deliver impactful solutions. Specifically, you can expect to be involved in the following: Designing and implementing both relational (3NF) and dimensional (star/snowflake) data models for large-scale financial systems and analytical platforms. Working hands-on with ETL design, metadata management, and data cataloguing, ensuring robust data lineage and … Visio, and handling semi-structured data formats (JSON, XML, Parquet). SKILLS AND EXPERIENCE The successful Data Modeller will have the following skills and experience: Proven experience in data modelling within consulting or financial services environments. Strong expertise in ERWin (or equivalent tools such as Sparx, RSA, Visio). Hands-on experience with relational and dimensional modelling …
higher) qualification is required, and holding Solution Architect or Master Anaplanner status is a strong advantage. Applicants should possess knowledge of the Anaplan Integrated Finance Planning Application, a solid understanding of multi-dimensional modelling, agile methodologies, and Anaplan best practices. Advanced Excel model building skills are essential, and experience with reporting systems such as Qlik or Power BI will be beneficial. Exceptional communication skills are required, as …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
What you’ll bring: Experience in analytics, AI/data science, or data product environments - ideally within ad tech, media, or digital. Strong grasp of data warehousing concepts - data flows, ETL, dimensional modelling, and data quality. Solid SQL skills and the ability to interrogate complex data from diverse sources. Hands-on understanding of cloud-based data architectures. A proven ability …
Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation and accessibility. Collaboration: Work closely with analysts and stakeholders to deliver data products and troubleshoot issues. Analytics Enablement: Implement dimensional models to provide clean, reusable datasets for reporting and advanced analytics. Monitoring & Reliability: Maintain monitoring, alerting, and cost-management processes for Snowflake and pipelines. Continuous Improvement: Contribute to shared … transformations and pipeline automation. Practical experience with Snowflake features and RBAC management. Familiarity with ingestion tools (Airbyte, Fivetran, Hevo) and cloud services (AWS preferred). Solid understanding of data modelling, governance principles, and BI enablement (Power BI). Knowledge of CI/CD and version-controlled development practices in Git. Desirable: Experience with enterprise systems (CRM, BSS/OSS …
the right time. Essentially, to ensure you succeed in this role you're going to need: Deep, hands-on experience designing and building data warehouses with strong command of dimensional modeling (e.g., Kimball methodology) Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management Advanced SQL skills and production-level experience using dbt (or similar tools) to …
new business opportunities and innovative solutions. Participate in RFIs, RFPs, and presales pitches to secure new engagements. Your Profile: Essential skills/knowledge/experience: Extensive experience leading multi-dimensional model development on the Anaplan platform, including configuring existing models, adding new functionality, and optimizing modeling solutions for performance and scalability. Proven track record as the solution architecture lead …
You'll design scalable, future-proof data architectures and provide technical leadership across transformation programmes: Architecture & Design: Design enterprise data architectures for Snowflake and Azure cloud ecosystems Create dimensional data models (star schema, snowflake schema, Data Vault) supporting analytics and BI Define data strategy and governance frameworks – lineage, cataloging, security, compliance Lead solution design for data warehouse, data … Conduct architecture reviews and design authority for data solutions Mentor engineers on best practices in data modeling, performance optimization, and cloud architecture ESSENTIAL SKILLS: Data Modeling expertise - Star schema, dimensional modeling, Data Vault, logical/physical design Snowflake Cloud Data Platform - Architecture design, performance tuning, cost optimization, governance Azure Data Factory - Pipeline architecture, orchestration patterns, best practices Azure Databricks …
Business Intelligence Engineer 12 Months Hybrid - London Inside IR35 Up to £500 a day Key Skills: Data modelling, data warehousing, dashboarding Technical: AWS, SQL, Python Key Responsibilities: - Build end-to-end dashboarding solutions to optimize EU CF Operations using AWS - Data source exploration, data warehousing expansion, horizontal datasets for downstream consumption - Work with customers to build Dashboards with the … using AWS CDK - Proficiency in ETL/ELT processes and best practices - Experience with data visualization tools (QuickSight) Required Skills: - Strong analytical and problem-solving abilities - Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD Type 2) - Experience with agile development methodologies - Strong communication skills and ability to work with cross-functional teams - Background in data …
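The "SCD Type 2" this listing names is the slowly-changing-dimension pattern in which history is preserved by closing out the current dimension row and inserting a new versioned row, rather than overwriting in place. A minimal sketch in plain Python follows; the field names (`valid_from`, `valid_to`, `is_current`, `business_key`) are common conventions assumed for illustration, not taken from the listing:

```python
from datetime import date

def apply_scd2(dim_rows, business_key, new_attrs, as_of):
    """Apply a Type 2 change: expire the current row for the key, append a new one."""
    for row in dim_rows:
        if row["business_key"] == business_key and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows  # no attribute changed; keep the current row as-is
            row["is_current"] = False
            row["valid_to"] = as_of  # close out the superseded version
    dim_rows.append({"business_key": business_key, **new_attrs,
                     "valid_from": as_of, "valid_to": None, "is_current": True})
    return dim_rows

# One current row for customer C1, living in London since 2023.
dim_customer = [{"business_key": "C1", "city": "London",
                 "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]

# The customer moves: the London row is expired, a Manchester row becomes current.
apply_scd2(dim_customer, "C1", {"city": "Manchester"}, date(2024, 6, 1))
print([(r["city"], r["is_current"]) for r in dim_customer])
# [('London', False), ('Manchester', True)]
```

In a warehouse this same merge-and-expire logic is typically expressed as a SQL `MERGE` or a dbt snapshot rather than row-by-row Python; the sketch only shows the versioning semantics.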
central Data Platform, working closely with the Data Management Team. Analysis of normalised transactional data and analytic data sources. Analysis of semi-structured and file-based data extracts. Data modelling to transform operational data into analytic/reporting structures such as Kimball-style multi-dimensional models. 💡 Strong SQL, Sigma, Power BI, Matillion, dbt, Python Essential Skills for the Lead … Data Analyst: Proven Data Analyst experience of at least 5 years, within a Financial Services environment, with particular emphasis on Risk or cross asset derivatives areas. Data modelling, cleansing and enrichment, with experience in conceptual, logical and physical data modelling and data cleansing and standardisation. Familiarity with Data warehousing and analytical data structures. Data quality assurance, validation and …
feedback efficiently in an iterative manner Develop and abide by standard methodologies for scalable and repeatable insights. Translate existing spreadsheets/mockups and business problems into sophisticated Anaplan multi-dimensional models Modify existing models as part of a connected solution, optimization, or to incorporate new functionality Implement standard processes and documentation for Anaplan models Lead data integration and migration …
and data products. Influence and inform decision-makers using compelling data storytelling, strategic analysis, and visual narratives. Act as a subject matter expert in Power BI, including DAX, data modelling, and visualization best practices. Build strong relationships with cross-functional stakeholders, ensuring alignment between BI outputs and business goals. Contribute to the continuous improvement of BI practices, tools, and … work under pressure and work to strict reporting deadlines. Excellent organisational and time management skills. Technical skills: Data Manipulation, Data Validation, Data Cleaning, Row Level Security, Relational Database Management, Dimensional Modelling. Soft skills: Stakeholder Management, Problem Solving, Communication, Logical Reasoning, Questioning Nature, Dataflows, Data Pipelines, Data Validation, Attention to Detail Programmes: Power BI, SQL, Microsoft Packages Desirable Criteria: Experience …
lead and design data engineering solutions for our client. What You'll Do: Lead technical design using Medallion architecture and Azure Services. Create conceptual diagrams, source-to-target mapping, and dimensional data models. Guide the data engineering team on development with ADF, Databricks, Python, JSON, and YAML. Implement performance optimization strategies during data transformation. Deploy code artifacts using GitHub Workflows/CI-CD … stakeholders. Optional: Work with Log Analytics and KQL queries Must-Have Skills: 10+ years of experience in Data Engineering Hands-on experience with PySpark, ADF, Databricks, SQL Strong understanding of dimensional modeling, normalization, schema design, and data harmonization Experience with Erwin and data modeling tools. Excellent communication, problem-solving, and client-facing skills Why Join Us: Work in a dynamic, collaborative …
helping enterprise clients structure, govern, and leverage data effectively. The Role As a Senior Data Modeller, you will act as a hands-on lead across client projects, setting data modelling standards and ensuring delivery across multiple stakeholders. You will design frameworks, define best practice for data structures, and drive consistency across systems while remaining highly involved in technical delivery. … Key Responsibilities Lead on data modelling design and implementation across client programmes Create relational and dimensional models for enterprise data warehouses and analytics platforms Define metadata standards, data catalogues, and lineage documentation Collaborate with architects, engineers, and business stakeholders to ensure data models meet business needs Support data migration, governance, and transformation initiatives Your Skills and Experience 5+ … years of experience in data modelling or architecture, ideally within financial services or consulting Strong knowledge of relational and dimensional design (3NF, star, snowflake) Proficient in ERWin, Sparx, or similar tools Experience working with semi-structured data (JSON, XML, Parquet) Excellent communication and client-facing skills …
across the organisation. Key Responsibilities: Design and maintain scalable data platforms using cloud-native technologies such as Databricks and AWS across development, staging, and production environments Build and govern dimensional data models and semantic layers to power consistent and trusted analytics Integrate data from diverse sources including cloud warehouses, APIs, and operational systems Define semantic layers using dbt and … to ensure performance and cost efficiency Collaborate with cross-functional teams on architecture decisions, technical designs, and data governance standards You Will Have: Proven hands-on expertise in data modelling with a strong track record of designing and implementing complex dimensional models and enterprise-wide canonical data models. The ability to translate complex business processes into scalable and …