As a Data Architect, you'll design scalable, future-proof data architectures and provide technical leadership across transformation programmes.
Architecture & Design:
- Design enterprise data architectures for Snowflake and Azure cloud ecosystems
- Create dimensional data models (star schema, snowflake schema, Data Vault) supporting analytics and BI
- Define data strategy and governance frameworks – lineage, cataloging, security … challenges
- Conduct architecture reviews and act as design authority for data solutions
- Mentor engineers on best practices in data modeling, performance optimization, and cloud architecture
ESSENTIAL SKILLS:
- Data modeling expertise - star schema, dimensional modeling, Data Vault, logical/physical design
- Snowflake Cloud Data Platform - architecture design, performance tuning, cost optimization, governance
- Azure Data Factory - pipeline architecture, orchestration patterns, best practices
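The dimensional (star schema) modelling this role calls for can be illustrated with a minimal, self-contained sketch: one fact table keyed to surrounding dimension tables. All table and column names below are invented for illustration, not taken from any specific engagement.

```python
import sqlite3

# A minimal star schema: one fact table surrounded by dimension tables.
# Table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
""")
conn.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                 [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
conn.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                 [(20240101, 1, 3, 30.0), (20240102, 1, 2, 20.0), (20240102, 2, 1, 15.0)])

# A typical BI query: aggregate the fact table, slicing by a dimension attribute.
rows = conn.execute("""
    SELECT p.product_name, SUM(f.revenue) AS total_revenue
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.product_name ORDER BY p.product_name
""").fetchall()
print(rows)  # → [('Gadget', 15.0), ('Widget', 50.0)]
```

A snowflake schema differs only in that dimensions are further normalised (e.g. `category` moved into its own table referenced by `dim_product`).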
ETL) processes, leveraging tools such as Talend, Informatica, Matillion, Pentaho, MuleSoft, Boomi, or scripting languages (Python, PySpark, SQL). Solid understanding of data warehousing and data modelling techniques (Star Schema, Snowflake Schema). Familiarity with security frameworks (GDPR, HIPAA, ISO 27001, NIST, SOX, PII) and AWS security features (IAM, KMS, RBAC). Knowledge of Azure data engineering …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
experience as a Data Engineer working with Azure. Strong skills in Azure Data Factory, Synapse Analytics, Databricks, SQL Database, and Azure Storage. Excellent SQL and data modelling (star/snowflake, dimensional modelling). Knowledge of Power BI dataflows, DAX, and RLS. Experience with Python, PySpark, or T-SQL for transformations. Understanding of CI/CD and DevOps (Git, YAML …
or BI Analyst in mid-to-large data environments. Advanced SQL, with significant hands-on experience using Google BigQuery for analytics at scale. Strong data modelling skills (star/snowflake schema, optimisation, partitioning, clustering). Experience building dashboards using Looker Studio, Looker, Power BI, or Tableau. Proficiency working with large datasets and performance-tuning analytical queries. Strong …
DAX, M, advanced modelling techniques, DirectQuery, and Import. Strong SQL skills, with the ability to write complex queries for analysis and integration. Experience in dimensional modelling, including star and snowflake schema design. Demonstrated capability in time-travel and snapshot reporting methodologies. Practical experience with integrating data from Dataverse and Databricks into reporting workflows. Experience & Knowledge: Proven experience as …
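Snapshot reporting of the kind mentioned above is commonly implemented by stamping each periodic load with a snapshot date, so any historical state can be reconstructed. A minimal sketch follows; the field names and data are assumptions for illustration, not from the role itself.

```python
from datetime import date

# Each nightly load appends rows stamped with the snapshot date, so any
# historical state can be queried back ("time travel" over snapshots).
# All names and values here are illustrative.
snapshots = [
    {"snapshot_date": date(2024, 1, 1), "account": "A1", "balance": 100},
    {"snapshot_date": date(2024, 1, 1), "account": "A2", "balance": 250},
    {"snapshot_date": date(2024, 1, 2), "account": "A1", "balance": 120},
    {"snapshot_date": date(2024, 1, 2), "account": "A2", "balance": 250},
]

def as_of(rows, report_date):
    """Return the state of each account as of the given snapshot date."""
    return {r["account"]: r["balance"]
            for r in rows if r["snapshot_date"] == report_date}

print(as_of(snapshots, date(2024, 1, 1)))  # → {'A1': 100, 'A2': 250}
print(as_of(snapshots, date(2024, 1, 2)))  # → {'A1': 120, 'A2': 250}
```

In a warehouse this is the same filter expressed in SQL (`WHERE snapshot_date = :report_date`); engines such as Snowflake also offer built-in time travel over recent table history.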
IronPython automation, document properties. Conduct detailed analysis of Spotfire markings, filters and visual structures to produce functional specification documents for migration. Define and redesign semantic data models (star/snowflake schemas) suitable for Power BI. Collaborate with data engineers and Power BI developers to align source data, dataflows and model transformations. Work with business stakeholders to define functional parity …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
Fivetran
- Version Control: Git
WHAT YOU NEED
Essential:
- 4+ years in Data/Analytics Engineering or BI
- Strong SQL skills (production-level)
- DBT experience (essential)
- Cloud data warehouse experience (BigQuery, Snowflake, Redshift)
- Data modeling expertise
- User acquisition or marketing analytics background
- Mobile app or gaming industry experience (highly preferred)
- Strong communication skills
- Self-starter who works autonomously
- Degree in Engineering, Computer Science, Maths, or …
on position involves architecting and optimising scalable data pipelines to support advanced analytics, AI/ML initiatives, and actionable insights across the organisation. You'll take full ownership of the Snowflake platform implementation and adoption, ensuring it becomes the central hub for trusted, secure, and high-performing data. Acting as the technical authority, you'll define best practices, establish governance … to maximise platform value. This is an opportunity to shape the data landscape and deliver solutions that empower decision-making and innovation.
Key Responsibilities:
- Platform Leadership: Design, implement, and manage Snowflake as the enterprise data hub, ensuring scalability, security, and performance.
- Data Architecture & Strategy: Define frameworks for ingestion, replication, storage, and transformation across diverse data sources.
- Pipeline Development: Build efficient … ELT pipelines using tools such as DBT and Python, integrating operational, financial, and network data.
- Performance Optimisation: Configure Snowflake warehouses and partitioning strategies for cost efficiency and speed.
- Governance & Compliance: Implement data quality, lineage, and access control aligned with regulatory and security standards.
- Innovation: Drive adoption of advanced Snowflake features (Snowpark, Streams, Tasks, Secure Data Sharing) to enhance …
ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas) Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business …
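The validation, cleansing, and standardisation step mentioned above often amounts to a small set of declarative rules applied row by row, with failing records quarantined rather than loaded. A hedged sketch in plain Python; the field names and rules are invented for illustration.

```python
# Minimal data validation/standardisation pass: rows that fail a rule are
# quarantined rather than loaded. Field names and rules are illustrative.
RULES = {
    "email":  lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows):
    clean, rejected = [], []
    for row in rows:
        failures = [f for f, ok in RULES.items() if not ok(row.get(f))]
        if failures:
            rejected.append({**row, "_failed_rules": failures})
        else:
            # Standardisation: trim and lower-case the email before loading.
            clean.append({**row, "email": row["email"].strip().lower()})
    return clean, rejected

clean, rejected = validate([
    {"email": " Alice@Example.COM ", "amount": 10},
    {"email": "not-an-email", "amount": -5},
])
print(clean[0]["email"])            # → alice@example.com
print(rejected[0]["_failed_rules"]) # → ['email', 'amount']
```

Recording which rule failed (rather than silently dropping rows) is what makes the rejected set auditable for data-quality reporting.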
standards for clients across financial services and energy, helping to shape data strategies and improve architecture maturity. ROLE AND RESPONSIBILITIES Design and deliver relational (3NF) and dimensional (star/snowflake) data models Define modelling frameworks and best practices across client engagements Work closely with architecture teams to ensure alignment with data strategies Support metadata management, governance, and documentation processes …
develop end-to-end Azure Data Warehouse solutions Build and maintain robust ETL/ELT pipelines using Azure Data Factory. Implement and maintain efficient data models and star/snowflake schemas. Optimize queries, improve performance, and ensure data quality and integrity. Develop and maintain Power BI dashboards and reports to deliver actionable insights to the business. Automate workflows and …
Stevenage, Hertfordshire, England, United Kingdom Hybrid/Remote Options
Akkodis
tools such as Talend, Informatica, Matillion, Pentaho, MuleSoft, Boomi, or scripting languages like Python, PySpark, and SQL. A solid understanding of data warehousing and modelling techniques, including Star and Snowflake schemas, is required. Ideally you will also have a comprehensive knowledge of AWS Glue. To succeed, you will demonstrate proven experience in data engineering, data migration, and ETL development …
shape and maintain a modern enterprise data platform. In this role, you'll design and optimise scalable data pipelines and models that bring data from core business systems into Snowflake, enabling analytics, reporting, and data-driven insights across the organisation. You'll translate strategic data architecture into robust technical solutions, ensuring the platform is reliable, performant, and well-structured. You … modelled data to support decision-making, operational reporting, and future AI/ML capabilities.
Key Responsibilities:
- Data Engineering Delivery: Build and maintain high-quality data pipelines and models in Snowflake to support analytics and reporting needs.
- Architecture Implementation: Apply defined data architecture standards to ingestion, transformation, storage, and optimisation processes.
- Pipeline Development: Develop robust ELT/ETL workflows using dbt and orchestration tools, ensuring reliability and maintainability.
- Performance & Cost Optimisation: Configure Snowflake warehouses and implement query optimisation techniques for efficiency.
- Data Quality & Governance: Apply data quality checks, lineage tracking, and security standards aligned with InfoSec and regulatory requirements.
- Feature Adoption: Leverage Snowflake capabilities (Tasks, Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation and accessibility.
- Collaboration …
such as ERWin, Sparx, or RSA Your Skills and Experience 2-4 years of experience in data modelling or data architecture Exposure to relational and dimensional modelling (3NF, star, snowflake) Strong SQL and data warehousing knowledge Experience in financial services or consulting is highly desirable Strong communication and stakeholder engagement skills
to produce entity relationship diagrams Programming/scripting experience (Python) Familiarity with a Software Development Cycle Management Tool (Azure DevOps, JIRA...)
Advantageous skills and experience:
- Data modelling (relational, star, snowflake schemas)
- Azure Data Warehouse/Synapse
- Source control (Git)
- C# programming
- NoSQL databases (Mongo/CosmosDB)
- JSON, XML, REST API
- Power BI
Non-technology:
- Must be a team player
- Should have …
into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role … s confident communicating with data, product, and engineering teams, not a 'heads-down coder' type.
Top 4 Core Skills:
- Python - workflow automation, data processing, and ETL/ELT development.
- Snowflake - scalable data architecture, performance optimisation, and governance.
- SQL - expert-level query writing and optimisation for analytics and transformations.
- dbt (Data Build Tool) - modular data modelling, testing, documentation, and version … build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production …
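Pipelines like the dbt/Snowflake workflows described above commonly use an incremental, high-watermark pattern: each run processes only source rows newer than the latest timestamp already loaded. A minimal stdlib sketch of the idea (table and column names are assumptions; dbt expresses the same pattern with its incremental materialisation):

```python
import sqlite3

# Incremental (high-watermark) load: only source rows newer than the last
# loaded timestamp are merged into the target. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, updated_at TEXT, total REAL);
CREATE TABLE tgt_orders (order_id INTEGER PRIMARY KEY, updated_at TEXT, total REAL);
""")
conn.executemany("INSERT INTO src_orders VALUES (?,?,?)", [
    (1, "2024-01-01T10:00", 9.99),
    (2, "2024-01-02T11:00", 19.99),
    (3, "2024-01-03T12:00", 4.50),
])

def incremental_load(conn):
    # Watermark = max timestamp already in the target ('' when empty,
    # which sorts before every real timestamp string).
    (wm,) = conn.execute(
        "SELECT COALESCE(MAX(updated_at), '') FROM tgt_orders").fetchone()
    # Upsert anything newer than the watermark.
    conn.execute("""
        INSERT OR REPLACE INTO tgt_orders
        SELECT * FROM src_orders WHERE updated_at > ?""", (wm,))

incremental_load(conn)                 # first run loads all 3 rows
conn.execute("INSERT INTO src_orders VALUES (4, '2024-01-04T09:00', 7.25)")
incremental_load(conn)                 # second run picks up only row 4
count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
print(count)  # → 4
```

The benefit is that reruns are idempotent and cheap: already-loaded history is never reprocessed, which is also the main cost lever for warehouse compute.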
london, south east england, united kingdom Hybrid/Remote Options
Decentralized Masters
Masters What will you be doing? As the Analytics Engineer, you'll be at the center of how data flows across Decentralized Masters. You'll architect and own our Snowflake data warehouse, design automated data pipelines from marketing platforms, CRM systems, and payment tools, and build dashboards that turn complex data into clear, actionable insights for our leadership team. … engineering, SQL development, and business intelligence, offering a chance to make a direct impact on company-wide decision-making as we scale from $50M to unicorn status.
Key Responsibilities
Snowflake Data Warehouse Management:
- Architect and manage our Snowflake environment (schemas, warehouses, users, roles)
- Design and implement data models using Bronze → Silver → Gold layer architecture
- Write advanced SQL to … transform, clean, and aggregate data from multiple sources
- Optimize query performance and costs in Snowflake
- Implement data governance and security best practices
Data Pipeline & Integration Management:
- Configure and maintain ETL/ELT tools (Fivetran, Airbyte, or similar) to sync data into Snowflake
- Connect data sources: HubSpot (CRM/marketing), ad platforms (Meta, Google, TikTok), Stripe (payments), and analytics …
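The Bronze → Silver → Gold ("medallion") layering mentioned above raises raw landed data through successive refinement stages: Bronze is data as received, Silver is typed and cleaned, Gold is business-level aggregates. A toy illustration; the source fields and metrics are invented, not from this role.

```python
# Toy medallion (Bronze -> Silver -> Gold) flow. Field names are invented.
bronze = [  # Bronze: raw rows exactly as landed from the source
    {"lead_id": "1", "channel": "meta",   "spend": "10.0"},
    {"lead_id": "2", "channel": "google", "spend": "5.5"},
    {"lead_id": "3", "channel": "meta",   "spend": "bad"},   # malformed row
]

def to_silver(rows):
    """Silver: typed, cleaned rows; malformed records are dropped."""
    out = []
    for r in rows:
        try:
            out.append({"lead_id": int(r["lead_id"]),
                        "channel": r["channel"],
                        "spend": float(r["spend"])})
        except ValueError:
            continue  # in practice, quarantine rather than discard
    return out

def to_gold(rows):
    """Gold: business-level aggregate, e.g. spend per channel."""
    agg = {}
    for r in rows:
        agg[r["channel"]] = agg.get(r["channel"], 0.0) + r["spend"]
    return agg

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # → {'meta': 10.0, 'google': 5.5}
```

In Snowflake each layer is typically a schema (or database), with tools like dbt or Fivetran moving data between them.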
initiatives Your Skills and Experience 5+ years of experience in data modelling or architecture, ideally within financial services or consulting Strong knowledge of relational and dimensional design (3NF, star, snowflake) Proficient in ERWin, Sparx, or similar tools Experience working with semi-structured data (JSON, XML, Parquet) Excellent communication and client-facing skills
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
DBT Cloud Data Engineer (Snowflake, Azure) | Insurance | London (Hybrid) Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced DBT Data Engineer to join a major insurance client engagement. The role focuses on scaling a Snowflake Data Warehouse and expanding its DBT Cloud modelling capabilities to support … NI & Tax deducted at source, unlike umbrella companies and no umbrella company admin fees) You'll be part of a growing data engineering function focused on DBT model development, Snowflake optimisation, and data governance across multiple data domains. This role suits a technically strong engineer with proven DBT Cloud experience who can take ownership of data pipelines and drive … in transformation, testing, and automation.
Key Skills & Experience:
- Deep DBT Cloud expertise, including models, macros, tests, documentation, and CI/CD integration.
- Hands-on experience developing and optimising in Snowflake Cloud Data Warehouse (schemas, warehouses, security, time travel, performance tuning).
- Familiarity with Snowflake cost monitoring, governance, replication, and environment management.
- Strong understanding of data modelling (star/…
This role demands proven, hands-on experience in the following areas: Foundational Modeling: Absolute mastery of OLAP/OLTP Modeling and extensive experience in Dimensional Data Modeling (Star/Snowflake schemas). Architecture Design: Expert in Data Architecture and designing modern Data Lakehouse Architecture. Cloud Platform: Proven architectural experience with GCP is mandatory. Data Governance: Strong conceptual understanding …
integrate and validate models Maintain clear communication with stakeholders and manage their expectations Job Requirements: Proven experience in data modeling with a strong understanding of dimensional modeling, star and snowflake schemas, and normalisation principles Experienced in stakeholder management and comfortable navigating ambiguous requirements Effective communication skills with the ability to engage and manage business stakeholders Familiarity with MSFT Purview …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
requirements and support business deliverables.
* Collect, transform, and process datasets from various internal and external sources, ensuring data quality, governance, and integrity.
* Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management.
* Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability.
* Troubleshoot and resolve data pipeline issues swiftly and … technologies in data engineering, and continuously improve your skills and knowledge.
Profile
* Minimum 3 years' experience working as a Data Engineer in a commercial environment.
* Strong commercial experience with Snowflake and DBT.
* Proficient in SQL and experienced in data modelling within cloud data warehouses.
* Familiarity with cloud platforms such as AWS or Azure.
* Experience with Python, Databricks, or related …
while also helping define platform standards and best practices. Key responsibilities include:
- Build and maintain ELT pipelines
- Take full ownership of data ingestion
- Support data modelling and architecture within Snowflake
- Own and evolve the dbt layer, including governance and access controls
- Collaborate across analytics, product, and engineering teams
- Contribute to platform improvements, automation, and optimisation
YOUR SKILLS AND EXPERIENCE … A successful Senior Data Engineer will bring:
- Strong SQL skills
- Experience with dbt in a production environment
- Snowflake experience is desirable
- Exposure to AWS
- Confident mentoring peers and contributing to a collaborative, high-impact team
- Experience working in fast-paced, agile environments with modern data workflows
THE BENEFITS: You will receive a salary of up to £55,000 depending …
london, south east england, united kingdom Hybrid/Remote Options
Xpand Group
landscapes (MDM, CRM, ERP, Cloud DWH) through pragmatic and scalable architectural blueprints. Cloud Data Platform Leadership Design and implement high-performance cloud data platforms (AWS, Azure, Google Cloud, Databricks, Snowflake), overseeing data modelling, integration, transformation, and DevOps pipelines. Integrated Solution Architecture Design seamless integrations between cloud data platforms, AI/GenAI platforms, and business-critical systems (e.g., MDM, CRM … as a Data Architect, leading design and implementation of complex cloud-based data ecosystems. Solid engineering background with hands-on data platform implementation experience (AWS, Azure, GCP, Databricks, or Snowflake). Proven ability to evaluate data architecture decisions, influence business and IT stakeholders, and define strategic data direction. Strong understanding of coding best practices, code quality tools (e.g., SonarQube …