As a Data Architect, you'll design scalable, future-proof data architectures and provide technical leadership across transformation programmes.

Architecture & Design: Design enterprise data architectures for Snowflake and Azure cloud ecosystems. Create dimensional data models (star schema, snowflake schema, Data Vault) supporting analytics and BI. Define data strategy and governance frameworks (lineage, cataloging, security) … challenges. Conduct architecture reviews and act as design authority for data solutions. Mentor engineers on best practices in data modeling, performance optimization, and cloud architecture.

ESSENTIAL SKILLS: Data Modeling expertise - star schema, dimensional modeling, Data Vault, logical/physical design. Snowflake Cloud Data Platform - architecture design, performance tuning, cost optimization, governance. Azure Data Factory - pipeline architecture, orchestration patterns, best practices …
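The dimensional-modelling skills listed above (a star schema with fact and dimension tables) can be sketched concretely. The sketch below uses SQLite from the Python standard library purely as a stand-in warehouse; every table, column, and value is illustrative and not taken from any listing:

```python
import sqlite3

# A minimal star schema: one central fact table whose foreign keys point at
# denormalised dimension tables (the classic Kimball layout).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    full_date TEXT, year INTEGER, month INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY, sku TEXT, category TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER
);
""")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'SKU-001', 'Widgets')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240115, 1, 3), (20240115, 1, 1)])

# A typical BI query: aggregate the fact table, sliced by dimension attributes.
result = cur.execute("""
    SELECT d.year, p.category, SUM(f.quantity)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchall()
print(result)  # [(2024, 'Widgets', 4)]
```

The star layout keeps dimensions denormalised so analytical queries need few joins; Data Vault and snowflake variants normalise further at the cost of more joins.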
within the Azure cloud environment. Proficiency with Azure Data Factory, Synapse Analytics, Databricks, Azure SQL, and Azure Storage. Strong SQL skills and expertise in dimensional modelling (e.g., star/snowflake schemas). Familiarity with Power BI dataflows, DAX, and RLS setup. Hands-on experience with Python, PySpark, or T-SQL for data processing and automation. Understanding of CI/ …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
experience as a Data Engineer working with Azure. Strong skills in Azure Data Factory, Synapse Analytics, Databricks, SQL Database, and Azure Storage. Excellent SQL and data modelling (star/snowflake, dimensional modelling). Knowledge of Power BI dataflows, DAX, and RLS. Experience with Python, PySpark, or T-SQL for transformations. Understanding of CI/CD and DevOps (Git, YAML …
or BI Analyst in mid-to-large data environments. Advanced SQL, with significant hands-on experience using Google BigQuery for analytics at scale. Strong data modelling skills (star/snowflake schema, optimisation, partitioning, clustering). Experience building dashboards using Looker Studio, Looker, Power BI, or Tableau. Proficiency working with large datasets and performance-tuning analytical queries. Strong …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
Fivetran - Version Control: Git

WHAT YOU NEED
Essential:
4+ years in Data/Analytics Engineering or BI
Strong SQL skills (production-level)
DBT experience (essential)
Cloud data warehouse experience (BigQuery, Snowflake, Redshift)
Data modeling expertise
User acquisition or marketing analytics background
Mobile app or gaming industry experience (highly preferred)
Strong communication skills
Self-starter who works autonomously
Degree in Engineering, Computer Science, Maths, or …
IronPython automation, document properties. Conduct detailed analysis of Spotfire markings, filters and visual structures to produce functional specification documents for migration. Define and redesign semantic data models (star/snowflake schemas) suitable for Power BI. Collaborate with data engineers and Power BI developers to align source data, dataflows and model transformations. Work with business stakeholders to define functional parity …
develop end-to-end Azure Data Warehouse solutions. Build and maintain robust ETL/ELT pipelines using Azure Data Factory. Implement and maintain efficient data models and star/snowflake schemas. Optimize queries, improve performance, and ensure data quality and integrity. Develop and maintain Power BI dashboards and reports to deliver actionable insights to the business. Automate workflows and …
ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas) Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business …
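The "data validation, cleansing, and standardisation rules" this listing asks for can be illustrated with a minimal plain-Python sketch. The field names, mappings, and rules below are entirely hypothetical, chosen only to show the three stages:

```python
from datetime import datetime

def standardise(record: dict) -> dict:
    # Standardisation: normalise casing/whitespace and map country aliases
    # to ISO-style codes (hypothetical rule set).
    out = dict(record)
    out["email"] = out["email"].strip().lower()
    country = out["country"].strip().lower()
    out["country"] = {"uk": "GB", "united kingdom": "GB"}.get(country, country.upper())
    return out

def validate(record: dict) -> list[str]:
    # Validation: collect rule violations instead of failing fast,
    # so rejected rows can be reported together.
    errors = []
    if "@" not in record["email"]:
        errors.append("invalid email")
    try:
        datetime.strptime(record["signup_date"], "%Y-%m-%d")
    except ValueError:
        errors.append("invalid date")
    return errors

raw = [
    {"email": "  Alice@Example.COM ", "country": "United Kingdom", "signup_date": "2024-03-01"},
    {"email": "not-an-email", "country": "uk", "signup_date": "01/03/2024"},
]

clean, rejected = [], []
for rec in raw:
    rec = standardise(rec)
    errs = validate(rec)
    (clean if not errs else rejected).append((rec, errs))

print(len(clean), len(rejected))  # 1 1
```

In a real pipeline the same rules would typically run inside the orchestration layer (e.g. a Fabric or ADF activity), with rejected rows routed to a quarantine table rather than dropped.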
on position involves architecting and optimising scalable data pipelines to support advanced analytics, AI/ML initiatives, and actionable insights across the organisation. You'll take full ownership of the Snowflake platform implementation and adoption, ensuring it becomes the central hub for trusted, secure, and high-performing data. Acting as the technical authority, you'll define best practices, establish governance … to maximise platform value. This is an opportunity to shape the data landscape and deliver solutions that empower decision-making and innovation.

Key Responsibilities:
Platform Leadership: Design, implement, and manage Snowflake as the enterprise data hub, ensuring scalability, security, and performance.
Data Architecture & Strategy: Define frameworks for ingestion, replication, storage, and transformation across diverse data sources.
Pipeline Development: Build efficient … ELT pipelines using tools such as DBT and Python, integrating operational, financial, and network data.
Performance Optimisation: Configure Snowflake warehouses and partitioning strategies for cost efficiency and speed.
Governance & Compliance: Implement data quality, lineage, and access control aligned with regulatory and security standards.
Innovation: Drive adoption of advanced Snowflake features (Snowpark, Streams, Tasks, Secure Data Sharing) to enhance …
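The Streams and Tasks features named in this listing implement a common pattern: a stream captures changed rows, and a scheduled task periodically merges them into a target and consumes the stream. The sketch below imitates only the flow, using SQLite as a stand-in (Snowflake itself would use CREATE STREAM / CREATE TASK); all table names and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Stand-in for a Snowflake Stream: changed source rows land here.
CREATE TABLE customers_stream (id INTEGER, name TEXT);
-- Target table kept in sync by the periodic merge.
CREATE TABLE dim_customers (id INTEGER PRIMARY KEY, name TEXT);
""")

def run_merge_task(conn):
    # Stand-in for a scheduled Snowflake Task: MERGE pending stream rows
    # into the target, then consume (clear) the stream.
    with conn:
        conn.execute(
            "INSERT OR REPLACE INTO dim_customers SELECT id, name FROM customers_stream")
        conn.execute("DELETE FROM customers_stream")

conn.executemany("INSERT INTO customers_stream VALUES (?, ?)",
                 [(1, "Acme Ltd"), (2, "Globex")])
run_merge_task(conn)

conn.execute("INSERT INTO customers_stream VALUES (1, 'Acme Limited')")  # late update
run_merge_task(conn)

rows = conn.execute("SELECT id, name FROM dim_customers ORDER BY id").fetchall()
print(rows)  # [(1, 'Acme Limited'), (2, 'Globex')]
```

The key property the pattern gives you is exactly-once application of changes: each task run sees only rows that arrived since the previous run.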
Stevenage, Hertfordshire, England, United Kingdom Hybrid/Remote Options
Akkodis
tools such as Talend, Informatica, Matillion, Pentaho, MuleSoft, Boomi, or scripting languages like Python, PySpark, and SQL. A solid understanding of data warehousing and modelling techniques, including Star and Snowflake schemas, is required. Ideally you will also have comprehensive knowledge of AWS Glue. To succeed, you will demonstrate proven experience in data engineering, data migration, and ETL development …
shape and maintain a modern enterprise data platform. In this role, you'll design and optimise scalable data pipelines and models that bring data from core business systems into Snowflake, enabling analytics, reporting, and data-driven insights across the organisation. You'll translate strategic data architecture into robust technical solutions, ensuring the platform is reliable, performant, and well-structured. You … modelled data to support decision-making, operational reporting, and future AI/ML capabilities.

Key Responsibilities:
Data Engineering Delivery: Build and maintain high-quality data pipelines and models in Snowflake to support analytics and reporting needs.
Architecture Implementation: Apply defined data architecture standards to ingestion, transformation, storage, and optimisation processes.
Pipeline Development: Develop robust ELT/ETL workflows using … dbt and orchestration tools, ensuring reliability and maintainability.
Performance & Cost Optimisation: Configure Snowflake warehouses and implement query optimisation techniques for efficiency.
Data Quality & Governance: Apply data quality checks, lineage tracking, and security standards aligned with InfoSec and regulatory requirements.
Feature Adoption: Leverage Snowflake capabilities (Tasks, Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation and accessibility.
Collaboration …
such as ERWin, Sparx, or RSA

Your Skills and Experience:
2-4 years of experience in data modelling or data architecture
Exposure to relational and dimensional modelling (3NF, star, snowflake)
Strong SQL and data warehousing knowledge
Experience in financial services or consulting is highly desirable
Strong communication and stakeholder engagement skills …
into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role … s confident communicating with data, product, and engineering teams, not a 'heads-down coder' type.

Top 4 Core Skills:
Python - workflow automation, data processing, and ETL/ELT development.
Snowflake - scalable data architecture, performance optimisation, and governance.
SQL - expert-level query writing and optimisation for analytics and transformations.
dbt (Data Build Tool) - modular data modelling, testing, documentation, and version …

… build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production …
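The combination this role describes, Python workflows feeding modular dbt-style transformations, can be shown with a toy pipeline. The staging/mart naming below mirrors a common dbt layering convention; the data and model names are invented for illustration:

```python
import csv
import io

# Hypothetical raw extract (in practice this would come from an API or warehouse).
RAW = """order_id,amount,currency
1,100,gbp
2,250,GBP
3,80,usd
"""

def extract(text):
    # Extract: parse the raw feed into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def stg_orders(rows):
    # Staging model: type casting and standardisation only,
    # mirroring a dbt stg_* model.
    return [{"order_id": int(r["order_id"]),
             "amount": float(r["amount"]),
             "currency": r["currency"].upper()} for r in rows]

def mart_revenue_by_currency(rows):
    # Mart model: a business-level aggregate built on the staging layer.
    totals = {}
    for r in rows:
        totals[r["currency"]] = totals.get(r["currency"], 0.0) + r["amount"]
    return totals

revenue = mart_revenue_by_currency(stg_orders(extract(RAW)))
print(revenue)  # {'GBP': 350.0, 'USD': 80.0}
```

Keeping each step a pure function over rows is what makes the dbt analogy work: each "model" can be tested and documented in isolation, then composed.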
initiatives

Your Skills and Experience:
5+ years of experience in data modelling or architecture, ideally within financial services or consulting
Strong knowledge of relational and dimensional design (3NF, star, snowflake)
Proficient in ERWin, Sparx, or similar tools
Experience working with semi-structured data (JSON, XML, Parquet)
Excellent communication and client-facing skills …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
DBT Cloud Data Engineer (Snowflake, Azure) | Insurance | London (Hybrid)

Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced DBT Data Engineer to join a major insurance client engagement. The role focuses on scaling a Snowflake Data Warehouse and expanding its DBT Cloud modelling capabilities to support … NI & Tax deducted at source, unlike umbrella companies, and no umbrella company admin fees). You'll be part of a growing data engineering function focused on DBT model development, Snowflake optimisation, and data governance across multiple data domains. This role suits a technically strong engineer with proven DBT Cloud experience who can take ownership of data pipelines and drive … in transformation, testing, and automation.

Key Skills & Experience:
Deep DBT Cloud expertise, including models, macros, tests, documentation, and CI/CD integration.
Hands-on experience developing and optimising in Snowflake Cloud Data Warehouse (schemas, warehouses, security, time travel, performance tuning).
Familiarity with Snowflake cost monitoring, governance, replication, and environment management.
Strong understanding of data modelling (star/ …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
requirements and support business deliverables.
* Collect, transform, and process datasets from various internal and external sources, ensuring data quality, governance, and integrity.
* Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management.
* Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability.
* Troubleshoot and resolve data pipeline issues swiftly and … technologies in data engineering, and continuously improve your skills and knowledge.

Profile
* Minimum 3 years' experience working as a Data Engineer in a commercial environment.
* Strong commercial experience with Snowflake and DBT.
* Proficient in SQL and experienced in data modelling within cloud data warehouses.
* Familiarity with cloud platforms such as AWS or Azure.
* Experience with Python, Databricks, or related …
This role demands proven, hands-on experience in the following areas:
Foundational Modeling: Absolute mastery of OLAP/OLTP Modeling and extensive experience in Dimensional Data Modeling (Star/Snowflake schemas).
Architecture Design: Expert in Data Architecture and designing modern Data Lakehouse Architecture.
Cloud Platform: Proven architectural experience with GCP is mandatory.
Data Governance: Strong conceptual understanding …
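The Star/Snowflake distinction this role tests for has a concrete shape: in a snowflake schema the dimension itself is normalised into sub-tables. A minimal sketch, again using SQLite as a stand-in with invented names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Snowflake schema: category attributes live in their own table instead of
-- being denormalised into dim_product (which is what a star schema would do).
CREATE TABLE dim_category (category_key INTEGER PRIMARY KEY, category_name TEXT);
CREATE TABLE dim_product  (product_key INTEGER PRIMARY KEY, sku TEXT,
                           category_key INTEGER REFERENCES dim_category(category_key));
CREATE TABLE fact_sales   (product_key INTEGER REFERENCES dim_product(product_key),
                           quantity INTEGER);
""")
conn.execute("INSERT INTO dim_category VALUES (10, 'Widgets')")
conn.execute("INSERT INTO dim_product VALUES (1, 'SKU-001', 10)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", [(1, 2), (1, 5)])

# The extra join is the price of normalisation: star schemas trade storage
# for fewer joins; snowflake schemas make the opposite trade.
rows = conn.execute("""
    SELECT c.category_name, SUM(f.quantity)
    FROM fact_sales f
    JOIN dim_product  p ON f.product_key = p.product_key
    JOIN dim_category c ON p.category_key = c.category_key
    GROUP BY c.category_name
""").fetchall()
print(rows)  # [('Widgets', 7)]
```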
integrate and validate models. Maintain clear communication with stakeholders and manage their expectations.

Job Requirements: Proven experience in data modeling with a strong understanding of dimensional modeling, star and snowflake schemas, and normalisation principles. Experienced in stakeholder management and comfortable navigating ambiguous requirements. Effective communication skills with the ability to engage and manage business stakeholders. Familiarity with MSFT Purview …
GB and above). Build advanced DAX calculations using best practices for performance and maintainability. Perform data transformation using Power Query (M language). Integrate data from Snowflake, SQL Server, SharePoint, and other enterprise sources. Use Tabular Editor and DAX Studio for model optimization, versioning, and troubleshooting. Implement data model and report performance tuning techniques. Develop clean wireframes and … wealth management reporting. Understanding of P&L structures, multi-entity reporting, and operational metrics. If interested, please apply for an immediate discussion with the talent team. #UST

Skills: Power BI, Snowflake, Data Architecture
while also helping define platform standards and best practices.

Key responsibilities include:
Build and maintain ELT pipelines
Take full ownership of data ingestion
Support data modelling and architecture within Snowflake
Own and evolve the dbt layer, including governance and access controls
Collaborate across analytics, product, and engineering teams
Contribute to platform improvements, automation, and optimisation

YOUR SKILLS AND EXPERIENCE: … A successful Senior Data Engineer will bring:
Strong SQL skills
Experience with dbt in a production environment
Snowflake experience is desirable
Exposure to AWS
Confident mentoring peers and contributing to a collaborative, high-impact team
Experience working in fast-paced, agile environments with modern data workflows

THE BENEFITS: You will receive a salary of up to £55,000 depending …
understand and process changing requirements in a fast-moving environment. Ability to build good, professional relationships, both internally and externally, as a representative of Reed in Partnership. Technical skills: Snowflake Schema Design, Kimball Methodology, Star Schemas. Programmes: Snowflake …
We are searching, for one of our key clients, for a Senior Data Platform Engineer to support and optimise large-scale data ecosystems across PostgreSQL, Snowflake and Greenplum. This is a high-impact role within a modern digital environment, ideal for someone who thrives on complex data challenges and enterprise-grade engineering. You'll be responsible for designing scalable data models … pipelines. Expect to work in a cloud-driven setting with Azure, modern tooling, and critical data platforms that underpin major transformation programmes.

Key skills required:
Deep knowledge of PostgreSQL, Snowflake and Greenplum
Snowflake internals, schemas, modelling, data lakes and integration patterns
Data ingestion using Informatica, Talend and similar ETL tooling
Strong experience handling JSON, XML, CSV and multi … source datasets
Patroni expertise for HA/DR and streaming replication
Backup, recovery, tuning and optimisation across Postgres, Snowflake and Greenplum
Understanding of Azure environments

If you're ready to join a forward-thinking organisation delivering enterprise-level data solutions, please apply with an up-to-date CV that clearly matches the requirement. For more information, reach out via Robert at gazellegc.com