Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, built primarily on a Data Warehouse/Mart, Data Lake, self-service BI tools and a Data Science/ML platform using the Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, built primarily on a Data Warehouse/Mart, Data Lake, self-service BI tools and a Data Science/ML platform using the Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
Principal Data Consultant (Snowflake/Matillion) Application Deadline: 21 September 2025 Department: Data Engineering Employment Type: Full Time Location: Bristol, UK Compensation: £85,000 - £100,000/year Description As a Principal Data Consultant at Snap, you'll be at the forefront of our most strategic initiatives. Your role will involve leading client engagements, managing large-scale projects, and … as a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work … closely with our clients to design the right data models to support their analytics requirements, following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and …
5+ years' experience. The position is very likely to grow into a 'head of' role over time. You will scope, design and implement secure and effective database solutions in Snowflake, and build data models to store and retrieve business data for calculations and BI reporting. This includes implementing row-level security (RLS), access policies, and data masking. Ideal technical … skills: o Strong knowledge of SQL, including writing complex queries, stored procedures, optimizing performance, and working with relational databases like Snowflake or SQL Server o Hands-on experience with Snowflake, including data modelling, optimization, and managing cloud-based data warehousing solutions. o Experience with building and maintaining ETL data pipelines using tools such as Azure Data Factory o Solid experience … APIs, databases). o Hands-on experience with DBT (Data Build Tool) for building, testing, and documenting analytical data models. o Strong understanding of data modelling concepts (star/snowflake schema, slowly changing dimensions). o Experience integrating DBT with CI/CD pipelines (Azure DevOps) and orchestration tools (ADF) o Hands-on experience in developing interactive dashboards …
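To make the row-level security, access policy and masking work described in the listing above more concrete, here is a minimal Snowflake sketch. The policy, role, table and column names are hypothetical illustrations, not taken from the listing.

-- Hypothetical Snowflake example: row access policy plus dynamic data masking.
-- Role, schema, table and column names are illustrative only.

-- Row access policy: a role only sees rows for regions mapped to it.
CREATE OR REPLACE ROW ACCESS POLICY sales_region_policy
  AS (region_value VARCHAR) RETURNS BOOLEAN ->
    CURRENT_ROLE() = 'SYSADMIN'
    OR EXISTS (
      SELECT 1
      FROM security.region_role_map m
      WHERE m.region = region_value
        AND m.role_name = CURRENT_ROLE()
    );

ALTER TABLE analytics.fact_sales
  ADD ROW ACCESS POLICY sales_region_policy ON (region);

-- Masking policy: hide email addresses from non-privileged roles.
CREATE OR REPLACE MASKING POLICY email_mask
  AS (val STRING) RETURNS STRING ->
    CASE
      WHEN CURRENT_ROLE() IN ('SYSADMIN', 'DATA_STEWARD') THEN val
      ELSE '*** MASKED ***'
    END;

ALTER TABLE analytics.dim_customer
  MODIFY COLUMN email SET MASKING POLICY email_mask;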
enterprise data management and governance principles. Proven experience delivering Business Intelligence (BI) solutions and dashboards to enable data-driven decisions. Experience designing relational and dimensional data models (e.g. star schema, snowflake, etc.). Proficient in ETL and data warehousing, including handling slowly changing dimensions. Excellent communication and interpersonal skills, with the ability to liaise confidently between technical and …
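As an illustration of the slowly changing dimension handling mentioned in the listing above, here is a minimal Type 2 sketch in Snowflake-style SQL; the staging table, dimension table and columns are hypothetical:

-- Hypothetical Type 2 slowly changing dimension load (table/column names illustrative).
-- Step 1: close out current dimension rows whose tracked attributes have changed.
UPDATE dim_customer
SET valid_to = CURRENT_DATE, is_current = FALSE
FROM stg_customer s
WHERE dim_customer.customer_id = s.customer_id
  AND dim_customer.is_current = TRUE
  AND (dim_customer.email <> s.email OR dim_customer.segment <> s.segment);

-- Step 2: insert a new current row for each new or changed customer.
INSERT INTO dim_customer (customer_id, email, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, s.segment, CURRENT_DATE, NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id
 AND d.is_current = TRUE
WHERE d.customer_id IS NULL;  -- brand-new customers, plus rows just expired in step 1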
ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas) Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business …
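For context on the star/snowflake modelling mentioned above, a minimal star-schema sketch in warehouse-style SQL; the tables and columns are hypothetical, and the key constraints are informational (not enforced) in platforms such as Snowflake:

-- Hypothetical star schema: one fact table surrounded by dimensions.
CREATE TABLE dim_date (
    date_key      INT PRIMARY KEY,      -- e.g. 20250921
    calendar_date DATE,
    year          INT,
    month         INT
);

CREATE TABLE dim_product (
    product_key   INT PRIMARY KEY,      -- surrogate key
    product_code  VARCHAR(50),          -- natural/business key
    product_name  VARCHAR(200),
    category      VARCHAR(100)
);

CREATE TABLE fact_sales (
    date_key      INT REFERENCES dim_date (date_key),
    product_key   INT REFERENCES dim_product (product_key),
    quantity      INT,
    net_amount    NUMBER(18, 2)         -- additive measure
);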
EXPERIENCE: A successful Analytics Engineer will bring: Strong SQL skills and hands-on experience with dbt (or similar tools) Experience designing as well as building data models Exposure to Snowflake and/or data pipeline tools Understanding of testing, CI/CD, and data quality frameworks THE BENEFITS: You will receive a salary dependent on experience - up to …
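As a small illustration of the dbt work referenced above, a minimal dbt-style model (SQL with dbt's Jinja functions config() and ref()); the model, source and column names are hypothetical:

-- models/marts/fct_orders.sql (hypothetical dbt model)
-- Materialised as a table; builds on staging models via ref().
{{ config(materialized='table') }}

with orders as (
    select * from {{ ref('stg_orders') }}
),

payments as (
    select * from {{ ref('stg_payments') }}
)

select
    o.order_id,
    o.customer_id,
    o.ordered_at,
    sum(p.amount) as order_total      -- one row per order with its total paid
from orders o
left join payments p
    on p.order_id = o.order_id
group by 1, 2, 3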
autonomy, and the chance to shape the future of data in a high-impact environment. What You'll Do Design and build modular, reusable data models using dbt and Snowflake Collaborate with stakeholders to deeply understand use cases and deliver scalable data solutions Define and maintain the single source of truth for core business metrics Contribute to CI/ …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
feeds and related applications Writing, testing and peer review of ETL code in Oracle ODI Working with business users to design and configure self-serve data environments within our Snowflake data lake Analysing, developing, delivering, and managing BI reports Assisting in the design of the data processes, including data quality, reconciliation, testing, and governance Contributing to technical process improvement … support rota Minimum Criteria You'll need all of these. Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI Significant database experience in Snowflake or Oracle Star schema/dimensional modelling. Excellent SQL skills Good knowledge of standard data formats (XML, JSON, CSV, etc.) Proven experience of delivering BI solutions for business …
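To illustrate the star schema/dimensional reporting described in the listing above, a minimal query sketch against a hypothetical dimensional model (the table and column names are illustrative, not Motability's):

-- Hypothetical dimensional query: monthly sales by product category.
SELECT
    d.year,
    d.month,
    p.category,
    SUM(f.net_amount) AS total_sales
FROM fact_sales f
JOIN dim_date    d ON d.date_key    = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.year, d.month, p.category
ORDER BY d.year, d.month, total_sales DESC;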
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
feeds and related applications Writing, testing and peer review of ETL code in Oracle ODI Working with business users to design and configure self-serve data environments within our Snowflake data lake Analysing, developing, delivering, and managing BI reports Assisting in the design of the data processes, including data quality, reconciliation, testing, and governance Contributing to technical process improvement … support rota Minimum Criteria You'll need all of these. Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI Significant database experience in Snowflake or Oracle Star schema/dimensional modelling. Excellent SQL skills Good knowledge of standard data formats (XML, JSON, CSV, etc.) Proven experience of delivering BI solutions for business …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
feeds and related applications Writing, testing and peer review of ETL code in Oracle ODI Working with business users to design and configure self-serve data environments within our Snowflake data lake Analysing, developing, delivering, and managing BI reports Assisting in the design of the data processes, including data quality, reconciliation, testing, and governance Contributing to technical process improvement … support rota Minimum Criteria You'll need all of these. Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI Significant database experience in Snowflake or Oracle Star schema/dimensional modelling. Excellent SQL skills Good knowledge of standard data formats (XML, JSON, CSV, etc.) Proven experience of delivering BI solutions for business …
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Role, Work Mode: Hybrid. Role & Responsibilities: Define and implement the end-to-end architecture of a data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake-native tools. Collaborate with Data Governance teams to establish data lineage …
Factory/Databricks (PySpark/Scala) to build scalable data processing and transformation workflows for both batch and streaming data. Develop data models and implement data partitioning, indexing, and schema optimization to improve query performance in Azure Synapse Analytics. Data Warehouse & Analytics Solutions: Collaborate with data architects and business stakeholders to design and implement data lake and data … warehouse architectures on Azure. Create and optimize data models (star and snowflake schemas) for business intelligence (BI) and analytics workloads, ensuring high performance and scalability. Data Governance & Quality: Implement data governance practices to ensure data accuracy, completeness, and integrity across the Azure environment. Develop and enforce data quality checks and validation rules to maintain high levels of data consistency … large datasets. Experience with designing and managing data lakes using ADLS, with an understanding of hierarchical namespaces, partitioning strategies, and performance optimization techniques. Expertise in Data Modeling (star and snowflake schemas) and writing efficient SQL queries for analytical workloads. Experience with Azure DevOps for CI/CD pipeline creation and deployment automation for data pipelines and infrastructure. Proficiency with …
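As an example of the kind of data quality checks and validation rules mentioned above, a small post-load SQL sketch; the check names, tables and columns are hypothetical:

-- Hypothetical data quality checks run after a load: nulls, duplicates, referential gaps.
SELECT 'null_product_keys' AS check_name, COUNT(*) AS failing_rows
FROM fact_sales
WHERE product_key IS NULL

UNION ALL

SELECT 'duplicate_order_lines', COUNT(*)
FROM (
    SELECT order_id, product_key
    FROM fact_sales
    GROUP BY order_id, product_key
    HAVING COUNT(*) > 1
) AS dup

UNION ALL

SELECT 'orphaned_product_keys', COUNT(*)
FROM fact_sales f
LEFT JOIN dim_product p ON p.product_key = f.product_key
WHERE p.product_key IS NULL;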
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … have a robust Infrastructure background and a good understanding of the different complexities that come when moving one system to another. Let's talk tech. The platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able … to develop and maintain ELT and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is a Snowflake enthusiast and can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, good AWS …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing environment? Do … have a robust Infrastructure background and a good understanding of the different complexities that come when moving one system to another. Let's talk tech. The platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able … to develop and maintain ELT and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is a Snowflake enthusiast and can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, good AWS …
function and work closely with senior stakeholders, including programme sponsors and business leads. Key responsibilities - Platform Engineer Support the build and enhancement of a cloud-based data platform using Snowflake on Azure, working within an established technical framework. Develop infrastructure components and manage deployment through infrastructure-as-code practices. Collaborate with internal stakeholders to ensure the data platform design … assurance processes. Contribute to platform documentation and technical standards. There is potential for future responsibility in mentoring or overseeing junior team members. Required experience Hands-on experience working with Snowflake, including schema development and data pipeline integration. Familiarity with Azure services, including resource provisioning and identity setup. Experience using Terraform for infrastructure deployment (ideally with cloud data services …
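As a sketch of the Snowflake schema development referenced above, a minimal environment setup in Snowflake SQL; the object and role names are hypothetical, and in practice such objects might be provisioned through Terraform rather than written by hand:

-- Hypothetical Snowflake environment setup: database, schemas, warehouse and grants.
CREATE DATABASE IF NOT EXISTS analytics_db;

CREATE SCHEMA IF NOT EXISTS analytics_db.raw;        -- landed/ingested data
CREATE SCHEMA IF NOT EXISTS analytics_db.curated;    -- modelled, query-ready data

CREATE WAREHOUSE IF NOT EXISTS transform_wh
  WITH WAREHOUSE_SIZE = 'XSMALL'
       AUTO_SUSPEND = 60          -- seconds of inactivity before suspending
       AUTO_RESUME = TRUE;

-- Read-only access for an analyst role (role assumed to already exist).
GRANT USAGE ON DATABASE analytics_db TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics_db.curated TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.curated TO ROLE analyst_role;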
Stratford-upon-avon, Warwickshire, United Kingdom Hybrid / WFH Options
Big Red Recruitment
the data across the organisation. We need a few basics from you: Data pipeline creation Strong ETL development experience with Azure Data Factory Data warehouse and data storage concepts - star schema/snowflake/Kimball Azure Databricks and AI Power BI reporting skills would be advantageous Strong stakeholder management experience After a few weeks of induction and meeting the …
London, South East, England, United Kingdom Hybrid / WFH Options
Randstad Technologies
Design and implement solutions using GCP, with a strong focus on Data Lakehouse Architecture, Master Data Management (MDM), and Dimensional Data Modeling Work with modern databases and platforms, including Snowflake, Oracle, SQL Server, and PostgreSQL Apply Agile and conventional methodologies to manage development and delivery lifecycles Communicate effectively with stakeholders across the business to ensure alignment and engagement Required … Technical Skills: GCP Data Architecture Data Lakehouse Architecture MDM (Conceptual) Dimensional Data Modeling Snowflake, Oracle, SQL Server, PostgreSQL Python and Power BI (desirable) Knowledge of test automation tools and practices Strong understanding of Agile and software development best practices Ideal Candidate Profile: Extensive experience in delivering data programmes within the Insurance or Reinsurance sector Strong leadership and organisational skills …
Who We Are: We are interactive investor (ii), the UK's number one flat-fee investment platform, here to help our customers take control of their financial future. For a simple, flat monthly fee we provide a secure home for …