Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self-Service BI Tools and a Data Science/ML platform built on an Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives on solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake
Employment Type: Permanent, Part Time, Work From Home
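Several of the listings above and below ask for Star Schema and Snowflake Schema modelling experience. As a rough illustration of the distinction (the table and column names below are invented for this sketch and are not taken from any listing), the two DDL fragments show the same dimension modelled both ways:

```python
# Illustrative only: contrasts the star and snowflake schema styles named in
# the listing above. All table/column names are hypothetical.

STAR_SCHEMA_DDL = """
-- Star schema: each dimension is a single, denormalised table.
CREATE TABLE dim_vehicle (
    vehicle_key   INTEGER PRIMARY KEY,
    registration  VARCHAR,
    make_name     VARCHAR,        -- make attributes held inline
    model_name    VARCHAR
);

CREATE TABLE fact_lease (
    lease_key     INTEGER PRIMARY KEY,
    vehicle_key   INTEGER REFERENCES dim_vehicle (vehicle_key),
    date_key      INTEGER,
    monthly_cost  NUMBER(10, 2)
);
"""

SNOWFLAKE_SCHEMA_DDL = """
-- Snowflake schema: the same dimension is normalised into related tables.
CREATE TABLE dim_make (
    make_key   INTEGER PRIMARY KEY,
    make_name  VARCHAR
);

CREATE TABLE dim_vehicle (
    vehicle_key   INTEGER PRIMARY KEY,
    registration  VARCHAR,
    make_key      INTEGER REFERENCES dim_make (make_key)   -- extra join at query time
);
"""

if __name__ == "__main__":
    print(STAR_SCHEMA_DDL)
    print(SNOWFLAKE_SCHEMA_DDL)
```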
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self-Service BI Tools and a Data Science/ML platform built on an Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives on solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake
Employment Type: Permanent, Part Time, Work From Home
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
Principal Data Consultant (Snowflake/Matillion) Application Deadline: 21 September 2025 Department: Data Engineering Employment Type: Full Time Location: Bristol, UK Compensation: £85,000 - £100,000/year Description As a Principal Data Consultant at Snap, you'll be at the forefront of our most strategic initiatives. Your role will involve leading client engagements, managing large-scale projects, and … as a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely with our clients to design the correct data models to support their analytic requirements, following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and
is a hands-on role for a data purist who understands the nuances of financial data and has a passion for building robust, scalable, and accurate data products on Snowflake and dbt. Key Responsibilities: Take ownership of data modeling for our centralized Credit Risk Data Mart and Warehouse. Pioneer and implement Data Mesh methodologies, promoting domain-oriented data ownership … to develop robust data products. Design and implement data models using Data Vault 2.0, Star Schema, and Snowflake Schema with transformation pipelines in dbt. Lead the evaluation and approval of technical designs within a Technical Design Authority (TDA), ensuring standards are met. Work closely with C-level stakeholders and business leaders to translate complex requirements from … a key leader in a Data Mesh transformation. Experience leading or being a key member of a Technical Design Authority (TDA). Expertise in building enterprise data platforms on Snowflake. Proven ability to collaborate with and influence C-level stakeholders. NICE TO HAVE: Enterprise data architecture using TOGAF and ArchiMate. Design and establish a federated governance framework, including
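The role above mentions Data Vault 2.0 modelling with transformation pipelines in dbt. A minimal sketch of the Data Vault 2.0 hash-key convention is shown below in plain Python (in practice this usually lives in a dbt macro); the HUB_COUNTERPARTY structure and field names are assumptions for illustration, not details from the listing:

```python
# Minimal sketch of Data Vault 2.0-style hash keys; table/field names are hypothetical.
import hashlib
from datetime import datetime, timezone


def hash_key(*business_keys: str) -> str:
    """Deterministic hub hash key: trim, upper-case and '||'-delimit the
    business key parts, then MD5 the result (a common DV 2.0 convention)."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()


def hub_counterparty_row(counterparty_id: str, record_source: str) -> dict:
    """Build one row for a hypothetical HUB_COUNTERPARTY table."""
    return {
        "hub_counterparty_hk": hash_key(counterparty_id),
        "counterparty_id": counterparty_id,
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }


if __name__ == "__main__":
    print(hub_counterparty_row("CPTY-000123", "credit_risk_mart"))
```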
Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/DBT, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience working in agile environments and cross-functional teams. Desirable Experience in insurance or regulated industries. Familiarity with data privacy, GDPR
with Power BI. · Experience with Azure-based data services (Azure Data Lake, Synapse, Data Factory) and their integration with Power BI. · Knowledge of data modelling techniques including star/snowflake schema design for BI solutions. · Understanding of DevOps/DataOps principles as applied to Power BI CI/CD and workspace automation.
data cleaning, wrangling, and working with large datasets across different formats. Comfortable writing complex SQL queries for data extraction and transformation. Experience with data modelling principles (e.g., star/snowflake schemas). Experience with dbt is a plus. Strong verbal and written communication skills, with the ability to explain data findings to both technical and non-technical audiences. Experience
boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets. Our challenge We are seeking an experienced Senior Snowflake Developer who will be responsible for designing, developing, and maintaining our data warehouse solutions on the Snowflake platform. The ideal candidate should also be proficient in SQL and … with law, the base salary for this role if filled within Iselin, NJ is $100k - $110k/year & benefits (see below). The Role Responsibilities: Design, develop, and optimize Snowflake-based data warehouse solutions to meet business requirements. Collaborate with stakeholders to gather and analyze data requirements, translating them into technical specifications. Develop and maintain ETL processes using SQL and SSIS to ensure efficient data integration and migration. Implement data pipelines and workflows in Snowflake to support data analytics and reporting needs. Monitor and tune Snowflake performance to ensure optimal data processing and query execution. Ensure data quality, consistency, and integrity across the data warehouse. Provide technical support and troubleshooting for Snowflake and SSIS-related issues.
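The responsibilities above describe building ETL pipelines into Snowflake. A hedged sketch of one such load step using the snowflake-connector-python package follows; the stage, table names and environment variables are placeholders, not details from the posting:

```python
# Sketch of a simple load step: bulk-copy staged files, then merge into a target.
# Connection details, stage and table names are placeholders.
import os

import snowflake.connector


def load_daily_extract() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Bulk-load files previously PUT to an internal stage.
        cur.execute(
            "COPY INTO staging.orders_raw FROM @etl_stage/orders/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Upsert the staged rows into the reporting table.
        cur.execute(
            """
            MERGE INTO analytics.orders AS tgt
            USING staging.orders_raw AS src
              ON tgt.order_id = src.order_id
            WHEN MATCHED THEN UPDATE SET tgt.status = src.status
            WHEN NOT MATCHED THEN INSERT (order_id, status)
                                  VALUES (src.order_id, src.status)
            """
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_daily_extract()
```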
enterprise data management and governance principles. Proven experience delivering Business Intelligence (BI) solutions and dashboards to enable data-driven decisions. Experience designing relational and dimensional data models (e.g. star schema, snowflake, etc.). Proficient in ETL and data warehousing, including handling slowly changing dimensions. Excellent communication and interpersonal skills, with the ability to liaise confidently between technical and
Job Title: Snowflake Centre of Excellence Lead Location: Central London (Hybrid - 2 to 3 days on site per week) Employment Type: Permanent Salary: up to £120,000 per annum + benefits About the Role: We are working with a prestigious client based in London who are seeking a Snowflake Lead to play a pivotal role in establishing and … scaling their Snowflake capability. This is a unique opportunity for a seasoned Snowflake professional to build a Centre of Excellence from the ground up within a fast-paced, high-impact environment. As the Snowflake CoE Lead, you will be instrumental in shaping the organisation's Snowflake strategy, architecture, and delivery model. You'll bring your deep … technical expertise, leadership experience, and direct engagement with Snowflake to build a best-in-class data platform offering. Key Responsibilities: Lead the design, setup, and growth of a Snowflake practice, including establishing a Centre of Excellence. Architect, implement, and maintain scalable data solutions using Snowflake. Collaborate closely with stakeholders across the organisation and with Snowflake directly to
data models for visualization. • Build and maintain ETL pipelines using Alteryx and Azure Data Factory to automate data ingestion, transformation, and delivery. • Optimize Power BI data models (star/snowflake schemas), implement row-level security, and create custom DAX measures and KPIs. • Translate complex system requirements and data flows into actionable reports and dashboards for end users. • Collaborate with
practices, system development life cycle management, IT services management, agile and lean methodologies, infrastructure and operations, and EA and ITIL frameworks. Proficiency with data warehousing solutions (e.g., Google BigQuery, Snowflake). Expertise in data modeling tools and techniques (e.g., SAP PowerDesigner, EA Sparx). Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud platforms
and oral communication skills Experience of working with AXIOMA Optimization Software, equity data sets like BARRA/ITG and understanding of the equity data/models is a plus Snowflake/AWS exposure is preferred PIMCO follows a total compensation approach when rewarding employees which includes a base salary and a discretionary bonus. Base salary is the fixed component
EXPERIENCE: A successful Analytics Engineer will bring: Strong SQL skills and hands-on experience with dbt (or similar tools) Experience designing as well as building data models Exposure to Snowflake and/or data pipeline tools Understanding of testing, CI/CD, and data quality frameworks THE BENEFITS: You will receive a salary dependent on experience - up to
autonomy, and the chance to shape the future of data in a high-impact environment. What You'll Do Design and build modular, reusable data models using dbt and Snowflake Collaborate with stakeholders to deeply understand use cases and deliver scalable data solutions Define and maintain the single source of truth for core business metrics Contribute to CI/
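The role above calls for a single source of truth for core business metrics. One hedged way to approach that is a small metric registry whose definitions get rendered into downstream SQL (dbt models, BI semantic layers); the metric names and expressions below are invented for illustration:

```python
# Illustrative metric registry; names and SQL expressions are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    name: str
    description: str
    sql_expression: str  # rendered into dbt models / BI tools downstream


METRICS = {
    "active_customers": Metric(
        name="active_customers",
        description="Distinct customers with at least one order in the period.",
        sql_expression="COUNT(DISTINCT customer_id)",
    ),
    "net_revenue": Metric(
        name="net_revenue",
        description="Gross revenue minus refunds, in GBP.",
        sql_expression="SUM(gross_amount) - SUM(refund_amount)",
    ),
}


def render_select(metric_names: list[str], source_table: str) -> str:
    """Build a SELECT from registered metrics so every consumer shares one definition."""
    cols = ",\n  ".join(f"{METRICS[m].sql_expression} AS {m}" for m in metric_names)
    return f"SELECT\n  {cols}\nFROM {source_table}"


if __name__ == "__main__":
    print(render_select(["active_customers", "net_revenue"], "marts.fct_orders"))
```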
data migration projects and cloud transformation Deep understanding of enterprise architecture frameworks and methodologies Experience with financial risk systems and analytics platforms Strong technical background in database platforms (Oracle, Snowflake) Understanding of cloud architecture principles (AWS preferred) Excellent communication and stakeholder management skills Ability to translate complex technical concepts for non-technical stakeholders PREFERRED CERTIFICATIONS Cloud platform certifications (AWS … DAMA) Project/Program Management certifications (PMP, Prince2, Agile/Scrum) AI/ML certifications or specialized training TECHNICAL SKILLS Enterprise systems architecture AWS and cloud technologies Oracle and Snowflake platforms Data modeling and design Data lineage and governance tools ETL/ELT processes Big data and analytics platforms Financial risk systems and methodologies Programming knowledge (Python, SQL, etc. … DevOps and CI/CD principles Knowledge graph and semantic web technologies Machine learning operations (MLOps) AI-focused data catalog tools Schema registry and metadata management systems Feature store architecture for machine learning PIMCO follows a total compensation approach when rewarding employees which includes a base salary and a discretionary bonus. Base salary is the fixed component of compensation
are seeking an experienced Lead Analytical Engineer to join a dynamic and rapidly growing organisation based in London. The successful candidate will possess strong expertise in data modelling using Snowflake, dbt, and Python, coupled with excellent stakeholder management skills. In this role, you will lead the development and optimisation of data infrastructure, collaborating closely with cross-functional teams to
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Role Work Mode: Hybrid Role & Responsibilities Define and implement the end-to-end architecture of a data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake-native tools. Collaborate with Data Governance teams to establish data lineage
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … have a robust Infrastructure background and a good understanding of the different complexities that come when moving one system to another. Let's talk tech. The platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able … to develop and maintain ELT and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is a Snowflake enthusiast and can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, good AWS
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing environment? Do … have a robust Infrastructure background and a good understanding of the different complexities that come when moving one system to another. Let's talk tech. The platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able … to develop and maintain ELT and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is a Snowflake enthusiast and can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, good AWS
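Both copies of the Akkodis listing above centre on Apache Airflow orchestrating ELT into Snowflake, with Kafka as a source. A minimal Airflow DAG sketch is shown below; the DAG id, schedule and task bodies are assumptions, and in practice the Kafka consumer and Snowflake load would use the relevant client libraries or provider operators:

```python
# Hedged sketch of an hourly ELT DAG; ids, schedule and task internals are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events():
    # Placeholder: poll a Kafka topic (e.g. with confluent-kafka) and stage
    # the messages as files in cloud storage.
    print("extracting change events")


def load_to_snowflake():
    # Placeholder: COPY the staged files into Snowflake
    # (e.g. with snowflake-connector-python).
    print("loading staged files into Snowflake")


with DAG(
    dag_id="orders_events_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load
```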
ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star/snowflake, CDC), data contracts, and data cataloguing. API & Integration Fluency: Building data ingestion
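The listing above groups batch/streaming frameworks with schema-management topics such as CDC. As a hedged PySpark sketch of one common CDC pattern (collapsing a change feed to the latest record per key before loading a dimension), with invented paths and column names:

```python
# Illustrative PySpark CDC dedupe; paths and column names are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdc_dedupe_sketch").getOrCreate()

# Change events captured from a source system (op = INSERT/UPDATE/DELETE).
changes = spark.read.parquet("s3://example-bucket/cdc/customers/")

latest_per_key = Window.partitionBy("customer_id").orderBy(F.col("change_ts").desc())

current_state = (
    changes
    .withColumn("rn", F.row_number().over(latest_per_key))
    .filter(F.col("rn") == 1)              # keep only the newest change per key
    .filter(F.col("op") != "DELETE")       # drop keys whose last change was a delete
    .drop("rn")
)

current_state.write.mode("overwrite").parquet("s3://example-bucket/marts/dim_customer/")
```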
function and work closely with senior stakeholders, including programme sponsors and business leads. Key responsibilities - Platform Engineer Support the build and enhancement of a cloud-based data platform using Snowflake on Azure, working within an established technical framework. Develop infrastructure components and manage deployment through infrastructure-as-code practices. Collaborate with internal stakeholders to ensure the data platform design … assurance processes. Contribute to platform documentation and technical standards. There is potential for future responsibility in mentoring or overseeing junior team members. Required experience Hands-on experience working with Snowflake, including schema development and data pipeline integration. Familiarity with Azure services, including resource provisioning and identity setup. Experience using Terraform for infrastructure deployment (ideally with cloud data services
a unified, scalable architecture to enable improved analytics, data governance, and operational insights. Technical requirements: Substantial experience designing and implementing data solutions on Microsoft Azure Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in DBT (Data Build Tool), with a strong understanding of modular pipeline development, testing, and version control. Familiarity … both enterprise reporting and self-service analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure, Power BI, DBT, Snowflake. About us esynergy is a technology consultancy and we build products, platforms and services to accelerate value for our clients. We drive measurable impact that is tightly aligned to