Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, self-service BI tools and a Data Science/ML platform built on an Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm. Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies. Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.). Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
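The listing above asks for dimensional modelling experience (Star Schema, Snowflake Schema, Data Vault). As a rough, hypothetical illustration with invented column names, and not part of the advert itself, this minimal pandas sketch shows the basic star-schema move: splitting a flat extract into a fact table and a conformed dimension.

```python
import pandas as pd

# Flat extract as it might arrive from an operational system (invented columns).
orders = pd.DataFrame({
    "order_id":      [1001, 1002, 1003],
    "customer_id":   ["C01", "C02", "C01"],
    "customer_name": ["Asha", "Bilal", "Asha"],
    "customer_city": ["Leeds", "Bristol", "Leeds"],
    "order_amount":  [120.0, 75.5, 210.0],
})

# Dimension table: one row per customer, descriptive attributes only.
dim_customer = (
    orders[["customer_id", "customer_name", "customer_city"]]
    .drop_duplicates()
    .reset_index(drop=True)
)

# Fact table: measures plus the foreign key back to the dimension.
fact_orders = orders[["order_id", "customer_id", "order_amount"]]

# A typical star-schema query: join the fact to its dimension and aggregate.
sales_by_city = (
    fact_orders.merge(dim_customer, on="customer_id")
    .groupby("customer_city")["order_amount"]
    .sum()
)
print(sales_by_city)
```

A snowflake schema would simply normalise the dimension further (for example, splitting city into its own table), trading some query simplicity for less redundancy.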
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, self-service BI tools and a Data Science/ML platform built on an Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm. Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies. Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.). Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
as a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work … closely with our clients to design the correct data models to support their analytics requirements, following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and … experience designing data architectures on platforms like AWS, Azure, or GCP. Technical Skills Extensive experience with ETL/ELT tools (e.g. Matillion, Informatica, Talend) and cloud data platforms (e.g., Snowflake, Databricks, BigQuery). Expertise in data modelling techniques (e.g., star schema, snowflake schema) and optimising models for analytics and reporting. Familiarity with version control, CI/CD …
Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/DBT, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience working in agile environments and cross-functional teams. Desirable: Experience in insurance or regulated industries. Familiarity with data privacy and GDPR …
with Power BI. · Experience with Azure-based data services (Azure Data Lake, Synapse, Data Factory) and their integration with Power BI. · Knowledge of data modelling techniques including star/snowflake schema design for BI solutions. · Understanding of DevOps/DataOps principles as applied to Power BI CI/CD and workspace automation.
data cleaning, wrangling, and working with large datasets across different formats. Comfortable writing complex SQL queries for data extraction and transformation. Experience with data modelling principles (e.g., star/snowflake schemas). Experience with dbt is a plus. Strong verbal and written communication skills, with the ability to explain data findings to both technical and non-technical audiences. Experience …
Manchester, North West, United Kingdom Hybrid / WFH Options
EXPRESS SOLICITORS
Experience: Experience integrating data from external systems via APIs. Knowledge of Python, R, or similar languages for data manipulation and automation. Familiarity with data warehousing concepts, including star/snowflake schema design. Experience working in a professional services or legal sector environment. Understanding of data governance, compliance, and security best practices. Exposure to other Microsoft data tools such as …
Sharston, Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Express Solicitors
Experience: Experience integrating data from external systems via APIs. Knowledge of Python, R, or similar languages for data manipulation and automation. Familiarity with data warehousing concepts, including star/snowflake schema design. Experience working in a professional services or legal sector environment. Understanding of data governance, compliance, and security best practices. Exposure to other Microsoft data tools such as …
in Power BI. You'll have a working knowledge of a broad stack of cloud technologies, excellent engineering skills, and experience of ingestion and ETL methods for pushing data into Snowflake from internal (MuleSoft, Salesforce, SaaS platforms etc.) and external API sources. As a highly capable data engineer, you will be responsible for building dynamic Power BI solutions, leveraging the … best of Snowflake for analytics and insight, shaping data modelling and informing data governance. You'll collaborate with internal and external stakeholder groups to develop complex, and at times bespoke, data solutions for a logistics-intensive business. Our work involves moving thousands of valuable artworks across the globe, hosting several public and private exhibitions and participating in art fairs all … data tables, security/access frameworks, and analytics delivery. Strong attention to detail. Strong experience with data transformation tools such as DBT or Dataflow. Strong hands-on knowledge of Snowflake and PostgreSQL databases. Strong ability to work autonomously in a fast-paced and dynamic environment. Strong SQL and Python coding skills, particularly in automation & integrations. Sound experience in AWS …
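The listing above mentions ingesting data into Snowflake from internal and external API sources. Purely as a hedged sketch (the endpoint, credentials, and table name are placeholders, and the real pipelines may well use MuleSoft or native Snowflake ingestion instead), one common Python pattern is to pull JSON with requests and bulk-load it with the Snowflake connector's write_pandas helper:

```python
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Placeholder endpoint and credentials, not taken from the advert.
rows = requests.get("https://api.example.com/shipments", timeout=30).json()
df = pd.DataFrame(rows)

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical connection details
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

# write_pandas bulk-loads the DataFrame into an existing Snowflake table.
success, n_chunks, n_rows, _ = write_pandas(conn, df, table_name="SHIPMENTS_RAW")
print(f"Loaded {n_rows} rows in {n_chunks} chunk(s): success={success}")
conn.close()
```

In practice the credentials would come from a secrets manager and the load would land in a raw layer before dbt-style transformation.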
Northampton, Northamptonshire, United Kingdom Hybrid / WFH Options
Experis
timely manner. Document processes, workflows, and technical specifications clearly and effectively. Required Skills & Experience: Proven hands-on experience with: Databricks (Spark, Delta Lake, notebooks), DBT (data modelling, transformations, testing), Snowflake (SQL, performance tuning, data warehousing). Strong understanding of data engineering principles and best practices. Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders. Experience …
ETL/ELT pipelines using SQL and Python. Integrate internal/external data sources via APIs and platform connectors. Model and structure data for scalable analytics (e.g., star/snowflake schemas). Administer Microsoft Fabric Lakehouse and Azure services. Optimise performance across queries, datasets, and pipelines. Apply data validation, cleansing, and standardisation rules. Document pipeline logic and contribute to business …
Job Title: Snowflake Centre of Excellence Lead Location: Central London (Hybrid - 2 to 3 days on site per week) Employment Type: Permanent Salary: up to £120,000 per annum + benefits About the Role: We are working with a prestigious client based in London who are seeking a Snowflake Leader to play a pivotal role in establishing and … scaling their Snowflake capability. This is a unique opportunity for a seasoned Snowflake professional to build a Centre of Excellence from the ground up within a fast-paced, high-impact environment. As the Snowflake CoE Lead, you will be instrumental in shaping the organisation's Snowflake strategy, architecture, and delivery model. You'll bring your deep … technical expertise, leadership experience, and direct engagement with Snowflake to build a best-in-class data platform offering. Key Responsibilities: Lead the design, setup, and growth of a Snowflake practice, including establishing a Centre of Excellence. Architect, implement, and maintain scalable data solutions using Snowflake. Collaborate closely with stakeholders across the organisation and with Snowflake directly to …
EXPERIENCE: A successful Analytics Engineer will bring: Strong SQL skills and hands-on experience with dbt (or similar tools). Experience designing as well as building data models. Exposure to Snowflake and/or data pipeline tools. Understanding of testing, CI/CD, and data quality frameworks. THE BENEFITS: You will receive a salary dependent on experience, up to …
autonomy, and the chance to shape the future of data in a high-impact environment. What You'll Do: Design and build modular, reusable data models using dbt and Snowflake. Collaborate with stakeholders to deeply understand use cases and deliver scalable data solutions. Define and maintain the single source of truth for core business metrics. Contribute to CI/CD …
improving a system that handles a high volume of traffic, so experience with cloud and data warehouse infrastructure will help you in this role (we use AWS, Cloudflare and Snowflake). Familiarity with infrastructure as code will also help when updating our cloud architecture (we use Terraform). We place a large focus on data quality so you'll … not just the code, but the architecture of our platforms and everything that enables the business to thrive. Gain expertise over our tools and services: Python, Docker, GitHub Actions, Snowflake, AWS. Participate in all team ceremonies and have direct input in the team's ways of working. This is a high-trust, supportive, and collaborative environment where you will … experience - we want to hire people to grow into the role and beyond. About the team: Python is our bread and butter. The wider data platform team uses dbt, Snowflake, and Looker to model, transform, and expose data for analytics and reporting across the business. We use Docker and Kubernetes to manage our production services. We use GitHub Actions …
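The advert above highlights a strong focus on data quality in a Python-first stack. As an illustrative sketch only (the checks and columns are invented, and the team's actual framework may be dbt tests or something else entirely), lightweight validation of an extract is often expressed like this:

```python
import pandas as pd

def validate_events(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality failures for an events extract (hypothetical rules)."""
    failures = []
    if df["event_id"].duplicated().any():
        failures.append("duplicate event_id values")
    if df["occurred_at"].isna().any():
        failures.append("missing occurred_at timestamps")
    if (df["amount"] < 0).any():
        failures.append("negative amounts")
    return failures

# Small example run with one deliberate failure of each kind.
events = pd.DataFrame({
    "event_id": [1, 2, 2],
    "occurred_at": pd.to_datetime(["2024-01-01", None, "2024-01-03"]),
    "amount": [10.0, 5.0, -1.0],
})
print(validate_events(events))
```

Checks like these typically run in CI (for example via GitHub Actions) before data is promoted to reporting layers.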
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Role Work Mode: Hybrid Role & Responsibilities: Define and implement the end-to-end architecture of the data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake native tools. Collaborate with Data Governance teams to establish data lineage, data quality …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment. Up to £70,000 plus benefits. FULLY REMOTE, UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … have a robust infrastructure background and a good understanding of the different complexities that come when moving one system to another. Let's talk tech. The platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able … to develop and maintain ELT and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is a Snowflake enthusiast who can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a cloud perspective, good AWS …
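The role above combines Python, Snowflake, Kafka and Apache Airflow. As a minimal, hypothetical sketch of the Airflow side only (the DAG id, task names and callables are invented, and the schedule argument assumes Airflow 2.4+), a daily ELT DAG commonly looks like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    # Placeholder: pull a day's worth of orders from a source system.
    return [{"order_id": 1, "amount": 99.0}]

def load_to_snowflake(**context):
    # Placeholder: a real task would stage the rows and COPY them into Snowflake.
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    print(f"Would load {len(rows)} rows into Snowflake")

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load
```

In a Kafka-fed setup the extract step would instead consume from a topic or be replaced by a streaming framework, with Airflow orchestrating the batch transformations downstream.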
function and work closely with senior stakeholders, including programme sponsors and business leads. Key responsibilities - Platform Engineer: Support the build and enhancement of a cloud-based data platform using Snowflake on Azure, working within an established technical framework. Develop infrastructure components and manage deployment through infrastructure-as-code practices. Collaborate with internal stakeholders to ensure the data platform design … assurance processes. Contribute to platform documentation and technical standards. There is potential for future responsibility in mentoring or overseeing junior team members. Required experience: Hands-on experience working with Snowflake, including schema development and data pipeline integration. Familiarity with Azure services, including resource provisioning and identity setup. Experience using Terraform for infrastructure deployment (ideally with cloud data services …
ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star/snowflake, CDC), data contracts, and data cataloguing. API & Integration Fluency: Building data ingestion …
Senior Engineer to join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms. Key Responsibilities: Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications using Azure cloud-native services. Write clean, testable, and maintainable code following industry standards. Implement CI/CD pipelines … deliver user-centric solutions. About the Candidate: The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and DBT. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test automation. Familiarity with AI-powered development tools …
a unified, scalable architecture to enable improved analytics, data governance, and operational insights. Technical requirements: Substantial experience designing and implementing data solutions on Microsoft Azure. Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in DBT (Data Build Tool), with a strong understanding of modular pipeline development, testing, and version control. Familiarity … both enterprise reporting and self-service analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure, Power BI, DBT, Snowflake. About us: esynergy is a technology consultancy and we build products, platforms and services to accelerate value for our clients. We drive measurable impact that is tightly aligned to …
while also helping define platform standards and best practices. Key responsibilities include: Build and maintain ELT pipelines. Take full ownership of data ingestion. Support data modelling and architecture within Snowflake. Own and evolve the dbt layer, including governance and access controls. Collaborate across analytics, product, and engineering teams. Contribute to platform improvements, automation, and optimisation. YOUR SKILLS AND EXPERIENCE … A successful Senior Data Engineer will bring: Strong SQL skills. Experience with dbt in a production environment. Snowflake experience is desirable. Exposure to AWS. Confident mentoring peers and contributing to a collaborative, high-impact team. Experience working in fast-paced, agile environments with modern data workflows. THE BENEFITS: You will receive a salary of up to £55,000 depending on experience …
the role is fully remote. For location-specific details, please connect with our recruiting team. What You Will Do: Model usage-based economics - Build and maintain dbt models in Snowflake that translate raw product telemetry into ARR, margin, and unit economics views. Forecast revenue & costs - Design Python-driven Monte Carlo and statistical models to project usage, revenue, and gross … testing, reconciliation, and data quality SLAs so Finance can audit every revenue and cost metric back to source. Mentor analysts & engineers - Provide best-practice guidance on dbt design patterns, Snowflake optimization, and Python analytical tooling. About You: 7+ years in analytics or data science roles focused on Finance (SaaS or usage-based preferred). Deep hands-on expertise with dbt …
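The responsibilities above include Python-driven Monte Carlo models for usage and revenue forecasting. As a hedged illustration (the growth, volatility and pricing parameters are invented, and the real models would sit on dbt/Snowflake outputs rather than hard-coded inputs), a minimal usage-based revenue simulation might look like this:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

months = 12
simulations = 10_000
start_usage = 1_000_000                # billable units this month (invented)
price_per_unit = 0.002                 # price per unit (invented)
growth_mu, growth_sigma = 0.04, 0.03   # assumed monthly growth distribution

# Simulate compounding monthly growth paths for usage.
growth = rng.normal(growth_mu, growth_sigma, size=(simulations, months))
usage_paths = start_usage * np.cumprod(1.0 + growth, axis=1)

# Translate usage into revenue and aggregate to an annual figure per path.
revenue_paths = usage_paths * price_per_unit
annual_revenue = revenue_paths.sum(axis=1)

p10, p50, p90 = np.percentile(annual_revenue, [10, 50, 90])
print(f"Annual revenue P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f}")
```

Reporting percentile bands (P10/P50/P90) rather than a single point estimate is what makes this style of forecast useful to Finance.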
modelled, documented, and served across the business, working with a modern stack and high-impact teams. Key responsibilities in the role: Build and maintain robust dbt models within the Snowflake data warehouse. Design scalable solutions that translate business questions into reliable datasets. Collaborate with teams across product, ops, marketing, and finance. Improve data quality, documentation, and platform reliability. Integrate … third-party sources. Support self-serve analytics through Looker. What They're Looking For: Previous experience in an analytics engineering role (or similar). Strong experience with SQL, dbt, and Snowflake. Someone just as confident building models as they are shaping best practices. Comfortable in ambiguity and proactive about making things better. Bonus if you've worked in insurance or …