Liverpool, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
Partners Data Services Team. Investment in Data: We recognise the power of data and are fully committed to growing a best-in-class team. Cutting-Edge Tech: Work with Snowflake, Power BI, and AI-driven solutions to shape the future of data analytics. People-First Culture: A collaborative, flexible, and supportive work environment. Career Growth Opportunities: Continuous learning and … developing, and maintaining our MS SQL Server Data Warehouses and associated data feeds into and out of the warehouses, and developing on our new modern cloud data platform, requiring Snowflake, dbt and Azure Data Factory experience. Our data platforms support regulatory requirements, business intelligence & reporting needs and numerous system integrations. This role requires strong technical proficiency and a … of data engineering and data warehousing principles and practices. You will be critical in the development and support of the new Evelyn Data Platform, which is being engineered on Snowflake, utilising Azure Data Factory pipelines for data integration, dbt for data modelling, Azure BLOB Storage for data storage, and GitHub for version control and collaboration. As a Data Engineer
London, England, United Kingdom Hybrid / WFH Options
Plutus
modelling and SQL/database design skills. Proficiency in ETL/ELT processes. Expert-level SQL and Python programming. Understanding of different data modelling techniques (e.g., Kimball, star, and snowflake schemas). Data quality techniques. Data normalisation. Cloud and Big Data Technologies. Extensive experience with Databricks. Familiarity with cloud data warehouses (AWS, Azure, GCP, or Snowflake). Additional
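The Kimball star and snowflake schemas named above can be sketched concretely. The following is a minimal illustration only; the `fact_sales` and `dim_product` tables and their columns are invented for the example, not taken from any listing.

```python
import sqlite3

# Minimal sketch of a Kimball-style star schema: a central fact table holds
# measures plus surrogate keys pointing at dimension tables. All names here
# are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY,
                              product_name TEXT, category TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER,
                              amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'),
                                   (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (20240101, 1, 100.0),
                                  (20240101, 2, 250.0),
                                  (20240102, 1, 75.0);
""")
# Answering a business question is a join from fact to dimension plus an aggregate.
rows = con.execute("""
    SELECT p.product_name, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    GROUP BY p.product_name
    ORDER BY p.product_name;
""").fetchall()
print(rows)  # [('Gadget', 250.0), ('Widget', 175.0)]
```

A snowflake schema differs only in that the dimension itself is further normalised, e.g. splitting `category` out of `dim_product` into its own `dim_category` table keyed from the product dimension.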
London, England, United Kingdom Hybrid / WFH Options
Thehealthylivingstore
play a key part in designing, building, and optimising scalable data pipelines, ensuring high-quality, actionable data is readily available to drive decision-making. Working with modern tools like Snowflake and dbt, you’ll build and maintain our data infrastructure and collaborate with cross-functional teams, including analysts and stakeholders, to support reporting and insights. You’ll also have … organisation. Key Responsibilities Design, develop, and maintain scalable and reliable ETL pipelines to extract, transform, and load data from a variety of sources. Build and optimise data models using Snowflake and dbt to ensure accuracy, integrity, and accessibility across the organisation. Write and optimise complex SQL queries for data transformation, reporting, and analysis. Collaborate with analysts and stakeholders to
data-oriented, systematic approaches to solving problems. You have 2-3 years of experience in data management and analysis. You are familiar with SQL and Cloud Data Warehouses like Snowflake. Exposure to data models and ETL frameworks, and familiarity with DBT is a plus. You have the ability to translate business needs into meaningful dashboards and reports. You
tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with Data warehousing and Data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. Cloud
in relational databases (e.g. Postgres, SQL Server) Data Integration Tools: Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g. Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience 2+ years of experience in
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
join a growing data team at the early stages of its evolution, where you’ll have real ownership, shape foundational data assets, and work with cutting-edge technologies like Snowflake, DBT, Microsoft Azure, Power BI, and next-gen BI platforms. Responsibilities Design and maintain clean, modular data models in Snowflake, applying analytics engineering best practices using DBT. … Proven experience in analytics engineering or advanced BI/data roles with a strong focus on data modelling and transformation Proficiency in SQL and cloud data platforms such as Snowflake or Azure Synapse Analytics Hands-on experience with DBT for developing, testing, and documenting transformations Understanding of modern data stack principles, including layered modelling, modular SQL, and Git-based
setup, requirements gathering, design, development, testing, deployment. Perform hands-on development with SQL, occasionally Python, create data visualizations in Tableau, Power BI, or similar, and design data models in Snowflake or Databricks. Lead client meetings such as stakeholder interviews and workshops. Translate complex business challenges into actionable plans with clear communication. Identify risks and issues proactively and suggest mitigation strategies. … Agile environment. Experience with client-facing activities like requirements gathering, facilitation, and presentation. 5+ years in a consulting or data-focused role with strong SQL experience on platforms like Snowflake, Databricks, Teradata, Redshift, BigQuery, MS SQL Server. Experience designing relational databases or data warehouses. Hands-on experience with business-driven/self-service BI tools like Tableau, Power BI
and implementing innovative approaches to address business problems and solutions. Experience developing business deliverables that leverage business intelligence platforms, data management platforms, or SQL-based languages (Tableau, Business Objects, Snowflake, Hadoop, Netezza, NoSQL, ANSI SQL, or related). Proven ability to build business knowledge through meaningful partnerships at the individual contributor, leadership, and EMG levels. Demonstrated advanced communication skills … monitoring, and case management systems. Strong analytical and problem-solving skills with attention to detail. Excellent communication and collaboration skills to work with cross-functional teams. Strong expertise in Snowflake, including data modeling, writing advanced SQL, and performance tuning. Solid understanding of cloud data architecture and modern data stack concepts. Compensation range: The salary range for this position is
Manchester, England, United Kingdom Hybrid / WFH Options
Capgemini
issues and solutions to a variety of stakeholders including the facilitation of client workshops. An understanding of key data modelling concepts (e.g., fact and dimension tables, star schemas and snowflake schemas, denormalised tables, and views). Experience with data handling, e.g. data querying, data manipulation or data wrangling to transform raw data into the desired format for analytics and
with other engineers on the team to elevate technology and consistently apply best practices. DBT experience is required: DBT macros and overriding DBT Custom Schemas DBT Materialization details Snowflake Cluster Keys Snowflake SQL functions Required Skills and Experience: A minimum of 7+ years of IT experience in the data engineering or data management field 4-5 years of … experience in application development using SQL, PL/SQL Strong working knowledge and a minimum of 2 years' experience in DBT, Snowflake Hands-on experience with Airflow is highly recommended Strong working experience with Python, Python pandas. Recent experience as a Senior Data Engineer in any public cloud, preferably on Azure, as well as on a cloud warehouse like Snowflake is … or in a similar role. Strong proficiency in Data Modelling, process metadata, observability, and monitoring data platforms. Familiarity with data modelling tools/techniques is a plus Azure or Snowflake training/certification is a plus Strong analytical, problem-solving, communication, collaborative and organizational skills Self-driven, self-directed, passionate, analytical and focused on delivering the right results
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self Service BI Tools and Data Science/ML platform using Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestions, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self Service BI Tools and Data Science/ML platform using Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestions, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake
Employment Type: Permanent, Part Time, Work From Home
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations Ltd
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self Service BI Tools and Data Science/ML platform using Oracle, Snowflake and AWS tech stack. About You We are looking for a skilled Lead Data Engineer who: Personal Attributes Self-starter with initiative and enthusiasm Thrives at solving problems with … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestions, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake
as Power BI, Tableau, QlikView/Qlik Sense, or similar. Strong SQL skills; experience with databases such as SQL Server, Oracle, or PostgreSQL. Knowledge of data modelling, star/snowflake schema, and DAX/MDX (if applicable). Familiarity with ETL processes and tools (e.g., SSIS, Talend, Azure Data Factory). Strong problem-solving skills and attention to
London, England, United Kingdom Hybrid / WFH Options
Native Instruments
If you're looking to contribute, grow, and work on diverse data challenges, join us! Your Contribution Design and develop efficient and scalable data models and data marts (e.g. star schema, snowflake schema) using best practices for data warehousing and business intelligence that are optimized for self-service analytics tools Collaborate with business stakeholders (e.g. finance, marketing, operations
Cambridge, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
Hybrid (2-3 days onsite per week) Duration: Long Term B2B Contract Job Description: The ideal candidate will have a minimum of 5+ years of experience working with Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines using various resources. Proficiency in Snowflake data warehouse architecture. Design, build, and optimize ETL/ELT pipelines using … processing. Leverage Python to create automation scripts and optimize data processing tasks. Proficiency in SQL performance tuning and query optimization techniques using Snowflake. Troubleshoot and optimize DBT models and Snowflake performance. Knowledge of CI/CD and version control (Git) tools. Experience with orchestration tools such as Airflow. Strong analytical and problem-solving skills with the ability to work … reliability, and consistency across different environments. Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions. Certification in AWS, Snowflake, or DBT is a plus.
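The Python automation side of the ETL/ELT work described above can be sketched in a few lines. This is a toy extract-transform-load pipeline under invented data and field names; a real pipeline for this role would land data in Snowflake (e.g. via DBT models and Airflow orchestration) rather than returning an in-memory list.

```python
import csv
import io
import json

# Hypothetical raw feed; the column names and quality rules are illustrative only.
RAW_CSV = """order_id,amount,currency
1001,100.50,GBP
1002,,GBP
1003,87.20,gbp
"""

def extract(text):
    """Extract: read raw records from a CSV source into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: apply basic data-quality rules and normalise types/casing."""
    out = []
    for r in rows:
        if not r["amount"]:  # quality rule: skip rows with a missing amount
            continue
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "currency": r["currency"].upper()})
    return out

def load(rows):
    """Load (stand-in): serialise to JSON lines instead of a warehouse write."""
    return [json.dumps(r) for r in rows]

clean = transform(extract(RAW_CSV))
print(len(clean))  # 2 rows survive the quality check
```

The value of splitting the three stages into separate functions is that each can be unit-tested and rerun independently, which is the same property DBT models and Airflow tasks give at warehouse scale.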
The ideal candidate will have a minimum of 5+ years of experience with strong expertise in Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines. Proficiency in Snowflake data warehouse architecture and the ability to design, build, and optimize ETL/ELT pipelines using DBT (Data Build … processing, leveraging Python to create automation scripts and optimize data processing tasks. Proficiency in SQL performance tuning and query optimization techniques using Snowflake. Troubleshooting and optimizing DBT models and Snowflake performance. Knowledge of CI/CD and version control (Git) tools. Experience with orchestration tools such as Airflow. Strong analytical and problem-solving skills, with the ability to work … reliability, and consistency across different environments. Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions. Certification in AWS, Snowflake, or DBT is a plus. Seniority Level: Mid-Senior level. Employment Type: Contract. Job Function: Information Technology. Industries: Pharmaceutical Manufacturing and Biotechnology Research.
Warrington, Cheshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
client are a start-up data consultancy looking to expand their data engineering practice to facilitate the service of their growing client base. This role will require proficiency in Snowflake, Python, DBT, AWS, and SQL. Consultancy experience would be a huge plus. Salary and Benefits Competitive salary of £55k - £65k (DOE) Fully remote working 25 days annual leave And … many more! Role and Responsibilities Design and deliver data pipelines using SQL, dbt, and Python within Snowflake to transform and model data for a variety of use cases. Build robust, testable, and maintainable code that integrates data from diverse sources and formats. Work closely with analytics, product, and client teams to turn data requirements into scalable solutions. Support the … standards, development processes, and client delivery playbooks. Participate in sprint planning, reviews, and retrospectives with a lean agile delivery mindset. What do I need to apply? Strong experience with Snowflake Experience with Python, DBT, AWS, SQL. Strong stakeholder engagement. Experience with integrating APIs A data science background or some exposure to it. My client has limited interview slots and they are
site, working closely with engineers, analysts, and business stakeholders. About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing dbt for transformation and modelling Azure for cloud infrastructure and orchestration Fivetran for automated data ingestion Power BI and other modern BI tools for reporting … and visualisation What You’ll Do: Design and implement scalable, well-documented data models in Snowflake using dbt Build curated, reusable data layers that support consistent KPIs and enable self-service analytics Collaborate with Power BI developers to deliver insightful, high-performance dashboards Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and … dimensional modelling, layered architecture, and data quality What We’re Looking For: Strong experience in data modelling and SQL Hands-on experience with dbt and cloud data platforms like Snowflake or Azure Synapse Analytics Solid understanding of modern data stack principles, including layered modelling and data warehouse design Excellent communication skills and the ability to work with stakeholders across
Writing efficient and complex DAX formulas for calculated columns, measures, and KPIs. Power Query (M Language): Data transformation and shaping using Power Query Editor. Data Modeling: Designing star/snowflake schemas, relationships, and optimizing data models for performance. Data Visualisation Best Practices: Ability to design intuitive and impactful dashboards. KPI Development: Creating and tracking key performance indicators aligned with
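The KPI work described above, a measure computed per group, can be illustrated language-neutrally. The event data and KPI below are invented for the example; in Power BI the equivalent would be a DAX measure built from functions such as DIVIDE, COUNTROWS, and FILTER.

```python
from collections import defaultdict

# Hypothetical event data standing in for a fact table.
events = [
    {"channel": "web",   "converted": True},
    {"channel": "web",   "converted": False},
    {"channel": "email", "converted": True},
    {"channel": "web",   "converted": True},
]

def conversion_rate_by(events, key):
    """KPI sketch: share of converted events per group, analogous to a
    grouped DAX measure evaluated in each filter context."""
    totals, hits = defaultdict(int), defaultdict(int)
    for e in events:
        totals[e[key]] += 1
        hits[e[key]] += e["converted"]  # bool counts as 0/1
    return {g: hits[g] / totals[g] for g in totals}

kpis = conversion_rate_by(events, "channel")
print(kpis)  # {'web': 0.6666666666666666, 'email': 1.0}
```

The point of defining the KPI once as a function (or, in Power BI, once as a measure) rather than per report is that every dashboard slices the same definition, which is what keeps KPIs consistent across an organisation.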