the rest of the days from home. Key Skills * Hands-on experience working with Teradata and other data warehouses. * Deep expertise in Teradata architecture, star schema, snowflake schema, SQL optimization, and data modelling. * Experience in implementation of Teradata utilities (BTEQ, FastLoad, MultiLoad, TPT, etc.) for efficient Data More ❯
in developing and optimising ETL/ELT pipelines and using DBT for data transformation and modelling. Knowledge of data modelling techniques, including star and snowflake schemas, for efficient data analysis. Familiarity with cloud platforms such as AWS or GCP, including services like Databricks, Redshift, BigQuery, and Snowflake. Strong Python More ❯
/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
T-SQL for data extraction and transformation Hands-on experience with Azure technologies (Data Lake, Data Factory, Synapse, etc.) Data modelling experience (star/snowflake schemas, normalization, etc.) Familiarity with ETL processes and working alongside Data Engineers Excellent communication and stakeholder management skills Insurance industry experience (ideally Reinsurance) Desirable More ❯
and programmatic data access patterns Data Engineering & Modelling Strong T-SQL skills for data retrieval and performance tuning Knowledge of dimensional modelling, star/snowflake schemas, and data warehouse best practices Preferred Qualifications Microsoft certifications such as DA-100, DP-500, or MCSE: BI Familiarity with CI/CD More ❯
improvements for data processing capabilities. Write and optimize SQL queries to ensure data integrity, performance, and scalability. Implement a flexible Data Vault model in Snowflake to support large-scale analytics and BI. Collaborate with Data Engineers, Product Managers, and Data Scientists to deliver data-driven solutions. Engage with stakeholders … solutions. Implement and enforce data governance and quality processes across systems. Support cloud platforms such as AWS/Azure and tools like DBT with Snowflake for scalable data solutions. Continuously seek improvements in data systems, processes, and tools for efficiency and scalability. Key Skills/Experience: Solid understanding of … ETL/ELT processes with hands-on experience using DBT, Snowflake, Python, SQL, Terraform, and Airflow. Experience designing and implementing cloud-based data products and solutions. Proficiency with cloud data warehouses and analytics platforms such as Snowflake, AWS, and Azure. Experience with GitHub for version control and project More ❯
optimize SQL queries, ensuring data integrity, performance, and scalability, using best practices and techniques * Data Vault model implementation: Implement a flexible Data Vault model in Snowflake to support large-scale analytics and business intelligence. * Cross-Team Collaboration: Collaborate with Data Engineers, Product Managers, and Data Scientists to deliver solutions that … ensuring accurate and consistent data flows across all systems. * Cloud & Infrastructure Support: Work with cloud platforms such as AWS/Azure and DBT with Snowflake to build and maintain scalable data solutions. * Continuous Improvement: Proactively look for ways to improve data systems, processes, and tools, ensuring efficiency and scalability. … ETL/ELT & Data Pipelines: Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python, SQL, Terraform, and Airflow * Experience in designing and implementing data products and solutions on cloud-based architectures. * Cloud Platforms: Experience working with cloud More ❯
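The Data Vault modelling the listings above ask for can be sketched in miniature. This is an illustrative sketch only: the table and column names are assumptions, sqlite3 stands in for Snowflake, and the MD5 hash key is one common convention for deriving surrogate keys from business keys.

```python
import hashlib
import sqlite3

def hash_key(*parts: str) -> str:
    # One common Data Vault convention: a deterministic hash of the business key
    return hashlib.md5("|".join(parts).encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: one row per unique business key
    CREATE TABLE hub_customer (
        customer_hk TEXT PRIMARY KEY,
        customer_id TEXT NOT NULL,
        load_ts     TEXT NOT NULL,
        record_src  TEXT NOT NULL
    );
    -- Satellite: descriptive attributes, with history kept by load timestamp
    CREATE TABLE sat_customer_details (
        customer_hk TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        load_ts     TEXT NOT NULL,
        name        TEXT,
        city        TEXT,
        PRIMARY KEY (customer_hk, load_ts)
    );
""")

hk = hash_key("C-1001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk, "C-1001", "2024-01-01", "crm"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?)",
             (hk, "2024-01-01", "Ada", "London"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?)",
             (hk, "2024-02-01", "Ada", "Leeds"))

# A typical "current view": the latest satellite row per hub key
row = conn.execute("""
    SELECT h.customer_id, s.city
    FROM hub_customer h
    JOIN sat_customer_details s ON s.customer_hk = h.customer_hk
    ORDER BY s.load_ts DESC LIMIT 1
""").fetchone()
print(row)  # → ('C-1001', 'Leeds')
```

The point of the split is that new attributes land as new satellite rows, so history is preserved without rewriting the hub.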
other sources. Apply data cleansing rules to ensure high data quality standards. Model data into a single source of truth using Kimball methodology (star schema, snowflake, etc.). Develop high-quality code following DevOps and software engineering best practices, including testing and CI/CD. Monitor and maintain More ❯
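The Kimball star schema named above can be shown in a minimal sketch. Table and column names here are assumptions for illustration, and sqlite3 stands in for a real warehouse; the shape (one fact table of measures, dimension tables of context) is the technique itself.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables carry descriptive context
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    -- Fact table holds measures plus foreign keys to each dimension
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
""")
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(20240101, 2024, 1), (20240201, 2024, 2)])
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 25.0)])

# A typical star-schema query: join facts to a dimension, aggregate by its attribute
rows = conn.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)  # → [(1, 150.0), (2, 25.0)]
```

A snowflake schema differs only in normalising the dimensions further (e.g. splitting `category` into its own table).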
High Wycombe, Buckinghamshire, United Kingdom Hybrid / WFH Options
DataCo GmbH
engineer; working with teams across the group to present their data in ways that make it useful, accurate and timely. We model data in Snowflake using dbt, making use of Power BI for most (but not all) of our presentation. Code is managed in GitHub using a continuous integration More ❯
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
testing and peer review of ETL code in Oracle ODI Working with business users to design and configure self-serve data environments within our Snowflake data lake Analysing, developing, delivering, and managing BI reports Assisting in the design of the data processes, including data quality, reconciliation, testing, and governance … ll need all of these. Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI Significant database experience in Snowflake or Oracle Star schema/dimensional modelling Excellent SQL skills Good knowledge of standard data formats (XML, JSON, CSV, etc.) Proven experience of More ❯
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
testing and peer review of ETL code in Oracle ODI Working with business users to design and configure self-serve data environments within our Snowflake data lake Analysing, developing, delivering, and managing BI reports Assisting in the design of the data processes, including data quality, reconciliation, testing, and governance … ll need all of these. Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI Significant database experience in Snowflake or Oracle Star schema/dimensional modelling Excellent SQL skills Good knowledge of standard data formats (XML, JSON, CSV, etc.) Proven experience of More ❯
remote working And more WHAT WILL YOU BE DOING? We are looking for a skilled data professional to design and optimise ETL pipelines in Snowflake and Azure Data Factory, ensuring seamless data integration and transformation. This role involves building and managing semantic data models in Snowflake and Power BI to support scalable, user-friendly analytics and reporting. You will also develop Snowflake stored procedures using Python to automate workflows and handle complex data transformations. Maintaining data integrity and accessibility within Snowflake will be essential for effective data warehousing operations. Additionally, you will collaborate closely with analytics … consent for us to process and submit (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: Snowflake, Azure Data Factory, Power BI, Python, SQL, ETL More ❯
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Santander UK
members, stakeholders and end users, conveying technical concepts in a comprehensible manner Skills across the following data competencies: SQL (AWS Athena/Hive/Snowflake) Hadoop/EMR/Spark/Scala Data structures (tables, views, stored procedures) Data Modelling - star/snowflake schemas, efficient storage, normalisation Data More ❯
level. As Senior Analytics Engineer, you will have sole ownership of analytics engineering at Omaze. You will use industry standard tools and platforms (dbt, Snowflake, ThoughtSpot) to amplify the effectiveness and impact of our (growing) analytics team. You will provide clean, tested, well-documented models, and work with our … give the team access to new data sources. What you’ll do: Fully own our dbt project, building and maintaining data models in our Snowflake data warehouse, blending and modelling data from multiple sources Work with analysts and engineers to collaboratively design and build new data models as the More ❯
london, south east england, United Kingdom Hybrid / WFH Options
Omaze UK
level. As Senior Analytics Engineer, you will have sole ownership of analytics engineering at Omaze. You will use industry standard tools and platforms (dbt, Snowflake, ThoughtSpot) to amplify the effectiveness and impact of our (growing) analytics team. You will provide clean, tested, well-documented models, and work with our … give the team access to new data sources. What you’ll do: Fully own our dbt project, building and maintaining data models in our Snowflake data warehouse, blending and modelling data from multiple sources Work with analysts and engineers to collaboratively design and build new data models as the More ❯
like Azure Data Factory, Azure Data Lake, or Azure Synapse. Understanding of Azure’s security and identity management practices (e.g., IAM, RBAC). Snowflake Data Warehouse Experience: Designing and optimizing Snowflake schemas for efficient querying. Implementing ETL/ELT pipelines to load and transform data in Snowflake. More ❯
data and improve every decision we make at EFG. This is a unique opportunity to shape our data foundations, optimize our dbt models on Snowflake, and ensure our analysts and business users have access to clean, well-structured, high-performance datasets. Own Your Domain Ask all the whys, relentlessly … data stack - whether that is traditional BI, powering ML models or providing enriched data to operational systems via reverse ETL. Optimize dbt models on Snowflake, ensuring efficiency, reliability, and maintainability. Enforce best practices in dimensional modeling (star/snowflake schemas), normalization, and performance tuning. Write clean, modular SQL … write highly performant queries. You have multiple years of experience with dbt and have optimized models at scale. You have a deep understanding of Snowflake architecture, including performance tuning, clustering, caching, and cost optimization. You have created data sources for highly performant BI and know how best to optimize More ❯
across Pricing, Finance, and Healthcare Management. We are seeking a talented Data Scientist to join our team and utilise our new data platform on Snowflake to drive model insights and forecasting for BUPA, alongside supporting the team with data science techniques. How you'll help us make health happen … challenging existing processes and driving innovation. Experienced in monitoring model performance and ensuring reliability. Experience or understanding of the Insurance industry. Prior use of Snowflake Actuarial Concepts: Knowledge of key actuarial principles. Transformation Experience: Background in environments undergoing data platform transformation. Our benefits are designed to make health happen More ❯