Bristol, England, United Kingdom Hybrid / WFH Options
Snap Analytics
pipelines handling diverse data sources. You’ll work closely with our clients to design the right data models to support their analytics requirements, following best practices such as Kimball star schemas and snowflake schemas and ensuring performance and ease of use for the client. You’ll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or … GCP. Technical Skills Extensive experience with ETL/ELT tools (e.g., Matillion, Informatica, Talend) and cloud data platforms (e.g., Snowflake, Databricks, BigQuery). Expertise in data modelling techniques (e.g., star schema, snowflake schema) and optimising models for analytics and reporting. Familiarity with version control, CI/CD pipelines, and containerisation tools (e.g., Git, Jenkins, Docker, Kubernetes). …
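By way of illustration, a minimal sketch of the kind of Kimball star schema described above, built in SQLite with hypothetical table and column names (dim_customer, dim_date, fact_orders):

```python
# Minimal Kimball-style star schema: two dimensions and one fact table.
# Table and column names are illustrative, not taken from any client model.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,  -- surrogate key
    customer_id   TEXT,                 -- natural/business key
    customer_name TEXT,
    region        TEXT
);

CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,      -- e.g. 20240131
    full_date TEXT,
    year      INTEGER,
    month     INTEGER
);

CREATE TABLE fact_orders (
    order_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    date_key     INTEGER REFERENCES dim_date (date_key),
    order_amount REAL,
    quantity     INTEGER
);
""")

# A typical analytical query: revenue by region and month via the dimensions.
query = """
SELECT d.year, d.month, c.region, SUM(f.order_amount) AS revenue
FROM fact_orders f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_date d     ON d.date_key = f.date_key
GROUP BY d.year, d.month, c.region;
"""
print(conn.execute(query).fetchall())
```

The surrogate keys keep the fact table narrow, which is what makes the grouped reporting query cheap to run.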
Azure Databricks, Azure Synapse Analytics, and other Azure data services. Familiarity with Azure Blob Storage, Azure Data Lake, and data lake architectures. Experience working with data modelling, normalization, and star schema design for data warehouses. Proficient in scripting languages such as Python, Shell, or PowerShell for automation tasks. Knowledge of CI/CD practices and tools for data …
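As a small, hypothetical example of the Python automation such a role involves, the sketch below pushes a daily extract into Azure Blob Storage using the azure-storage-blob SDK; the container, blob path, and connection-string variable are placeholders:

```python
# Hypothetical automation task: push a daily CSV extract into Azure Blob Storage.
# Container, blob path and the connection-string env var are placeholders.
import os
from azure.storage.blob import BlobServiceClient

def upload_daily_extract(local_path: str, container: str, blob_name: str) -> None:
    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    blob = service.get_blob_client(container=container, blob=blob_name)
    with open(local_path, "rb") as fh:
        blob.upload_blob(fh, overwrite=True)  # overwrite keeps scheduled re-runs idempotent

if __name__ == "__main__":
    upload_daily_extract("orders_2024-01-31.csv", "raw",
                         "orders/2024/01/orders_2024-01-31.csv")
```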
design and rollout. Hands-on experience with business intelligence tools, data modelling, data staging, and data extraction processes, including data warehouse and cloud infrastructure. Experience with multi-dimensional design, star schemas, facts and dimensions. Experience and demonstrated competencies in ETL development techniques. Experience in data warehouse performance optimization. Experience on projects across a variety of industry sectors is an advantage. …
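A toy illustration of the facts-and-dimensions split mentioned above, using pandas and invented column names: a flat staging extract is separated into a customer dimension with surrogate keys and an orders fact keyed against it.

```python
# Illustrative ETL transform step: split a flat staging extract into a
# dimension and a fact table. Column names are hypothetical.
import pandas as pd

staging = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_id": ["C1", "C1", "C2"],
    "customer_name": ["Acme", "Acme", "Bolt"],
    "order_date": ["2024-01-30", "2024-01-31", "2024-01-31"],
    "amount": [120.0, 80.0, 45.5],
})

# Dimension: one row per business key, with a surrogate key assigned on load.
dim_customer = (
    staging[["customer_id", "customer_name"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .assign(customer_key=lambda d: d.index + 1)
)

# Fact: measures plus the foreign key resolved against the dimension.
fact_orders = staging.merge(
    dim_customer[["customer_id", "customer_key"]], on="customer_id", how="left"
)[["order_id", "customer_key", "order_date", "amount"]]

print(dim_customer)
print(fact_orders)
```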
with GCP. Experience with Terraform. Strong Python skills. Experience with version control for data models (e.g., dbt testing frameworks, data documentation). Demonstrated experience with data modelling concepts (dimensional modelling, star schemas). Experience writing efficient and optimised SQL for large-scale data transformations. Understanding of data warehouse design principles and best practices. Proactive approach to identifying and resolving data quality …
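The data quality checks referred to here are typically declared in dbt's YAML test configuration; the sketch below shows the equivalent unique and not_null checks in plain Python, with placeholder table and column names, purely to make the idea concrete:

```python
# Stand-alone checks analogous to dbt's built-in unique / not_null tests,
# written in plain Python rather than dbt YAML. Names are placeholders.
import pandas as pd

def check_not_null(df: pd.DataFrame, column: str) -> bool:
    return not df[column].isna().any()

def check_unique(df: pd.DataFrame, column: str) -> bool:
    return df[column].is_unique

dim_customer = pd.DataFrame({"customer_key": [1, 2, 3],
                             "customer_id": ["C1", "C2", "C3"]})

assert check_not_null(dim_customer, "customer_key"), "customer_key contains NULLs"
assert check_unique(dim_customer, "customer_key"), "customer_key is not unique"
print("all checks passed")
```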
Manchester, England, United Kingdom Hybrid / WFH Options
Capgemini
ability to communicate issues and solutions to a variety of stakeholders, including the facilitation of client workshops. An understanding of key data modelling concepts (e.g., fact and dimension tables, star schemas and snowflake schemas, denormalised tables, and views). Experience with data handling, e.g., data querying, data manipulation or data wrangling to transform raw data into the desired format …
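A small, hypothetical example of the data wrangling described above: reshaping a raw wide extract into a tidy long format with pandas (column names are invented).

```python
# Illustrative data-wrangling step: reshape a raw wide extract into a tidy,
# analysis-ready long format. Column names are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "store": ["Bristol", "Leeds"],
    "sales_jan": [100, 85],
    "sales_feb": [110, 90],
})

tidy = raw.melt(id_vars="store", var_name="month", value_name="sales")
tidy["month"] = tidy["month"].str.replace("sales_", "", regex=False)
print(tidy)
```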
lost revenue, 4 billion pounds of textile waste, and 10% of global carbon emissions each year. Rather than adjusting sizing after the fact, we prevent bad fit from the start. Using machine learning and generative AI, we simulate and predict how garments will perform on real bodies, eliminating poor fit before a single item is made. It’s … when to use snapshots vs incremental, can use seeds for controlled lookup data, and enjoy keeping models modular and maintainable. Modelling Expertise: Comfortable working across normalized and analytics-ready star schemas. Deep understanding of SCD types, particularly Type 2, using dbt snapshots. Experience working with multi-tenant datasets and aligning models across clients/domains. Strong discipline around testing …
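For illustration, a sketch of the Slowly Changing Dimension Type 2 pattern that dbt snapshots automate: when a tracked attribute changes, the current row is closed and a new current row is opened. Field names (valid_from, valid_to, is_current) are assumptions, not any particular project's conventions.

```python
# Sketch of SCD Type 2 logic: close the current row, add a new current row.
# Field names are illustrative.
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class CustomerRow:
    customer_id: str
    tier: str
    valid_from: date
    valid_to: Optional[date]
    is_current: bool

def apply_type2_change(history: list, customer_id: str,
                       new_tier: str, as_of: date) -> list:
    updated = []
    for row in history:
        if row.customer_id == customer_id and row.is_current and row.tier != new_tier:
            updated.append(replace(row, valid_to=as_of, is_current=False))  # close old row
            updated.append(CustomerRow(customer_id, new_tier, as_of, None, True))  # open new row
        else:
            updated.append(row)
    return updated

history = [CustomerRow("C1", "silver", date(2023, 1, 1), None, True)]
history = apply_type2_change(history, "C1", "gold", date(2024, 1, 31))
for row in history:
    print(row)
```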
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Hands-on experience in data solution architecture, design, and deployment. Experience with business intelligence tools, data modeling, staging, extraction, data warehouse, and cloud infrastructure. Knowledge of multi-dimensional design, star schemas, facts, and dimensions. Proficiency in ETL development techniques. Experience optimizing data warehouse performance. Experience across various industry sectors is a plus. Understanding of data management best practices, including …
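One common form of the warehouse performance optimization mentioned above, sketched against a toy SQLite database with invented table names: pre-aggregating a fact table into a monthly summary and indexing the frequently joined key.

```python
# Illustrative optimisation: build a monthly aggregate so dashboards avoid
# scanning the full fact table, and index the common join column.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_orders (order_date TEXT, customer_key INTEGER, amount REAL);
INSERT INTO fact_orders VALUES ('2024-01-30', 1, 120.0), ('2024-01-31', 2, 45.5);

-- Summary table pre-computed during the load.
CREATE TABLE agg_orders_monthly AS
SELECT substr(order_date, 1, 7) AS order_month,
       customer_key,
       SUM(amount) AS total_amount,
       COUNT(*)    AS order_count
FROM fact_orders
GROUP BY order_month, customer_key;

-- Index the frequently filtered/joined column on the base fact table.
CREATE INDEX ix_fact_orders_customer ON fact_orders (customer_key);
""")
print(conn.execute("SELECT * FROM agg_orders_monthly").fetchall())
```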
business analyst role, including data warehousing and business intelligence tools, techniques and technology - Data engineering credentials; working knowledge of a programming language (Java/Python/Scala) - Experience with star schemas, dimensional models, and data marts in traditional data warehouses as well as in big data/advanced analytics domains - Advanced analytical skills; detail-oriented, while also seeing and …
with data security, governance, and regulatory requirements. Develop monitoring and alerting solutions for proactive data pipeline maintenance and incident prevention. Own the technical delivery of our Lakehouse following a star schema approach. Stakeholder Collaboration & Business Impact: Work closely with business stakeholders, including Product and Data Analysts, to deliver data solutions that drive business value. Translate business requirements into … growth and sustainability. Technical Expertise: Extensive experience designing and implementing scalable, metadata-driven data solutions, optimised for analytical consumption and operational robustness. Deep expertise in data modelling, specifically using star schema methodology, and building performant dimensional models to support high-velocity datasets. Strong experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Composer (Apache Airflow), Pub/Sub …
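A minimal sketch of the monitoring-and-alerting idea in Composer/Airflow terms: a daily freshness check that fails the task, and therefore surfaces an alert, when the latest load is too old. The DAG id, table name, and threshold are placeholders, and the BigQuery lookup is stubbed out.

```python
# Minimal Airflow DAG: a freshness check that fails loudly when the most
# recent load is too old. Ids, names and thresholds are placeholders.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def check_freshness(**context):
    # In a real deployment this would query the warehouse for MAX(load_timestamp);
    # stubbed here so the shape of the check is visible.
    last_load = datetime(2024, 1, 31, 2, 0)
    if datetime.utcnow() - last_load > timedelta(hours=26):
        raise ValueError("fact_orders has not loaded in the last 26 hours")

with DAG(
    dag_id="fact_orders_freshness_check",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="check_fact_orders_freshness",
                   python_callable=check_freshness)
```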
London, England, United Kingdom Hybrid / WFH Options
Skipton Building Society
in a collaborative, trusting environment that values diversity of thought and creative solutions. What Do We Need From You? Experience developing Azure Data solutions. Knowledge of data modelling principles (star schema, snowflake, data vault). Experience implementing end-to-end ETL/ELT solutions. Maintaining and optimising an Enterprise Data Warehouse. Data analysis skills. Testing and software release management. …
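As an aside on the data vault pattern listed alongside star and snowflake schemas, a minimal sketch with illustrative names: a hub holding the business key and a satellite holding the descriptive attributes, each stamped with a load date and record source.

```python
# Minimal data vault sketch: hub + satellite. Names and columns are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    hub_customer_key TEXT PRIMARY KEY,  -- hash of the business key
    customer_id      TEXT NOT NULL,     -- business key
    load_date        TEXT NOT NULL,
    record_source    TEXT NOT NULL
);

CREATE TABLE sat_customer_details (
    hub_customer_key TEXT NOT NULL REFERENCES hub_customer (hub_customer_key),
    load_date        TEXT NOT NULL,
    customer_name    TEXT,
    region           TEXT,
    record_source    TEXT NOT NULL,
    PRIMARY KEY (hub_customer_key, load_date)   -- one row per change over time
);
""")
```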
Detail: Keep continuity across reports created by the team while maintaining accuracy and consistency, ensuring insights are reliable and actionable. Who we're looking for Knowledge: Designing Kimball/star schema data models. Agile Methodology. Solar and BESS knowledge (Desirable). Qualifications: Bachelor's degree in computer science, mathematics, statistics or an engineering discipline. Microsoft Power BI Certification. Experience: High …
sector is alive with career potential. What Do We Need From You? Experience in the development of Azure Data solutions. Knowledge of data modelling principles, including common patterns, e.g., star schema, snowflake or data vault. Experience in implementing end-to-end ETL/ELT solutions. Experience in maintaining and optimising an Enterprise Data Warehouse. Knowledge of data …
infrastructures. SQL Master - extensive knowledge of and vast experience with SQL queries. Extensive data modelling experience, especially with low-quality data – knowledge of common methods like third normal form and star schema. Strong shell scripting abilities. Proficiency in using cloud services for data engineering, storage, and analytics. Expert Python developer with a working knowledge of data science. Proficient with …
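A brief, hypothetical example of preparing low-quality data for modelling in Python: standardising the business key, dropping exact duplicates, and quarantining rows that cannot be keyed (column names are invented).

```python
# Illustrative clean-up of a low-quality extract before modelling.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [" C1", "C1", None, "c2"],
    "customer_name": ["Acme", "Acme", "Unknown", "Bolt"],
})

cleaned = raw.copy()
cleaned["customer_id"] = cleaned["customer_id"].str.strip().str.upper()  # standardise key
cleaned = cleaned.drop_duplicates()                                      # exact duplicates

quarantine = cleaned[cleaned["customer_id"].isna()]   # rows needing manual review
cleaned = cleaned.dropna(subset=["customer_id"])      # only keyed rows flow to the model

print(cleaned)
print(quarantine)
```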