Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Creditsafe
security, and best practices for the warehouse. Develop and maintain documentation on data models, processes, and business rules. SKILLS AND QUALIFICATIONS Data Warehousing: Strong knowledge of the Kimball methodology (star schema, fact & dimension tables). Experience in designing and implementing data models for analytical reporting. ETL/ELT & Data Integration: Hands-on experience with ETL tools (e.g., Azure …
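For illustration, a minimal sketch of the Kimball star schema structure this listing references: one fact table at a declared grain, joined to dimension tables on surrogate keys. All table and column names are hypothetical; Python's built-in sqlite3 stands in for a real warehouse.

```python
import sqlite3

# Minimal star schema: dimensions carry descriptive attributes keyed by
# surrogate keys; the fact table holds measures plus foreign keys out to
# each dimension. Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240131
    full_date  TEXT NOT NULL,
    month_name TEXT NOT NULL,
    year       INTEGER NOT NULL
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key
    customer_id  TEXT NOT NULL,        -- natural key from the source system
    region       TEXT
);
-- Grain: one row per order line.
CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    quantity     INTEGER NOT NULL,
    net_amount   REAL NOT NULL
);
""")
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what makes the model cheap to aggregate and easy to slice in reporting tools.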
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
design and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
design and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
and non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: •Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). •Participate in the design, implementation, and maintenance of data warehouses ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS •Educational Background: Bachelor's or … Analysis Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. •Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. •Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts; experience gathering …
with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. …
etc. · In-depth knowledge of Snowflake architecture, features, and best practices. · Experience with CI/CD pipelines using Git and GitHub Actions. · Knowledge of various data modeling techniques, including Star Schema, dimensional models, and Data Vault. · Hands-on experience with: · Developing data pipelines (Snowflake), writing complex SQL queries. · Building ETL/ELT/data pipelines. · Kubernetes and Linux …
in Microsoft Azure (e.g. Data Factory, Synapse, Azure SQL, Data Lake) Strong proficiency in Power BI, including DAX, data modelling, and deployment Experience designing and implementing data warehouses and star schema models Background in building scalable data pipelines and APIs Excellent communication skills and ability to work with both technical and non-technical stakeholders Previous experience in local …
Azure Databricks, Azure Synapse Analytics, and other Azure data services. Familiarity with Azure Blob Storage, Azure Data Lake, and data lake architectures. Experience working with data modelling, normalization, and star schema design for data warehouses. Proficient in scripting languages such as Python, Shell, or PowerShell for automation tasks. Knowledge of CI/CD practices and tools for data …
fully reconciled Facts and Dimensions with accurate end-user reports Proficiency with reporting tools such as Oracle OAS and Microsoft Power BI Deep understanding of Data Warehouse design, including star schema and dimensional modelling Strong analytical skills and technical aptitude, with the ability to influence system architecture decisions Experience leading testing disciplines within agile projects Self-starter with …
pipelines handling diverse data sources. You'll work closely with our clients to design the correct data models to support their analytic requirements following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or … GCP. Technical Skills Extensive experience with ETL/ELT tools (e.g. Matillion, Informatica, Talend) and cloud data platforms (e.g., Snowflake, Databricks, BigQuery). Expertise in data modelling techniques (e.g., star schema, snowflake schema) and optimising models for analytics and reporting. Familiarity with version control, CI/CD pipelines, and containerisation tools (e.g., Git, Jenkins, Docker, Kubernetes). …
for recurring tasks in line with documentation standards. Essential Skills & Experience Proficient in working with UK mortgage data, including portfolio onboarding, new lending, and securitisations. Experience with Kimball-based star schema data warehouses. Strong capabilities in reporting and data visualisation using: Microsoft SQL Server T-SQL Power BI SSRS Microsoft Excel Azure DevOps Performance tuning techniques Solid understanding …
lost revenue, 4 billion pounds of textile waste, and 10% of global carbon emissions each year. Rather than adjusting sizing after the fact, we prevent bad fit from the start. Using machine learning and generative AI, we simulate and predict how garments will perform on real bodies, eliminating poor fit before a single item is made. It’s … when to use snapshots vs incremental, can use seeds for controlled lookup data, and enjoy keeping models modular and maintainable. Modelling Expertise: Comfortable working across normalized and analytics-ready star schemas Deep understanding of SCD types, particularly Type 2, using dbt snapshots Experience working with multi-tenant datasets and aligning models across clients/domains Strong discipline around testing …
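As an aside on the SCD Type 2 pattern this listing highlights: dbt snapshots automate it, but the core move (close the current row, append a new open-ended one) is easy to show in plain pandas. A rough sketch, with all column names hypothetical:

```python
import pandas as pd

# Sketch of the SCD Type 2 pattern that dbt snapshots automate: when a
# tracked attribute changes, stamp valid_to on the current row and append
# a fresh open-ended row. Column names here are hypothetical.
def scd2_merge(dim: pd.DataFrame, incoming: pd.DataFrame, ts: str) -> pd.DataFrame:
    current = dim[dim["valid_to"].isna()]
    merged = current.merge(incoming, on="customer_id", suffixes=("", "_new"))
    changed = merged[merged["region"] != merged["region_new"]]

    # Close the open rows whose tracked attribute changed.
    dim = dim.copy()
    mask = dim["customer_id"].isin(changed["customer_id"]) & dim["valid_to"].isna()
    dim.loc[mask, "valid_to"] = ts

    # Append the new versions as current (valid_to = NA) rows.
    new_rows = changed[["customer_id", "region_new"]].rename(columns={"region_new": "region"})
    new_rows = new_rows.assign(valid_from=ts, valid_to=pd.NA)
    return pd.concat([dim, new_rows], ignore_index=True)

dim = pd.DataFrame({"customer_id": ["c1"], "region": ["north"],
                    "valid_from": ["2024-01-01"], "valid_to": [pd.NA]})
incoming = pd.DataFrame({"customer_id": ["c1"], "region": ["south"]})
print(scd2_merge(dim, incoming, "2024-06-01"))  # two rows: one closed, one current
```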
design and rollout. Hands-on experience with business intelligence tools, data modelling, data staging, and data extraction processes, including data warehouse and cloud infrastructure. Experience with multi-dimensional design, star schemas, facts and dimensions. Experience and demonstrated competencies in ETL development techniques. Experience in data warehouse performance optimization. Experience on projects across a variety of industry sectors an advantage …
integrity, and security controls. Define data ownership and stewardship roles across data sets. Conduct regular data audits and drive continuous improvement initiatives. Establish data model management in line with star schema principles. Organizing data into fact and dimension tables. Work with key business stakeholders to define new data sets for incorporation into the data platform. Work with the … updated data sets to ensure they align to the required data model, data governance and data quality requirements Ensure that data from various sources is accurately integrated into the star schema. This includes designing ETL (Extract, Transform, Load) processes to move data into the fact and dimension tables. Ensuring that the data model is optimized for performance, working with … the Development team to ensure any performance issues are resolved. Document the schema design, data sources, ETL processes, and any changes made to the schema. Provide training to team members on how to use and maintain the star schema. Work closely with stakeholders, including data analysts, data scientists, developers and IT teams, to understand their data needs and …
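To make the step of moving data into fact and dimension tables concrete, here is a hypothetical sketch of the key-resolution stage of such an ETL process, where natural keys from the source are swapped for the warehouse's surrogate keys before fact rows are written:

```python
import pandas as pd

# Hypothetical dimension-lookup step in a star schema load: incoming
# transactional rows carry natural keys, which are resolved to the
# dimensions' surrogate keys before insertion into the fact table.
dim_customer = pd.DataFrame({
    "customer_key": [1, 2],        # surrogate keys owned by the warehouse
    "customer_id":  ["c1", "c2"],  # natural keys from the source system
})

staged = pd.DataFrame({
    "customer_id": ["c2", "c1"],
    "net_amount":  [120.0, 80.5],
})

# Resolve natural keys to surrogate keys.
fact_sales = (
    staged.merge(dim_customer, on="customer_id", how="left")
          .loc[:, ["customer_key", "net_amount"]]
)
print(fact_sales)
```

The how="left" join deliberately surfaces unmatched natural keys as nulls, which is where a production load would apply its late-arriving-dimension policy.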
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
and comfortable working across both technical and business domains. ✅ Key technical skills: Strong SQL and ELT/data pipeline development experience Expertise in Data Warehouse & Data Lake design (including Star Schema, Snowflake Schema, Data Vault) Hands-on experience with enterprise databases: Oracle, Snowflake, Teradata, or SQL Server Solid understanding of AWS (S3, Lambda, IAM, etc.) Proficiency in …
Bristol, England, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
and comfortable working across both technical and business domains. ✅ Key technical skills: Strong SQL and ELT/data pipeline development experience Expertise in Data Warehouse & Data Lake design (including Star Schema, Snowflake Schema, Data Vault) Hands-on experience with enterprise databases: Oracle, Snowflake, Teradata, or SQL Server Solid understanding of AWS (S3, Lambda, IAM, etc.) Proficiency in …
Manchester Area, United Kingdom Hybrid / WFH Options
TECHOHANA
AI-ready data across commercial, operational, and network functions. Key Responsibilities Build and maintain robust ETL/ELT pipelines across multiple data sources Design data models using Kimball/star schema Implement tooling for data quality checks, lineage, and observability (e.g., dbt tests, Great Expectations, Azure Purview) Work closely with DevOps teams to embed CI/CD and …
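For a flavour of the data quality tooling mentioned here (dbt tests, Great Expectations), below is a minimal hand-rolled equivalent of the two most common checks, not-null and unique, in plain Python; real projects would express these declaratively in the tools named above:

```python
import pandas as pd

# Stand-ins for the two most common dbt schema tests (not_null, unique),
# run before publishing a table. Names are illustrative only.
def assert_not_null(df: pd.DataFrame, column: str) -> None:
    nulls = int(df[column].isna().sum())
    if nulls:
        raise ValueError(f"{column}: {nulls} null value(s) found")

def assert_unique(df: pd.DataFrame, column: str) -> None:
    dupes = int(df[column].duplicated().sum())
    if dupes:
        raise ValueError(f"{column}: {dupes} duplicate value(s) found")

orders = pd.DataFrame({"order_id": [1, 2, 3], "amount": [9.99, 5.00, 12.50]})
assert_not_null(orders, "order_id")
assert_unique(orders, "order_id")
```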
databases, REST APIs, Kafka streams and other sources. Apply data cleansing rules to ensure high data quality standards. Model data into a single source of truth using Kimball methodology (star schema, snowflake, etc.). Develop high-quality code following DevOps and software engineering best practices, including testing and CI/CD. Monitor and maintain business-critical pipelines, reacting …
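A hedged sketch of the ingest-and-cleanse step described above, using the kafka-python client; the topic name, broker address, and field rules are all hypothetical:

```python
import json
from kafka import KafkaConsumer  # kafka-python; assumed available

# Drop records missing a business key; trim and normalise strings.
# The specific rules here are hypothetical examples of cleansing logic.
def cleanse(record: dict) -> dict | None:
    if not record.get("customer_id"):
        return None
    record["customer_id"] = record["customer_id"].strip().upper()
    return record

consumer = KafkaConsumer(
    "orders",                            # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)
for message in consumer:  # blocks and streams indefinitely
    cleaned = cleanse(message.value)
    if cleaned is not None:
        ...  # hand off to the modelled (Kimball) layer
```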
Detail: Keep continuity across reports created by the team while maintaining accuracy and consistency, ensuring insights are reliable and actionable. Who we're looking for Knowledge Designing Kimball/star schema data models. Agile Methodology. Solar and BESS knowledge (Desirable) Qualifications Bachelor's degree in computer science, mathematics, statistics or engineering discipline. Microsoft Power BI Certification. Experience High …
with data security, governance, and regulatory requirements. Develop monitoring and alerting solutions for proactive data pipeline maintenance and incident prevention. Own the technical delivery of our Lakehouse following a Star Schema approach. Stakeholder Collaboration & Business Impact Work closely with business stakeholders, including Product and Data Analysts to deliver data solutions that drive business value. Translate business requirements into … growth and sustainability. Technical Expertise: Extensive experience designing and implementing scalable, metadata-driven data solutions, optimised for analytical consumption and operational robustness. Deep expertise in data modelling, specifically using star schema methodology, and building performant dimensional models to support high-velocity datasets. Strong experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Composer (Apache Airflow), Pub/Sub …
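For context on the GCP stack this listing names, a hedged sketch of a Composer (Apache Airflow) DAG that loads a star schema fact table in BigQuery; project, dataset, and table names are hypothetical, and the operator comes from the google provider package:

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily load of a fact table from staging, resolving
# surrogate keys against the dimensions. Uses Airflow 2.4+ "schedule".
with DAG(
    dag_id="load_fact_sales",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_fact = BigQueryInsertJobOperator(
        task_id="insert_fact_rows",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `my_project.warehouse.fact_sales`
                    SELECT d.date_key, c.customer_key, s.quantity, s.net_amount
                    FROM `my_project.staging.sales` AS s
                    JOIN `my_project.warehouse.dim_date` AS d ON d.full_date = s.sale_date
                    JOIN `my_project.warehouse.dim_customer` AS c ON c.customer_id = s.customer_id
                """,
                "useLegacySql": False,
            }
        },
    )
```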
solutions that power advanced analytics and business intelligence. What You'll Do: Architect and build scalable data pipelines using Microsoft Fabric, PySpark, and T-SQL Lead the development of Star Schema Lakehouse tables to support BI and self-service analytics Collaborate with stakeholders to translate business needs into data models and solutions Mentor engineers and act as a …
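Finally, a hedged sketch of what building a Star Schema Lakehouse table with PySpark can look like; Delta paths, table names, and columns are hypothetical, and in a Fabric notebook the SparkSession would be provided automatically:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical job publishing a star schema fact table in Delta format.
# Assumes a Delta-enabled Spark environment (e.g. a Fabric Lakehouse).
spark = SparkSession.builder.appName("fact_sales_build").getOrCreate()

staged = spark.read.format("delta").load("Tables/staging_sales")
dim_customer = spark.read.format("delta").load("Tables/dim_customer")

# Resolve natural keys to surrogate keys and stamp the load time.
fact_sales = (
    staged.join(dim_customer, on="customer_id", how="left")
          .select("customer_key", "order_date", "quantity", "net_amount")
          .withColumn("load_ts", F.current_timestamp())
)

# A production load would MERGE or overwrite by partition; a plain
# overwrite keeps the sketch simple.
fact_sales.write.format("delta").mode("overwrite").saveAsTable("fact_sales")
```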