Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
design and delivery at scale (high level and detailed) and various architectural strategies. Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.). Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake
Employment Type: Permanent, Part Time, Work From Home
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
pipelines handling diverse data sources. You'll work closely with our clients to design the correct data models to support their analytic requirements, following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or … GCP. Technical Skills Extensive experience with ETL/ELT tools (e.g. Matillion, Informatica, Talend) and cloud data platforms (e.g., Snowflake, Databricks, BigQuery). Expertise in data modelling techniques (e.g., star schema, snowflake schema) and optimising models for analytics and reporting. Familiarity with version control, CI/CD pipelines, and containerisation tools (e.g., Git, Jenkins, Docker, Kubernetes).
Manchester, North West, United Kingdom Hybrid / WFH Options
We Are Dcoded Limited
the wider data culture. Key Responsibilities: Design and deliver robust data pipelines using Databricks, Azure Data Factory, and Azure SQL. Build and enhance enterprise data warehouse models (facts, dimensions, star schemas). Collaborate with BI and Analytics teams to translate requirements into engineering solutions. Drive adoption of modern DevOps practices (CI/CD, cost optimisation, automation). Advocate for … solutions. Essential Skills: 5 years' experience in Data Engineering. Strong expertise in Databricks, Azure Data Factory, Azure SQL, and Azure Synapse/DW. Solid understanding of dimensional modelling (Kimball, Star Schema) and EDW solutions. Experience working with structured and unstructured data. Familiarity with cloud and DevOps practices, i.e. Azure, CI/CD pipelines, scaling, cost optimisation. Strong problem …
fully reconciled Facts and Dimensions with accurate end-user reports Proficiency with reporting tools such as Oracle OAS and Microsoft Power BI Deep understanding of Data Warehouse design, including star schema and dimensional modelling Strong analytical skills and technical aptitude, with the ability to influence system architecture decisions Experience leading testing disciplines within agile projects Self-starter with …
AWS CDs Proficiency in ETL/ELT processes and best practices Experience with data visualization tools (QuickSight) Strong analytical and problem-solving abilities Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD Type 2) Experience with agile development methodologies Strong communication skills and ability to work with cross-functional teams Background in data governance and …
of enterprise data management and governance principles. Proven experience delivering Business Intelligence (BI) solutions and dashboards to enable data-driven decisions. Experience designing relational and dimensional data models (e.g. star schema, snowflake schema, etc.). Proficient in ETL and data warehousing, including handling slowly changing dimensions. Excellent communication and interpersonal skills, with the ability to liaise confidently between technical …
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
pipelines using modern, cloud-based tools and practices Proficiency in Python and SQL for data engineering tasks Experience with dbt and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling) Familiarity with Airflow or similar orchestration tools Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure Experience using version control …
implementing enterprise-scale data architecture in complex environments Deep understanding of data modelling techniques, including conceptual, logical, and physical modelling Strong expertise in Kimball methodology and dimensional modelling (e.g. star schema design) Experience with modern cloud data platforms, ideally including Microsoft Azure, Databricks, and associated tools (e.g., Azure Data Factory, Azure SQL, Synapse) Familiarity with modern data engineering …
Rochdale, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Footasylum Ltd
You Experience with finance/financial systems and concepts Azure Databricks Azure Data Factory Excellent SQL skills Good Python/Spark/PySpark skills Experience of Kimball methodology and star schemas (dimensional model). Experience of working with enterprise data warehouse solutions. Experience of working with structured and unstructured data Experience of a retail environment preferred A good understanding …
metadata management Performance Tuning: Partitioning, caching, Spark job optimization Cloud Architecture: Infrastructure-as-code, monitoring, automation Finance Domain Knowledge: Experience with financial systems and reporting Data Modelling: Kimball methodology, star schemas Retail Experience: Preferred but not essential About the Team We're a collaborative data function made up of BI Developers and Data Engineers. We work end-to-end …
/ETL pipelines that ingest data from network devices (IPDR, SNMP, OSS/BSS), CRM, finance, and IoT platforms into our BI warehousing environment. • Model data using Kimball/star schemas and data vault principles to support BI and self-service analytics. • Implement data quality, lineage, and observability tooling (e.g., dbt tests, Great Expectations, Azure Purview). • Optimise storage …
solutions that power advanced analytics and business intelligence. What You'll Do: Architect and build scalable data pipelines using Microsoft Fabric, PySpark, and T-SQL Lead the development of Star Schema Lakehouse tables to support BI and self-service analytics Collaborate with stakeholders to translate business needs into data models and solutions Mentor engineers and act as a …
Senior Data Engineer Strong hands-on experience building and maintaining automated data pipelines Advanced proficiency in Python and SQL Experience with dbt and strong understanding of data modelling (e.g., star schema) Proficient with orchestration tools such as Airflow Comfortable working with AWS services (Glue, S3, etc.) or similar cloud platforms Experience with Git, GitHub Actions and CI/CD …
PE & PFE. · Experience working with large-scale, multi-terabyte data warehouses including performance tuning, query optimization and execution plan analysis · Advanced knowledge of data warehousing principles, dimensional modelling and star schema design · Hands-on experience with SQL Server and Snowflake, including their architecture, features and best practices · Familiarity with data integration tools (SSIS, ADF) and techniques (ELT, ETL) …
or similar Experience with modern cloud data platforms such as BigQuery, Fabric, Snowflake Strong grasp of ETL/ELT principles, patterns and best practices Understanding of data modelling, Kimball star schema principles and how to map business processes to data architecture Bonus points for proficiency in Python, particularly for data wrangling with pandas, automating workflows, or developing custom …
development experience in a client-facing or consultancy setting Strong command of DAX (CALCULATE, SUMX, VAR, FILTER, REMOVEFILTERS, time intelligence) Proficiency in SQL and solid understanding of tabular/star schema models Experience working with cloud platforms, ETL tools, and performance tuning Confident leading sessions with both technical and non-technical stakeholders Why Apply This is a key …
teams and key stakeholders to identify, plan, develop and deliver data services to meet business requirements. Ensure appropriate documentation is produced mapping data from Source to Data Mart. Ensure Schema Diagrams are produced to inform report authors of the Data Model structure. Work with the Data Warehouse Lead and the wider team on the development of the Data Warehouse … and supporting the team, enabling them to deliver departmental goals. Person specification Working in a Data Warehouse team creating ETL routines A working knowledge of Data Modelling techniques, e.g. star schema Previous experience of managing a team Essential Technical Skills: Experience of implementing Data Warehouse concepts. Experience of using Databases Experience of using ETL tools Desirable Technical Skills …