Exeter, Devon, United Kingdom Hybrid / WFH Options
Livewest
Power BI reports, dashboards and datasets that drive actionable insights using DAX and SQL to support analytical needs. Transform business requirements into technical specifications, applying star schema and Kimball dimensional modelling to optimise data structures for performance and usability. Collaborate with stakeholders to decentralise data access and empower decision-makers. Optimise performance of data models and reports through … sources, data cubes and data models. Experience with a data warehouse development lifecycle and an understanding of development, test and/or production environments. Experience with concepts such as dimensional modelling and data transformation. To be educated to degree level, or possess equivalent professional qualifications and experience. Other organisations may call this role Power BI Developer, Business Intelligence …
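As a rough illustration of the Kimball-style star schema these listings keep asking for — all table and column names below are invented for the example, not taken from any employer's model — a fact table joined to its dimensions for a BI-style aggregate might look like:

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to two dimension tables.
# Table and column names are hypothetical, chosen only for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date    VALUES (20240101, 'Jan'), (20240201, 'Feb');
INSERT INTO fact_sales  VALUES (1, 20240101, 100.0),
                               (1, 20240201, 50.0),
                               (2, 20240101, 75.0);
""")

# A typical report query: aggregate the fact table, grouped and labelled
# by dimension attributes rather than by surrogate keys.
rows = con.execute("""
SELECT p.name, d.month, SUM(f.amount) AS total
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
JOIN dim_date    d ON d.date_key    = f.date_key
GROUP BY p.name, d.month
ORDER BY p.name, d.month
""").fetchall()
print(rows)  # → [('Gadget', 'Jan', 75.0), ('Widget', 'Feb', 50.0), ('Widget', 'Jan', 100.0)]
```

The same shape is what a Power BI or Looker semantic layer sits on top of: narrow dimension tables carrying descriptive attributes, a tall fact table carrying measures.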
with Power BI is an essential requirement for this role. SQL and Excel experience is essential to support the running of internal and external reporting. Knowledge and experience of dimensional modelling, creating new, interactive and engaging reports based on stakeholder requirements. Knowledge and use of at least one analytical programming language (Python preferred, R or Julia); this is … desirable. Knowledge and experience of dimensional modelling with the ability to optimise workflows and analysis for MapReduce processing. Please contact John here at ISR to learn more about our exciting client based in Exeter, Devon and their ongoing growth plans.
Employment Type: Permanent
Salary: £40000 - £45000/annum (plus excellent company benefits)
Hands-on expertise in the Azure ecosystem, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service etc. Expertise in relational and dimensional modelling, including big data technologies. Experience in Azure Infrastructure and Azure DevOps will be a strong plus. In terms of business responsibilities: Client Engagement & Business Growth with a heavy …
the mobile/telecoms industry would be a bonus! Key outputs for the role • Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies. • Translate business requirements from stakeholders into robust, well-documented and tested dbt models. • Develop and own workflows within Google Cloud Platform environments, primarily … and implement robust, well-documented and performant dbt models that serve as a single source of truth for business reporting. • Implement and champion data quality testing, documentation standards, and modelling best practices within dbt projects. • Troubleshoot and resolve any issues or errors in the data pipelines. • Stay updated with the latest cloud technologies and industry best practices to continuously … be expecting to see: • Expert level proficiency in SQL. • Deep, practical experience of building and architecting data models with dbt. • A strong understanding of data warehousing concepts and data modelling techniques (e.g., Kimball, Dimensional Modelling, One Big Table). • Solid, hands-on experience within the Google Cloud Platform, especially with BigQuery. • Proven experience working directly with business …
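The "One Big Table" (OBT) approach the listing above names can be sketched in a few lines — in dbt this would be a SQL model, and every name here is invented for the example: instead of serving joins at query time, dimension attributes are pre-joined ("denormalised") onto each fact row so BI users query a single wide table.

```python
# Hypothetical fact rows and a dimension lookup; names are illustrative only.
fact_orders = [
    {"order_id": 1, "customer_key": 10, "amount": 9.99},
    {"order_id": 2, "customer_key": 20, "amount": 24.50},
]
dim_customer = {10: {"segment": "retail"}, 20: {"segment": "wholesale"}}

# Denormalise: merge each fact row with its dimension attributes up front,
# producing one wide ("big") table with no joins left for the consumer.
obt_orders = [{**row, **dim_customer[row["customer_key"]]} for row in fact_orders]
print(obt_orders[0])
```

The trade-off versus the star schema is the usual one: simpler consumption and fewer query-time joins, at the cost of storage and of re-materialising the table when a dimension changes.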
and performance tuning. Solid experience with Python , ideally using PySpark in Azure Databricks . Hands-on experience with Azure Data Lake Storage Gen2 . Understanding of data warehouse concepts , dimensional modelling , and data architecture . Experience working with Delta Lake and large-scale data processing. Experience building ETL pipelines in Azure Data Factory or similar orchestration tools. Familiarity …
and data analysis, ideally including both raw and aggregated data with the ability to review transformation logic Solid understanding of the full Software Development Lifecycle (SDLC) Appreciation of data modelling techniques (e.g. dimensional modelling, data vault) Strong knowledge of BCBS239 regulations and their practical applications. Experience in analysing business processes and delivering regulatory solutions. Proficiency in creating …
data solutions and making strategic tech decisions without losing sight of day-to-day pragmatism. Familiarity with CI/CD, DevOps, data governance and responsible AI practices. Strong data modelling skills - e.g. dimensional modelling Experience leveraging AI tooling to accelerate and automate tasks Experience with Data Mesh practices is a plus Confidence in setting direction, managing ambiguity …
Baginton, Warwickshire, United Kingdom Hybrid / WFH Options
Arden University
An understanding of database theory and design. Experience with SQL and NoSQL databases. Familiarity with big data technologies and ecosystems such as Microsoft Synapse/Fabric, Hadoop, Spark, Kafka, and others. Skills in data modelling and data warehousing solutions. Experience of dimensional modelling (Kimball). Proven experience in designing and developing ETL/ELT processes. Knowledge of data pipeline tools such as Data Factory, Airflow …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models for BI tools Define and implement data quality, ownership, and security standards Empower business teams with intuitive, self-serve data models Own data products end-to-end, from … design to continuous improvement Promote innovation and best practices in data engineering About You: Strong experience with SQL, Python, and BI tools (e.g., Power BI) Solid understanding of dimensional modelling and data architecture Experience working in governed, decentralised data environments Excellent communication and stakeholder engagement skills Analytical mindset with a focus on delivering business value If you are …
Woking, Surrey, United Kingdom Hybrid / WFH Options
Michael Page
should have: Experience with Azure Data Factory, Azure Synapse, Azure SQL, or Azure Data Lake. Hands-on knowledge of the ETL process and working with large datasets. Understanding of dimensional modelling and data warehousing principles. Familiarity with CI/CD pipelines or monitoring tools for data processes. Solid skills in SQL and basic knowledge of Python scripting. Exposure …
with the potential for extension. This role offers a hybrid working arrangement, requiring 1-2 days per week onsite at Heathrow, Hounslow, with on-site parking available. Responsibilities: Data Modelling: Design and optimize star schema data models tailored to our client's business needs for streamlined analytics and reporting. Collaboration: Work closely with data architects, BI developers, and business … with business goals. Data Quality & Governance: Establish data quality checks and governance practices to ensure accuracy and integrity within data models. Skills/Must have: Proven experience in data modelling using Kimball methodology, with a focus on dimensional modelling and star schemas. Strong proficiency in SQL and experience with data modelling tools like ER Studio, Power …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
need all of these. Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI Significant database experience in Snowflake or Oracle Star schema/dimensional modelling. Excellent SQL skills Good knowledge of standard data formats (XML, JSON, csv, etc) Proven experience of delivering BI solutions for business requirements Experience of developing using an Agile …
developing + maintaining SQL Server database solutions that power core business applications. Strong hands-on SQL Server database development skills including: complex stored procedures, T-SQL, indexing, relational/dimensional modelling, data dashboards. Building/optimising data pipelines and integrations across cloud platforms. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, Azure SQL. …
Slough, Berkshire, United Kingdom Hybrid / WFH Options
Halton Housing
Here at Halton Housing, we are looking for an experienced Data Developer to work across our vibrant organisation. What You'll Do: Coding DAX Measures and Dimensional Models Developing & delivering visually compelling Power BI Dashboards & Reports to specification Developing and maintaining SSRS reports Developing & maintaining ETL pipeline solutions in Azure Data Factory and SSIS, utilising Azure Data Lake & Dev …
experience. Strong background in System Integration, Application Development, or Data-Warehouse projects across enterprise technologies. Experience with object-oriented languages (e.g., Python, PySpark) and frameworks. Expertise in relational and dimensional modelling, including big data technologies. Proficiency in Microsoft Azure components like Azure Data Factory, Data Lake, SQL, Databricks, HDInsight, ML Service. Good knowledge of Python and Spark. Experience …
or equivalent. MS SQL stack (SSIS, SSAS, SSRS) to high level of proficiency - Essential C# Programming/JavaScript programming - Desirable Knowledge and Skills Software analysis and design good practice Dimensional modelling, Entity Relationship modelling, Normalised modelling. Data warehouse design concepts (Inmon, Kimball) The position is based in Central and you will be required to be in the …
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita Consulting
Power BI/Tableau developer, with a solid track record in delivering complex solutions. They will be proficient in desktop and cloud, with expert level experience in calculated metrics, data modelling, and optimising reports. You will Lead the design and development of visualisation dashboards that meet complex data requirements and present information in a clear and concise manner Drive data … design with business users Ability to perform data blending, data aggregation and complex calculated fields when required Possess hands on experience with SQL; knowledge of data warehousing and dimensional modelling will be advantageous Experience using large data platforms such as Snowflake, Databricks or similar Exposure to other visualisation platforms is helpful, but not essential Required Characteristics Cares …
data science, or recommendation/personalisation data products. Experience of cloud based data warehouses and architectures e.g. GCP. A solid understanding of data warehousing: data flows, ETL, data marts, dimensional modelling, data quality/data accuracy, data operation. Strong SQL skills and a good understanding of how to interrogate diverse data sources. Can analyse and interpret data to …
them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
Type: Permanent Key Responsibilities Build and maintain Snowflake-based data platforms. Collaborate with product owners and stakeholders to gather requirements and deliver scalable solutions. Ensure best practices in data modelling, governance, and architecture are applied across projects. Essential Skills Strong knowledge of cloud computing and Snowflake architecture. Proficient in SQL (ANSI-SQL). Solid experience with data models: Dimensional … Background in investment banking or financial services. Familiarity with Power BI and statistics. Ability to support multiple concurrent projects. Tech Stack: Snowflake, ANSI-SQL, Python, GitLab, Snowpark, Cloud Platforms, Dimensional Modelling This is a great opportunity to join a well-established consultancy and work on advanced cloud data solutions in the financial sector. Randstad Technologies Ltd is a …
They collaborate closely with data scientists, analysts, and software engineers to ensure efficient data processing, storage, and retrieval for business insights and decision-making. Their expertise in data modelling, ETL (Extract, Transform, Load) processes, and big data technologies makes it possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data … in software engineering a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
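The extract-transform-load responsibility described above can be condensed into a small sketch — the source data, column names, and cleaning rule here are all invented for illustration, and a production pipeline would of course use an orchestrator such as Data Factory or Airflow rather than inline function calls:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: one numeric column with whitespace and a bad row.
raw_csv = "id,amount\n1, 10.5 \n2,not_a_number\n3,7.25\n"

def extract(text):
    """Extract: parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: strip whitespace and drop rows whose amount does not parse."""
    clean = []
    for r in rows:
        try:
            clean.append((int(r["id"]), float(r["amount"].strip())))
        except ValueError:
            continue  # a real pipeline would quarantine bad rows for review
    return clean

def load(rows, con):
    """Load: write the cleaned rows into the warehouse table."""
    con.execute("CREATE TABLE IF NOT EXISTS amounts (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO amounts VALUES (?, ?)", rows)

con = sqlite3.connect(":memory:")
load(transform(extract(raw_csv)), con)
print(con.execute("SELECT COUNT(*), SUM(amount) FROM amounts").fetchone())  # → (2, 17.75)
```

The separation into three small functions mirrors what the listings mean by a testable pipeline: each stage can be unit-tested against fixture data before being wired into orchestration.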
warehouse using dbt Perform descriptive analysis using Looker visualisations and dashboards Skills Extensive experience with SQL Deep experience with handling data in Python Experience with (or willingness to learn!) dimensional modelling and dbt Familiarity with a BI tool such as Power BI or Looker Great communication skills A creative thinker with a keen attention to detail Interview process …