About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing, dbt for transformation and modelling, Azure for cloud infrastructure and orchestration, Fivetran for automated data ingestion, and Power BI and other modern BI tools for reporting and visualisation. 🧠 What You’ll Do: Design and implement … BI developers to deliver insightful, high-performance dashboards. Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and Fivetran. Apply best practices in dimensional modelling, layered architecture, and data quality. ✅ What We’re Looking For: Strong experience in data modelling and SQL. Hands-on experience with dbt and cloud data platforms … like Snowflake or Azure Synapse Analytics. Solid understanding of modern data stack principles, including layered modelling and data warehouse design. Excellent communication skills and the ability to work with stakeholders across technical and non-technical teams. Nice to have: Experience with Power BI or similar BI tools. Familiarity with CI/CD practices in data environments. Exposure to data …
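As a rough illustration of the layered dbt approach this role describes (model, source, and column names below are hypothetical placeholders, not taken from the platform itself), a staging model cleans raw ingested data, and a downstream mart joins it into a dimensional shape:

```sql
-- models/staging/stg_orders.sql (hypothetical): the staging layer
-- only renames and type-casts what ingestion lands in the raw schema.
select
    order_id,
    customer_id,
    cast(order_ts as timestamp) as ordered_at,
    amount as amount_gbp
from {{ source('raw', 'orders') }}
```

```sql
-- models/marts/fct_orders.sql (hypothetical): the mart layer joins
-- staged data to a conformed dimension, ready for Power BI.
select
    o.order_id,
    c.customer_key,
    o.ordered_at,
    o.amount_gbp
from {{ ref('stg_orders') }} as o
join {{ ref('dim_customer') }} as c using (customer_id)
```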
and Dimensions with accurate end-user reports. Proficiency with reporting tools such as Oracle OAS and Microsoft Power BI. Deep understanding of Data Warehouse design, including Star schema and dimensional modelling. Strong analytical skills and technical aptitude, with the ability to influence system architecture decisions. Experience leading testing disciplines within agile projects. Self-starter with initiative and enthusiasm.
business requirements and high-level designs. Ensure alignment of low-level designs with application architecture, high-level designs, and AA Standards, Frameworks, and Policies. Analyse data sets to identify modelling logic and key attributes required for low-level design, and create and maintain appropriate documentation. Develop and update Physical Data Models (PDMs) and participate in design reviews. Lead handover … What do I need? Experienced with data warehouse and business intelligence, including delivering low-level ETL design and physical data models. Proficient in Data Warehousing Design Methodologies (e.g., Kimball dimensional models) and Data Modelling tools (e.g., ER Studio). Strong Data Analysis skills and hands-on experience with SQL/Python for data interrogation. Working knowledge of Cloud …
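To make the Kimball reference concrete, here is a minimal sketch of what such a physical data model might look like in T-SQL; all table and column names (dim_customer, fct_sales) are illustrative assumptions, not part of the role:

```sql
-- Hypothetical Kimball-style star: a slowly changing dimension
-- keyed by a surrogate, and a fact table referencing it.
create table dim_customer (
    customer_key  int identity primary key,  -- surrogate key
    customer_id   varchar(20) not null,      -- natural/business key
    customer_name varchar(100),
    valid_from    date not null,             -- SCD Type 2 validity window
    valid_to      date null,
    is_current    bit not null default 1
);

create table fct_sales (
    customer_key int not null references dim_customer (customer_key),
    date_key     int not null,               -- would reference dim_date
    product_key  int not null,               -- would reference dim_product
    quantity     int not null,
    net_amount   decimal(18, 2) not null
);
```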
City of London, England, United Kingdom Hybrid / WFH Options
Fruition Group
Databricks, Azure SQL, and Data Factory. Deep technical knowledge of SQL Server, including stored procedures and complex data transformation logic. Proven experience in designing and delivering data warehousing and dimensional modelling solutions. Excellent collaboration skills with a track record of working in Agile teams. Experience with Azure DevOps, Git, and CI/CD pipelines. Comfortable liaising directly with …
solutions to both technical and non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: • Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). • Participate in the design, implementation, and maintenance of data warehouses, ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS • Educational Background: Bachelor … optimization. • Programming/Statistical Analysis Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. • Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. • Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts …
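The distinction between the two schema styles named here can be sketched with hypothetical tables: a star keeps each dimension denormalized (one join per dimension), while a snowflake normalizes a dimension's hierarchy into further tables:

```sql
-- Star: category lives directly on the denormalized product dimension.
select p.category_name, sum(f.net_amount) as revenue
from fct_sales f
join dim_product p on p.product_key = f.product_key
group by p.category_name;

-- Snowflake: the category hierarchy is normalized into its own
-- table, trading an extra join for reduced redundancy.
select c.category_name, sum(f.net_amount) as revenue
from fct_sales f
join dim_product p  on p.product_key  = f.product_key
join dim_category c on c.category_key = p.category_key
group by c.category_name;
```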
Experience of designing and developing systems using microservices architectural patterns. DevOps experience implementing development, testing, release, and deployment processes. Knowledge in data modeling (3NF/Dimensional modeling/Data Vault 2.0). Work experience in agile delivery. Able to provide comprehensive documentation. Able to set and manage realistic expectations for timescales, costs, benefits, and measures for …
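For the Data Vault 2.0 mention, a minimal hub-and-satellite fragment looks roughly like the following; the table names and hashing conventions are illustrative assumptions rather than a prescribed design:

```sql
-- Hypothetical Data Vault 2.0 fragment: the hub stores only the
-- business key, the satellite stores descriptive attributes over time.
create table hub_customer (
    customer_hk   char(32) primary key,    -- hash of the business key
    customer_id   varchar(20) not null,    -- business key
    load_dts      datetime2 not null,
    record_source varchar(50) not null
);

create table sat_customer_details (
    customer_hk   char(32) not null references hub_customer (customer_hk),
    load_dts      datetime2 not null,
    customer_name varchar(100),
    segment       varchar(30),
    hash_diff     char(32) not null,       -- detects attribute changes
    primary key (customer_hk, load_dts)
);
```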
data visualization platforms. Demonstrable experience planning and executing complex reporting & analytics projects across multiple stakeholders. Understanding of data quality frameworks and the importance of reliable, available data. Knowledge of dimensional modelling and experience. Strong analytical thinking and problem-solving skills, with the ability to interpret complex data and provide actionable insights. Curiosity and willingness to explore complex and …
capabilities, they are evolving toward a clearer separation between Data Engineering, Analytics Engineering, and Data Product disciplines. This role will sit firmly in the Analytics Engineering function, focused on modelling and building the semantic layer that powers consistent, reliable insights across the company’s BI and data science platforms. This role will focus on the "middle layer", designing dimensional … other downstream consumers. Work closely with Data Engineers responsible for ingestion (from source systems to raw layers such as S3 or cloud storage), but focus your efforts on the modelling and transformation stage. Collaborate with the Data Product team to ensure the semantic layer serves evolving business and analytical needs. Support best practices in CI/CD (using GitHub … maintaining dbt pipelines). Contribute to a common, reusable data model that serves BI, Data Science, and AI/ML teams alike. Required Skills & Experience: Strong experience with SQL and dimensional modelling in dbt. Proven experience building and maintaining semantic layers in modern data platforms. Familiarity with Medallion architecture, CI/CD processes (GitHub), and version-controlled data workflows.
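To sketch the "middle layer" idea in dbt terms (all model and column names below are hypothetical, and the example assumes the dbt_utils package is installed), a gold-layer dimension in a Medallion setup might deduplicate a silver entity and stamp it with a surrogate key for downstream BI and data science use:

```sql
-- models/gold/dim_customer.sql (hypothetical): conform the silver
-- layer into a reusable, deduplicated dimension.
with ranked as (
    select *,
           row_number() over (
               partition by customer_id
               order by updated_at desc
           ) as rn
    from {{ ref('slv_customers') }}
)
select
    {{ dbt_utils.generate_surrogate_key(['customer_id']) }} as customer_key,
    customer_id,
    customer_name,
    country
from ranked
where rn = 1   -- keep only the latest record per customer
```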
business applications. Building/optimising data pipelines and integrations across cloud platforms. Strong hands-on SQL development skills including: MS SQL Server, T-SQL, indexing, stored procedures, relational/dimensional modelling, data dashboards. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, or Azure SQL. Working closely with key stakeholders including data architects, analysts, testers …
developing + maintaining SQL Server database solutions that power core business applications. Strong hands-on SQL Server database development skills including: complex stored procedures, T-SQL, indexing, relational/dimensional modelling, data dashboards. Building/optimising data pipelines and integrations across cloud platforms. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, or Azure SQL.
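As a small illustration of the SQL Server skills listed here (object names are hypothetical), a covering index plus a parameterised stored procedure might look like:

```sql
-- Hypothetical covering index: serves lookups by customer and date
-- without touching the base table for the amount column.
create nonclustered index ix_fct_sales_customer_date
    on dbo.fct_sales (customer_key, date_key)
    include (net_amount);
go

-- Hypothetical stored procedure wrapping a dashboard-feeding query.
create or alter procedure dbo.usp_customer_revenue
    @from_key int,
    @to_key   int
as
begin
    set nocount on;
    select customer_key,
           sum(net_amount) as revenue
    from dbo.fct_sales
    where date_key between @from_key and @to_key
    group by customer_key;
end;
```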
or Data-Warehouse projects, across technologies used in the enterprise space. Software development experience using object-oriented languages (e.g., Python, PySpark) and frameworks. Stakeholder Management. Expertise in relational and dimensional modelling, including big data technologies. Exposure across the full SDLC, including testing and deployment. Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure …
at Landmarc, you'll need a strong foundation in data engineering, with proven experience as a data professional. You should have a solid background in data warehouse design, particularly dimensional modelling, and be confident working with data mining techniques. A deep understanding of database management systems, OLAP, and ETL frameworks is essential, along with hands-on experience using …
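The OLAP-style summaries such a warehouse serves can be sketched with a hypothetical fact table and date dimension; ROLLUP yields subtotals at each hierarchy level plus a grand total:

```sql
-- Hypothetical OLAP-style aggregation: one row per (year, month),
-- plus a subtotal row per year and an overall grand total.
select d.calendar_year,
       d.calendar_month,
       sum(f.net_amount) as revenue
from fct_sales f
join dim_date d on d.date_key = f.date_key
group by rollup (d.calendar_year, d.calendar_month);
```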
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
SQL to create scalable, intuitive data solutions that drive business value. Key Responsibilities: Build Data Products: Collaborate with business domains to design and develop ETL/ELT pipelines and dimensional models optimised for Power BI. Drive Governance: Define and enforce data ownership, quality, and security standards within the Data Mesh architecture. Enable Self-Service: Create intuitive data models and … and collaboration. You should bring: Technical Expertise: Proven experience coding ETL/ELT pipelines with Python, SQL, or ETL tools, and proficiency in Power BI, Tableau, or Qlik. Data Modelling Skills: Strong knowledge of dimensional modelling and database principles. Governance Experience: Track record of working in democratised data environments, establishing controls and guardrails. Collaboration & Communication: Ability to …
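A guardrail of the kind this governance remit implies could be as simple as a reconciliation query run inside the pipeline (table names are hypothetical), where any non-zero count fails the load:

```sql
-- Hypothetical data-quality check: count fact rows with orphaned
-- dimension keys or missing business-critical measures.
select
    sum(case when c.customer_key is null then 1 else 0 end) as orphaned_rows,
    sum(case when f.net_amount  is null then 1 else 0 end) as null_amounts
from fct_sales f
left join dim_customer c on c.customer_key = f.customer_key;
```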