City of London, England, United Kingdom Hybrid / WFH Options
Fruition Group
Databricks, Azure SQL, and Data Factory. Deep technical knowledge of SQL Server, including stored procedures and complex data transformation logic. Proven experience in designing and delivering data warehousing and dimensional modelling solutions. Excellent collaboration skills with a track record of working in Agile teams. Experience with Azure DevOps, Git, and CI/CD pipelines. Comfortable liaising directly with …
solutions to both technical and non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: •Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). •Participate in the design, implementation, and maintenance of data warehouses, ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS •Educational Background: Bachelor … optimization. •Programming/Statistical Analysis Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. •Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. •Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts …
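The star-schema technique named above can be sketched concretely. The following is a minimal illustration using Python's built-in sqlite3 module; the table and column names (fact_sales, dim_date, dim_product) are hypothetical, not taken from any listing.

```python
import sqlite3

# Minimal star schema: one fact table surrounded by dimension tables.
# All names here are illustrative, not from any real system.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
""")
con.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
con.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
con.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                [(20240101, 1, 3, 30.0), (20240101, 2, 1, 25.0), (20240102, 1, 2, 20.0)])

# Analytical query: revenue by category and year, joining the fact
# table to its dimensions -- the access pattern a star schema optimises for.
rows = con.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.category, d.year
""").fetchall()
print(rows)  # [('Hardware', 2024, 75.0)]
```

A snowflake schema differs only in that the dimension tables themselves are further normalised (e.g. category split into its own table).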
Experience of designing and developing systems using microservices architectural patterns. DevOps experience implementing development, testing, release, and deployment processes. Knowledge of data modelling (3NF/Dimensional modelling/Data Vault 2.0). Work experience in agile delivery. Able to provide comprehensive documentation. Able to set and manage realistic expectations for timescales, costs, benefits, and measures for …
City of London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
Microsoft Azure ecosystem. Exposure to or interest in: Microsoft Fabric and its evolving role in enterprise data platforms; Azure DevOps for CI/CD and deployment; T-SQL and dimensional modelling (Kimball methodology). Experience in Financial Services or the Lloyd's market is a plus. Apply now or get in touch to find out more - alexh@pioneer-search.com
data visualization platforms. Demonstrable experience planning and executing complex reporting & analytics projects across multiple stakeholders. Understanding of data quality frameworks and the importance of reliable, available data. Knowledge of dimensional modelling and experience … Strong analytical thinking and problem-solving skills with the ability to interpret complex data and provide actionable insights. Curiosity and willingness to explore complex and …
capabilities, they are evolving toward a clearer separation between Data Engineering, Analytics Engineering, and Data Product disciplines. This role will sit firmly in the Analytics Engineering function, focused on modelling and building the semantic layer that powers consistent, reliable insights across the company’s BI and data science platforms. This role will focus on the “middle layer”, designing dimensional … other downstream consumers. Work closely with Data Engineers responsible for ingestion (from source systems to raw layers such as S3 or cloud storage), but focus your efforts on the modelling and transformation stage. Collaborate with the Data Product team to ensure the semantic layer serves evolving business and analytical needs. Support best practices in CI/CD (using GitHub … maintaining dbt pipelines. Contribute to a common, reusable data model that serves BI, Data Science, and AI/ML teams alike. Required Skills & Experience: Strong experience with SQL and dimensional modelling in dbt. Proven experience building and maintaining semantic layers in modern data platforms. Familiarity with Medallion architecture, CI/CD processes (GitHub), and version-controlled data workflows. …
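As a rough sketch of the "middle layer" work described in this role — reshaping raw records into a conforming dimensional structure, with a single shared metric definition for all downstream consumers — here is a small pure-Python illustration. In practice this layer would typically be a set of dbt SQL models; the function names and fields below (build_dimensional_model, revenue_by_region) are hypothetical.

```python
from collections import defaultdict

# Raw-layer records as they might land from ingestion (names are illustrative).
raw_orders = [
    {"order_id": 1, "customer": "acme",   "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "customer": "acme",   "region": "EMEA", "amount": 80.0},
    {"order_id": 3, "customer": "globex", "region": "AMER", "amount": 200.0},
]

def build_dimensional_model(raw):
    """Transform raw rows into a customer dimension (with surrogate keys)
    and a fact table referencing it -- the transformation step that a dbt
    model in the middle layer would normally express in SQL."""
    dim_customer, fact_orders = {}, []
    for row in raw:
        dim_row = dim_customer.setdefault(
            row["customer"],
            {"customer_key": len(dim_customer) + 1,
             "name": row["customer"],
             "region": row["region"]})
        fact_orders.append({"order_id": row["order_id"],
                            "customer_key": dim_row["customer_key"],
                            "amount": row["amount"]})
    return list(dim_customer.values()), fact_orders

def revenue_by_region(dim, fact):
    """A semantic-layer style metric: defined once, so every BI tool or
    notebook that calls it computes revenue the same way."""
    region_of = {d["customer_key"]: d["region"] for d in dim}
    totals = defaultdict(float)
    for f in fact:
        totals[region_of[f["customer_key"]]] += f["amount"]
    return dict(totals)

dim, fact = build_dimensional_model(raw_orders)
print(revenue_by_region(dim, fact))  # {'EMEA': 200.0, 'AMER': 200.0}
```

The point of the pattern is the separation: ingestion lands raw_orders untouched, the modelling stage owns the surrogate keys and conformed dimensions, and consumers only ever see the metric function.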
business applications. Building/optimising data pipelines and integrations across cloud platforms. Strong hands-on SQL development skills, including MS SQL Server, T-SQL, indexing, stored procedures, relational/dimensional modelling, and data dashboards. Experience with any cloud platform would be helpful, such as Snowflake, Databricks, BigQuery, or Azure SQL. Working closely with key stakeholders including data architects, analysts, testers …
developing and maintaining SQL Server database solutions that power core business applications. Strong hands-on SQL Server database development skills, including complex stored procedures, T-SQL, indexing, relational/dimensional modelling, and data dashboards. Building/optimising data pipelines and integrations across cloud platforms. Experience with any cloud platform would be helpful, such as Snowflake, Databricks, BigQuery, or Azure SQL. …
or Data Warehouse projects, across technologies used in the enterprise space. Software development experience using object-oriented languages (e.g., Python, PySpark) and frameworks. Stakeholder management. Expertise in relational and dimensional modelling, including big data technologies. Exposure across the full SDLC, including testing and deployment. Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure …
at Landmarc, you'll need a strong foundation in data engineering, with proven experience as a data professional. You should have a solid background in data warehouse design, particularly dimensional modelling, and be confident working with data mining techniques. A deep understanding of database management systems, OLAP, and ETL frameworks is essential, along with hands-on experience using …
Leeds, England, United Kingdom Hybrid / WFH Options
Medisoft Limited
with data extracts, data analysis and data validation as required. The role requires: In-depth working knowledge of, and experience working with, relational databases. Background in data warehouse design (e.g. dimensional modelling) and data analysis. Advanced MS SQL Server skillset (T-SQL, stored procedures, functions, dynamic SQL). Proven experience of developing complex reports using SSRS or closely equivalent tools …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
SQL to create scalable, intuitive data solutions that drive business value. Key Responsibilities Build Data Products: Collaborate with business domains to design and develop ETL/ELT pipelines and dimensional models optimised for Power BI Drive Governance: Define and enforce data ownership, quality, and security standards within the Data Mesh architecture Enable Self-Service: Create intuitive data models and … and collaboration. You should bring: Technical Expertise: Proven experience coding ETL/ELT pipelines with Python, SQL, or ETL tools, and proficiency in Power BI, Tableau, or Qlik Data Modelling Skills: Strong knowledge of dimensional modelling and database principles Governance Experience: Track record of working in democratised data environments, establishing controls and guardrails Collaboration & Communication: Ability to …
City of London, London, United Kingdom Hybrid / WFH Options
VirtueTech Recruitment Group
central Data Platform, working closely with the Data Management Team. Analysis of normalised transactional data and analytic data sources. Analysis of semi-structured and file-based data extracts. Data modelling to transform operational data into analytic/reporting structures such as Kimball-style multi-dimensional models. Essential Skills for the Data Analyst: Proven Data Analyst experience of at least 5 years, within a Financial Services environment, with particular emphasis on Risk or cross-asset derivatives areas. Data modelling, cleansing, and enrichment, with experience in conceptual, logical, and physical data modelling, and in data cleansing and standardisation. Familiarity with data warehousing and analytical data structures. Data quality assurance, validation, and linkage. Experience creating BI models and dashboards, ideally …
Technology - Deep knowledge of MS Data Platform technologies. - On-prem and Azure knowledge. - A good understanding of all things data - data integration, warehousing, advanced analytics, etc. - Strong knowledge of dimensional modelling techniques. - Familiarity with data engineering tools and automation practices (CI/CD pipelines, DevOps, etc.) This is an excellent role for a proven Senior Data Services Manager …
long-term team growth and sustainability. Technical Expertise: Extensive experience designing and implementing scalable, metadata-driven data solutions, optimised for analytical consumption and operational robustness. Deep expertise in data modelling, specifically using star schema methodology, and building performant dimensional models to support high-velocity datasets. Strong experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Composer (Apache Airflow) …