London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Databricks, Azure SQL, and Data Factory. Deep technical knowledge of SQL Server including stored procedures and complex data transformation logic. Proven experience in designing and delivering data warehousing and dimensional modelling solutions. Excellent collaboration skills with a track record of working in Agile teams. Experience with Azure DevOps, Git, and CI/CD pipelines. Comfortable liaising directly with …
City of London, London, England, United Kingdom Hybrid / WFH Options
Avanti
in SQL and Python for data transformation and workflow automation Experience with AWS data tools (e.g. Redshift, Glue, Lambda, S3) and infrastructure tools such as Terraform Understanding of data modelling concepts (e.g. dimensional models, star/snowflake schemas) Knowledge of data quality, access controls, and compliance frameworks Nice to Have Experience with orchestration or pipeline frameworks like Airflow …
Cpl Life Sciences Business Intelligence (Data modelling, data warehousing, dashboarding) SQL & Python AWS (S3, Lambda, Glue, Redshift) The Senior Business Intelligence Engineer occupies a unique role at the intersection of technology, marketing, finance, statistics, data mining, and social science. We provide the … using AWS CDK - Proficiency in ETL/ELT processes and best practices - Experience with data visualization tools (Quicksight) Required Skills: - Strong analytical and problem-solving abilities - Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD Type 2) - Experience with agile development methodologies - Strong communication skills and ability to work with cross-functional teams - Background in data …
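Several postings on this page ask for "star schema design (facts, dimensions, SCD Type 2)". As a minimal sketch of what those terms usually mean in practice, here is a generic SQL example; the table and column names are hypothetical, not taken from any posting, and syntax details vary by warehouse.

```sql
-- Minimal star schema sketch: one fact table joined to dimensions
-- by surrogate keys. dim_customer is kept as SCD Type 2: each
-- attribute change closes the old row and inserts a new one, so
-- history is preserved via validity dates.

CREATE TABLE dim_customer (
    customer_key   INT PRIMARY KEY,      -- surrogate key
    customer_id    VARCHAR(20),          -- natural/business key
    customer_name  VARCHAR(100),
    region         VARCHAR(50),
    valid_from     DATE NOT NULL,
    valid_to       DATE,                 -- NULL = still open
    is_current     BOOLEAN NOT NULL
);

CREATE TABLE dim_date (
    date_key       INT PRIMARY KEY,      -- e.g. 20240131
    full_date      DATE NOT NULL,
    year           INT,
    month          INT
);

CREATE TABLE fact_sales (
    date_key       INT REFERENCES dim_date (date_key),
    customer_key   INT REFERENCES dim_customer (customer_key),
    order_id       VARCHAR(20),
    quantity       INT,
    net_amount     DECIMAL(12, 2)
);

-- Typical reporting query: facts aggregated through dimensions.
SELECT d.year, c.region, SUM(f.net_amount) AS revenue
FROM fact_sales f
JOIN dim_date d     ON d.date_key = f.date_key
JOIN dim_customer c ON c.customer_key = f.customer_key
WHERE c.is_current = TRUE
GROUP BY d.year, c.region;
```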
solutions to both technical and non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: • Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). • Participate in the design, implementation, and maintenance of data warehouses ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS • Educational Background: Bachelor … optimization. • Programming/Statistical Analysis Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. • Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. • Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts …
with MS SQL Server, T-SQL, and performance tuning for reporting workloads. An understanding of SSRS and SSIS for traditional reporting and ETL processes. Data Warehousing Concepts: Understanding of dimensional modeling, fact and dimension tables. Solid understanding of data visualisation principles and dashboard design best practices. Familiarity with Azure DevOps version control for Power BI and SQL development. Performance …
Deep expertise in designing and implementing enterprise BI and analytics architectures. Proven experience with modern cloud data warehouse platforms, preferably Snowflake. Advanced proficiency in data modeling, data warehousing, and dimensional modeling concepts. Strong command of data transformation and pipeline tools, such as dbt, Apache Spark, or equivalent. Expertise in implementing data governance frameworks, including data quality management, metadata management …
capabilities, they are evolving toward a clearer separation between Data Engineering, Analytics Engineering, and Data Product disciplines. This role will sit firmly in the Analytics Engineering function, focused on modelling and building the semantic layer that powers consistent, reliable insights across the company’s BI and data science platforms. This role will focus on the “middle layer”, designing dimensional … other downstream consumers. Work closely with Data Engineers responsible for ingestion (from source systems to raw layers such as S3 or cloud storage), but focus your efforts on the modelling and transformation stage. Collaborate with the Data Product team to ensure the semantic layer serves evolving business and analytical needs. Support best practices in CI/CD (using GitHub … maintaining dbt pipelines. Contribute to a common, reusable data model that serves BI, Data Science, and AI/ML teams alike. Required Skills & Experience: Strong experience with SQL and dimensional modelling in dbt. Proven experience building and maintaining semantic layers in modern data platforms. Familiarity with Medallion architecture, CI/CD processes (GitHub), and version-controlled data workflows. …
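As a rough sketch of the "modelling and transformation stage" work this role describes, a dbt model in that middle layer might look like the following; the model and column names (stg_customers, dim_customer, etc.) are hypothetical and only illustrate the pattern.

```sql
-- models/marts/dim_customer.sql
-- Hypothetical dbt model: builds a conformed customer dimension in
-- the transformed ("silver"/"gold") layer from a staged raw source.

{{ config(materialized='table') }}

with source as (

    -- ref() keeps lineage version-controlled and resolves per
    -- environment, which is what makes CI/CD over dbt practical
    select * from {{ ref('stg_customers') }}

),

renamed as (

    select
        customer_id,
        customer_name,
        coalesce(region, 'UNKNOWN') as region,  -- standardise nulls
        loaded_at
    from source

)

select * from renamed
```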
City Of London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
Microsoft Azure ecosystem Exposure to or interest in: Microsoft Fabric and its evolving role in enterprise data platforms Azure DevOps for CI/CD and deployment T-SQL and dimensional modelling (Kimball methodology) Experience in Financial Services or Lloyd's market is a plus Apply now or get in touch to find out more - alexh@pioneer-search.com …
data visualization platforms. Demonstrable experience planning and executing complex reporting & analytics projects across multiple stakeholders. Understanding of data quality frameworks and the importance of availability of reliable data Knowledge of dimensional modelling and experience Strong analytical thinking and problem-solving skills with the ability to interpret complex data and provide actionable insights. Curiosity and willingness to explore complex and …
useful but not mandatory – we'll help you get those! Core: SQL Server (T-SQL and DBA) Microsoft Fabric Power BI, DAX Supporting: Python, PySpark Microsoft Purview Relational and Dimensional Modelling Data Warehouse Theory Data platform Architecture models such as Lakehouse Data Science Master Data Management Data Governance We don’t believe hiring is a tick box exercise …
to-end BI development projects using Microsoft BI stack (SSIS, SSAS, SSRS) Design and implement complex SQL queries, stored procedures, and performance tuning Contribute to data warehouse architecture and dimensional modeling Work within OLAP and ETL frameworks to support scalable BI solutions Collaborate with cross-functional teams to translate business needs into technical solutions Required Skills & Experience: Proven experience …
experience; expert level 1+ years of dbt experience preferred Familiarity with BI tools such as Tableau, Looker, or similar Excellent presentation and communication skills Experience in Data Architecture for Dimensional Models, Data Lakes, and Data Lakehouses Experience working in event-driven architectures and cloud platforms (AWS, Azure, GCP) Experience with Snowflake, Redshift, and ETL tools like Fivetran or Stitch …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
SQL to create scalable, intuitive data solutions that drive business value. Key Responsibilities Build Data Products: Collaborate with business domains to design and develop ETL/ELT pipelines and dimensional models optimised for Power BI Drive Governance: Define and enforce data ownership, quality, and security standards within the Data Mesh architecture Enable Self-Service: Create intuitive data models and … and collaboration. You should bring: Technical Expertise: Proven experience coding ETL/ELT pipelines with Python, SQL, or ETL tools, and proficiency in Power BI, Tableau, or Qlik Data Modelling Skills: Strong knowledge of dimensional modelling and database principles Governance Experience: Track record of working in democratized data environments, establishing controls and guardrails Collaboration & Communication: Ability to …
s detail-oriented, solution-driven, and pragmatic - someone who takes ownership of their work and is excited to build product-focused data models. What You’ll Work On Data Modelling & Transformation Build and maintain dbt models to transform raw data into clean, documented, and accessible data sets Translate business and analytics requirements into scalable data models Design and implement … data warehouse schemas using dimensional modelling techniques (fact and dimension tables, slowly changing dimensions, etc.) Participate in design and code reviews to improve model design and query performance Testing, Documentation, and CI/CD Implement and maintain dbt tests to ensure data quality and model accuracy Document data models clearly to support cross-functional use Use GitHub and … use What We’re Looking For Required Skills & Experience 2+ years of building and optimising complex SQL (including complex joins, window functions and optimisation methods) Strong understanding of data modelling and warehouse design (e.g., Kimball-style dimensional modelling) Experience using dbt in production environments, including testing and documentation Familiar with version control (GitHub) Experience tuning dbt models …
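To make the "dbt tests" and slowly-changing-dimension requirements above concrete, here is a minimal sketch of a singular dbt test in SQL. The model and columns (dim_customer, customer_id, valid_from, valid_to, is_current) are hypothetical; dbt fails the test if the query returns any rows.

```sql
-- tests/assert_scd2_dim_customer_valid.sql
-- Hypothetical singular dbt test for a Type 2 dimension: flags any
-- natural key that has more than one "current" row, or rows whose
-- validity windows overlap. Assumes one valid_from per key per row.

with too_many_current as (

    select customer_id
    from {{ ref('dim_customer') }}
    where is_current = true
    group by customer_id
    having count(*) > 1

),

overlapping_windows as (

    select a.customer_id
    from {{ ref('dim_customer') }} a
    join {{ ref('dim_customer') }} b
      on  a.customer_id = b.customer_id
      and a.valid_from <> b.valid_from   -- exclude the same row
      -- open-ended rows treated as valid "forever"
      and a.valid_from < coalesce(b.valid_to, cast('9999-12-31' as date))
      and b.valid_from < coalesce(a.valid_to, cast('9999-12-31' as date))

)

select customer_id from too_many_current
union
select customer_id from overlapping_windows
```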