the organization's systems and processes. The team has a variety of skills and experience in technical domains such as software development, automation, data pipeline modelling, IT project management and data visualisation. We also pride ourselves on our close integration with non-technical/non-data teams as our success comes … GCP • Experience with Terraform • Strong Python skills • Experience with version control for data models (e.g., dbt testing frameworks, data documentation) • Demonstrated experience with data modelling concepts (dimensional modelling, star schemas) • Experience writing efficient and optimised SQL for large-scale data transformations • Understanding of data warehouse design principles … working with large-scale spatial datasets (billions of rows) and performing geospatial analysis at scale using BigQuery GIS or similar tools • Experience with advanced analytical modelling techniques, including statistical analysis and predictive modelling, particularly applying these to large-scale datasets to derive actionable insights • Knowledge of data governance and More ❯
Bachelor's degree with majors in Computer Science, Information Systems, Statistics, Mathematics & Engineering Minimum 4 years of experience as a BI developer responsible for: Dimensional modelling (Kimball Methodology) Data Modelling (Kimball Methodology) ELT/ETL Dataverse (Data Flows) Data Factory Pipeline Development Dashboarding and Reporting (PBI) Data More ❯
Analytics. Proficiency in Microsoft Azure’s data and integration services, and experience working with APIs to enable cloud-based data flows. Strong experience in dimensional modelling and formal database design. A good understanding of data governance, quality, and security principles. Ability to work both independently and collaboratively, with More ❯
S3, Snowflake, Athena, Glue), In-depth understanding of database structure principles, Strong knowledge of database structure systems and data mining, Excellent understanding of Data Modelling (ERwin, PowerDesigner, Dimensional Modelling) An understanding of SQL/database management. Strong hands-on experience in Data Warehouse and Data Lake technologies More ❯
with over 8 years of experience within this domain, ideally in Oil and Gas. Responsibilities: Requirements capture & assessment Conceptual/logical/physical data modelling/dimensional modelling Data integration & flow design SOA architecture & principles Query & database performance tuning - ETL Cloud computing Globally-distributed database replication/ More ❯
and emerging technologies, including data integration, data warehousing and advanced analytics platforms, with the ability to assess and recommend appropriate solutions. Strong knowledge of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle, with the ability to provide oversight and guidance in applying these concepts effectively. Demonstrated More ❯
Microsoft BI tools with Tableau, Amazon QuickSight, or similar platforms Understanding of REST APIs, Power BI Embedded, and programmatic data access patterns Data Engineering & Modelling Strong T-SQL skills for data retrieval and performance tuning Knowledge of dimensional modelling, star/snowflake schemas, and data warehouse best More ❯
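Several of the listings above ask for dimensional modelling and star/snowflake schema knowledge. As a minimal illustrative sketch (not tied to any specific vacancy), a star schema keeps measures in a fact table keyed by surrogate keys into dimension tables, and reports roll the facts up by dimension attributes. All table and column names below are hypothetical:

```python
# Hypothetical star-schema rollup in plain Python: a fact table of sales
# rows joined to a product dimension via surrogate keys, aggregated by
# a dimension attribute (category).

dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gizmo", "category": "Electronics"},
}
fact_sales = [
    {"product_key": 1, "date_key": 20240101, "amount": 100.0},
    {"product_key": 2, "date_key": 20240101, "amount": 250.0},
]

def sales_by_category(facts, products):
    """Aggregate fact rows by a dimension attribute (a star-schema rollup)."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals
```

In a warehouse this rollup would be a SQL join and GROUP BY over the fact and dimension tables; the Python version only illustrates the key relationships.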
business facing experience. Preferably in the insurance sector. Key responsibilities: Design Data Lake and Data Warehouse solutions. Design Data Models using Data Vault and Dimensional modelling methods Implement automated, reusable and efficient batch data pipelines and streaming data pipelines Work closely with Governance and Quality teams to ensure More ❯
london, south east england, United Kingdom Hybrid / WFH Options
Eden Smith Group
Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques. dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g. Git, Jenkins, Docker). Understanding of More ❯
user reports Proficiency with reporting tools such as Oracle OAS and Microsoft Power BI Deep understanding of Data Warehouse design, including Star schema and dimensional modelling Strong analytical skills and technical aptitude, with the ability to influence system architecture decisions Experience leading testing disciplines within agile projects Self More ❯
data solutions that drive business value. Key Responsibilities Build Data Products: Collaborate with business domains to design and develop ETL/ELT pipelines and dimensional models optimised for Power BI Drive Governance: Define and enforce data ownership, quality, and security standards within the Data Mesh architecture Enable Self-Service … Technical Expertise: Proven experience coding ETL/ELT pipelines with Python, SQL, or ETL tools, and proficiency in Power BI, Tableau, or Qlik Data Modelling Skills: Strong knowledge of dimensional modelling and database principles Governance Experience: Track record of working in democratized data environments, establishing controls and More ❯
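The listing above asks for experience coding ETL/ELT pipelines in Python or SQL. As an illustrative sketch only, the core shape of such a pipeline is extract, transform, load; every function and field name below is invented for the example:

```python
# Minimal ETL sketch in plain Python: extract raw records, cast and
# validate them, and load them into a target list standing in for a
# warehouse table. Names are illustrative, not from any real system.

def extract():
    """Stand-in for reading from a source system or API."""
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(rows):
    """Cast amounts to float and silently drop malformed rows."""
    out = []
    for row in rows:
        try:
            out.append({"id": row["id"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue
    return out

def load(rows, target):
    """Stand-in for writing to a warehouse table."""
    target.extend(rows)
    return target

warehouse_table = load(transform(extract()), [])
```

Production pipelines add orchestration, incremental loads, and data-quality checks on top of this skeleton, but the three-stage structure is the same.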
london, south east england, United Kingdom Hybrid / WFH Options
McCabe & Barton
systems using microservices architectural patterns. DevOps experience implementing development, testing, release, and deployment processes. Knowledge in data modeling (3NF/Dimensional modeling/Data Vault 2.0). Work experience in agile delivery. Able to provide comprehensive documentation. Able to set and manage realistic expectations for timescales More ❯
BI. Solid hands-on experience with Azure Databricks - PySpark coding and Spark SQL coding - Must have. Very good knowledge of data warehousing skills including dimensional modeling, slowly changing dimension patterns, and time travel. Experience in delivering Agile projects with implementation knowledge on CI/CD pipelines, preferably on Azure More ❯
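The listing above mentions slowly changing dimension patterns. The most commonly cited variant, Type 2, preserves history by closing the current dimension row and appending a new one when a tracked attribute changes. A hedged sketch in plain Python, assuming a list-of-dicts dimension table and hypothetical column names:

```python
from datetime import date

# Hypothetical Type 2 slowly changing dimension update: when an attribute
# changes, the current row is end-dated and a new current row is appended,
# so prior values remain queryable as history.

def apply_scd2(dimension, business_key, new_attrs, effective):
    """Close the current row for business_key if attributes changed, then append."""
    for row in dimension:
        if row["business_key"] == business_key and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dimension  # nothing changed; keep the current row
            row["is_current"] = False
            row["valid_to"] = effective
    dimension.append({
        "business_key": business_key,
        **new_attrs,
        "valid_from": effective,
        "valid_to": None,
        "is_current": True,
    })
    return dimension

dim = [{"business_key": "C1", "city": "Leeds",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
apply_scd2(dim, "C1", {"city": "London"}, date(2024, 6, 1))
```

In a warehouse this logic is typically expressed as a MERGE or handled by tooling (e.g. dbt snapshots); the sketch only shows the row-versioning idea.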
necessary. Perform SQL and ETL tuning as necessary. Basic Qualifications: 5+ years of relevant data engineering experience Strong understanding of data modeling principles including dimensional modeling & data normalization. Good understanding of SQL Engines and able to conduct advanced performance tuning. 2+ years of work experience implementing and reporting More ❯
Alexander Mann Solutions - Public Sector Resourcing
postholder in situ), preference may be given to candidates who meet all of the essential criteria and hold active security clearance. MoJ's Data Modelling and Engineering Team is made up of Data Engineers and Analytics Engineers: our Data Engineers build and maintain pipelines to extract from operational systems … and load data to our Analytical Platform and our Analytics Engineers model this data using dimensional modelling principles to provide useful and usable data to downstream users, including analysts, performance teams and data scientists. Over time, this supports migration towards robust, automated and maintainable downstream processes which deliver … downstream data systems, exploring a new data layer for LAA, and conducting detailed analysis of user needs in a single area to feed into dimensional model design. By developing in a flexible manner with the right level of abstraction this will enable expansion to bring a full range of More ❯
Azure T-SQL Development (MS SQL Server 2005 onwards) Python, PySpark Experience of the following systems would also be advantageous: Azure DevOps MDS Kimball Dimensional Modelling Methodology Power BI Unity Catalog Microsoft Fabric Experience of the following business areas would be advantageous: Insurance sector (Lloyds Syndicate, Underwriting, Broking More ❯
technologies used in the enterprise space. Software development experience using: Object-oriented languages (e.g., Python, PySpark) and frameworks Stakeholder Management Expertise in relational and dimensional modelling, including big data technologies. Exposure across all the SDLC process, including testing and deployment. Expertise in Microsoft Azure is mandatory including components More ❯
private equity, private credit, and real estate data, including metrics, KPIs, and market trends. Background in data modeling and data management best practices (e.g., dimensional modeling, data governance, metadata management). Ability to provide insights on complex financial data sets and collaborate with relevant stakeholders. WHAT WE OFFER: We More ❯
SSIS, SSAS, SSRS) to high level of proficiency - Essential C# Programming/JavaScript programming - Desirable Knowledge and Skills Software analysis and design good practice Dimensional modelling, Entity Relationship modelling, Normalised modelling Data warehouse design concepts (Inmon, Kimball) The position is based in Central and you will be More ❯