London, South East England, United Kingdom (Hybrid / WFH Options)
i3
in the dynamic London Market. 🔧 What You'll Be Doing Collaborate with the Lead Data Engineer and Business Analysts to define and deliver data solutions Design and support a star schema-based data warehouse Perform application data modelling and develop internal/external data integrations Build insightful and actionable Power BI dashboards and reports Provide proactive data support … Document solutions, processes, and technical decisions clearly ✅ What We’re Looking For Technical Skills Strong SQL Server and T-SQL expertise Deep understanding of data warehouse (DWH) modelling and star schema design Hands-on experience with Azure Data Factory, Pipelines, and Fabric Proficient with Power BI dashboard/report development Familiar with DevOps, Git, and modern CI/ …
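To make the star schema design this role centres on concrete, below is a minimal sketch. The table names, columns, and London Market flavour (policies, premiums) are hypothetical assumptions, and SQLite is used only so the example is self-contained; the role itself targets SQL Server / T-SQL.

```python
# Minimal star-schema sketch: one fact table surrounded by dimension tables.
# All names and values are illustrative assumptions, not the employer's model.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY, calendar_date TEXT, year INTEGER);
CREATE TABLE dim_policy (policy_key INTEGER PRIMARY KEY, policy_number TEXT, line_of_business TEXT);
-- The fact table sits at the centre of the star, holding measures plus
-- foreign keys to each dimension.
CREATE TABLE fact_premium (
    date_key      INTEGER REFERENCES dim_date(date_key),
    policy_key    INTEGER REFERENCES dim_policy(policy_key),
    gross_premium REAL
);
INSERT INTO dim_date     VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_policy   VALUES (1, 'P-001', 'Marine');
INSERT INTO fact_premium VALUES (20240101, 1, 1500.0);
""")

# A typical BI-style query: measures from the fact, sliced by dimension attributes.
for row in conn.execute("""
    SELECT d.year, p.line_of_business, SUM(f.gross_premium)
    FROM fact_premium f
    JOIN dim_date d   ON d.date_key = f.date_key
    JOIN dim_policy p ON p.policy_key = f.policy_key
    GROUP BY d.year, p.line_of_business
"""):
    print(row)
```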
and orchestration tasks. Advanced database manipulation and administration skills, particularly T-SQL, Azure SQL, as well as proficiency with unstructured and unlinked data management languages. Experience using and implementing schema and metadata design (e.g., star schema, snowflake, normal forms), with strong understanding of master and reference data management. Experience working with Azure/GCP building and managing …
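As a small, hypothetical illustration of the reference data management point in this listing, the sketch below standardises incoming records against a governed reference table; the currency codes and record fields are assumptions made for the example.

```python
# Standardise source records against a governed reference table before loading.
# Reference values and record shape are illustrative assumptions only.
REF_CURRENCY = {"GBP": "Pound Sterling", "USD": "US Dollar", "EUR": "Euro"}

def standardise_currency(raw_code: str) -> str:
    """Normalise a raw currency code and reject anything not in the reference set."""
    code = raw_code.strip().upper()
    if code not in REF_CURRENCY:
        raise ValueError(f"Unknown currency code: {raw_code!r}")
    return code

records = [
    {"policy": "P-001", "currency": " gbp "},
    {"policy": "P-002", "currency": "USD"},
]
for rec in records:
    rec["currency"] = standardise_currency(rec["currency"])
print(records)
```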
PySpark and Python Support the medallion architecture (bronze, silver, gold layers) to ensure a clean separation of raw, refined, and curated data Design and implement dimensional models such as star schemas and slowly changing dimensions Work closely with analysts, governance, and engineering teams to translate business requirements into data solutions Apply data governance and lineage principles to ensure documentation …
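As an illustration of the medallion (bronze/silver/gold) separation and PySpark work this listing describes, here is a minimal hypothetical sketch; the sample rows, column names, and cleansing rules are assumptions, not the actual pipeline.

```python
# Hypothetical medallion flow: raw (bronze) -> cleaned (silver) -> curated (gold).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: raw, as-landed records (in practice read straight from the lake).
bronze = spark.createDataFrame(
    [
        ("CLM-1", "POL-9", "2024-01-05", "1500.00"),
        ("CLM-1", "POL-9", "2024-01-05", "1500.00"),  # duplicate feed row
        ("CLM-2", "POL-7", "2024-01-06", "250.50"),
    ],
    ["claim_id", "policy_id", "claim_date", "claim_amount"],
)

# Silver: deduplicated, typed, minimally validated.
silver = (
    bronze.dropDuplicates(["claim_id"])
          .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
          .withColumn("claim_date", F.to_date("claim_date"))
          .filter(F.col("claim_id").isNotNull())
)

# Gold: curated, analysis-ready output (daily claimed amount per policy),
# shaped to feed a star-schema fact table downstream.
gold = (
    silver.groupBy("policy_id", "claim_date")
          .agg(F.sum("claim_amount").alias("total_claimed"))
)

gold.show()
```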
data streaming Strong proficiency in SQL and Python Familiarity with Azure Data Services and CI/CD pipelines in a DevOps environment Solid understanding of data modelling techniques (e.g., Star Schema) Excellent problem-solving skills and a high attention to detail Desirable: Azure Data Engineer certification Experience working with unstructured data sources (e.g. voice) Exposure to Power BI …
ensure data accuracy, quality, and security. What you’ll bring: Proven BI or Data experience with strong Power BI, DAX, and Power Query skills. SQL and data modelling expertise (star schema design). Strong Excel skills and knowledge of relational databases. Excellent problem-solving and communication skills, with high attention to detail. Experience with SSRS/SSIS/ …
will: Digest data requirements, gather and analyse large scale structured data and validate by profiling in a data environment Understand data structures and data model (dimensional & relational) concepts like Star schema or Fact & Dimension tables, to design and develop ETL patterns/mechanisms to ingest, analyse, validate, normalize and cleanse data Understand and produce ‘Source to Target mapping …
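A rough sketch of the ‘Source to Target mapping’ idea from the listing above, applied as a simple cleanse/normalise step in pandas; the source and target column names and transforms are purely illustrative assumptions.

```python
# Hypothetical source-to-target mapping driving a cleanse/normalise ETL step.
import pandas as pd

# Each source column maps to a target column plus a normalising transform.
SOURCE_TO_TARGET = {
    "PolNo":     ("policy_number", lambda s: s.str.strip()),
    "InceptDt":  ("inception_date", pd.to_datetime),
    "GrossPrem": ("gross_premium", lambda s: pd.to_numeric(s, errors="coerce")),
}

def apply_mapping(source: pd.DataFrame) -> pd.DataFrame:
    target = pd.DataFrame()
    for src_col, (tgt_col, transform) in SOURCE_TO_TARGET.items():
        target[tgt_col] = transform(source[src_col])
    return target

raw = pd.DataFrame({
    "PolNo": [" P001 ", "P002"],
    "InceptDt": ["2024-01-01", "2024-02-15"],
    "GrossPrem": ["1000.50", "not_a_number"],   # bad value becomes NaN
})
print(apply_mapping(raw))
```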
solutions are context-aware and business-aligned. Qualifications & Experience Technical: Proven hands-on experience with data platforms (Azure, AWS, Informatica). Strong understanding of data modelling approaches (e.g., Kimball, Star Schema, Data Vault). Proficiency in cloud and DataOps solutions, particularly within Azure or AWS ecosystems, including event streaming (Azure Event Hubs, Kafka). Experience with Big Data …
London (City of London), South East England, United Kingdom
Experis
context and operational needs. Qualifications & Skills Technical Expertise Proven hands-on experience with data platforms such as Azure, AWS, and Informatica. Strong knowledge of data modelling techniques (e.g., Kimball, Star Schema, Data Vault). Proficiency with cloud-native and DataOps solutions (Azure/AWS stack, event streaming with Azure Event Hubs, Kafka). Experience in Big Data solutions …
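Since several of the listings above name Data Vault alongside Kimball modelling, here is a minimal, hypothetical illustration of the Data Vault pattern: a hub keyed on a hashed business key plus a satellite holding descriptive attributes. The entity, attributes, and source names are assumptions for the example.

```python
# Hypothetical Data Vault load: hub row (business key + hash key) and
# satellite row (descriptive attributes) for a single entity.
import hashlib
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    """Deterministic hash of the normalised business key (a common DV convention)."""
    return hashlib.sha256(business_key.strip().upper().encode("utf-8")).hexdigest()

def load_customer(business_key: str, attributes: dict, record_source: str):
    load_ts = datetime.now(timezone.utc)
    hub_row = {
        "customer_hk": hash_key(business_key),
        "customer_bk": business_key,
        "load_ts": load_ts,
        "record_source": record_source,
    }
    sat_row = {
        "customer_hk": hub_row["customer_hk"],
        "load_ts": load_ts,
        "record_source": record_source,
        **attributes,
    }
    return hub_row, sat_row

hub, sat = load_customer("CUST-001", {"name": "Acme Ltd", "segment": "SME"}, "crm_feed")
print(hub, sat, sep="\n")
```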