support decision-making, performance tracking, and regulatory needs across corporate functions. Implement and maintain robust data models across key domains using best practices in dimensional modeling, normalization, and semantic layering. Standardize data acquisition, onboarding, ingestion, transformation, and distribution frameworks globally to optimize scalability, open architecture, and delivery speed. Support …
Bachelor's degree with a major in Computer Science, Information Systems, Statistics, Mathematics, or Engineering. Minimum 4 years of experience as a BI developer responsible for: Dimensional modelling (Kimball methodology) ELT/ETL Dataverse (Data Flows) Data Factory pipeline development Dashboarding and reporting (PBI) Data …
with over 8 years of experience within this domain, ideally in Oil and Gas. Responsibilities: Requirements capture & assessment Conceptual/logical/physical data modelling/dimensional modelling Data integration & flow design SOA architecture & principles Query & database performance tuning - ETL Cloud computing Globally-distributed database replication …
Microsoft BI tools with Tableau, Amazon QuickSight, or similar platforms Understanding of REST APIs, Power BI Embedded, and programmatic data access patterns Data Engineering & Modelling: Strong T-SQL skills for data retrieval and performance tuning Knowledge of dimensional modelling, star/snowflake schemas, and data warehouse best …
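The star schema these postings keep asking for is easy to show concretely. Below is a minimal, hypothetical sketch using SQLite: one fact table at a single grain joined to denormalised dimension tables. All table and column names are illustrative, not taken from any posting.

```python
import sqlite3

# Build a tiny star schema in memory: one fact table, two dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, quantity INTEGER, amount REAL);

INSERT INTO dim_date    VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 29.97);
""")

# A typical BI query: aggregate the fact grain, slicing by dimension attributes.
cur.execute("""
SELECT d.year, p.category, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_date d    ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.category
""")
rows = cur.fetchall()
print(rows)  # [(2024, 'Hardware', 29.97)]
```

A snowflake schema would differ only in normalising the dimensions further (e.g. splitting `category` into its own table keyed from `dim_product`), trading simpler storage for an extra join at query time.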
Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques. dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g. Git, Jenkins, Docker). Understanding of …
systems using microservices architectural patterns. DevOps experience in implementing development, testing, release, and deployment processes. Knowledge of data modeling (3NF/dimensional modeling/Data Vault 2.0). Work experience in agile delivery. Able to provide comprehensive documentation. Able to set and manage realistic expectations for timescales …
data solutions that drive business value. Key Responsibilities Build Data Products: Collaborate with business domains to design and develop ETL/ELT pipelines and dimensional models optimised for Power BI Drive Governance: Define and enforce data ownership, quality, and security standards within the Data Mesh architecture Enable Self-Service … Technical Expertise: Proven experience coding ETL/ELT pipelines with Python, SQL, or ETL tools, and proficiency in Power BI, Tableau, or Qlik Data Modelling Skills: Strong knowledge of dimensional modelling and database principles Governance Experience: Track record of working in democratized data environments, establishing controls and …
london, south east england, united kingdom Hybrid / WFH Options
McCabe & Barton
BI. Solid hands-on experience with Azure Databricks - PySpark coding and Spark SQL coding (must have). Very good knowledge of data warehousing skills including dimensional modeling, slowly changing dimension patterns, and time travel. Experience in delivering Agile projects with implementation knowledge of CI/CD pipelines, preferably on Azure …
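The "slowly changing dimension patterns" mentioned above usually mean Type 2 SCD: rather than overwriting a changed attribute, the current dimension row is closed and a new version is opened, preserving history for point-in-time reporting. The sketch below is a hedged, hypothetical illustration in SQLite; table and column names are invented, and a production Databricks implementation would typically use `MERGE INTO` instead.

```python
import sqlite3

# Type 2 SCD sketch: each customer_id can have many versioned rows,
# with validity dates and an is_current flag marking the active version.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
CREATE TABLE dim_customer (
  customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
  customer_id  INTEGER,   -- natural/business key
  city         TEXT,      -- tracked attribute
  valid_from   TEXT,
  valid_to     TEXT,
  is_current   INTEGER
)""")
cur.execute("INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
            "VALUES (42, 'London', '2023-01-01', '9999-12-31', 1)")

def apply_scd2(cur, customer_id, new_city, change_date):
    """If the tracked attribute changed, close the current row and open a new version."""
    cur.execute("SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
                (customer_id,))
    row = cur.fetchone()
    if row and row[0] != new_city:
        cur.execute("UPDATE dim_customer SET valid_to=?, is_current=0 "
                    "WHERE customer_id=? AND is_current=1",
                    (change_date, customer_id))
        cur.execute("INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
                    "VALUES (?, ?, ?, '9999-12-31', 1)",
                    (customer_id, new_city, change_date))

apply_scd2(cur, 42, 'Leeds', '2024-06-01')
cur.execute("SELECT city, is_current FROM dim_customer WHERE customer_id=42 ORDER BY customer_key")
history = cur.fetchall()
print(history)  # [('London', 0), ('Leeds', 1)]
```

The "time travel" requirement in the same sentence refers to a storage-layer feature (e.g. Delta Lake table versioning) rather than this modelling pattern; the two are complementary ways of answering "what did the data look like then?".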
Azure T-SQL development (MS SQL Server 2005 onwards) Python, PySpark Experience of the following systems would also be advantageous: Azure DevOps MDS Kimball dimensional modelling methodology Power BI Unity Catalog Microsoft Fabric Experience of the following business areas would be advantageous: Insurance sector (Lloyd's Syndicate, Underwriting, Broking …
technologies used in the enterprise space. Software development experience using object-oriented languages (e.g., Python, PySpark) and frameworks Stakeholder management Expertise in relational and dimensional modelling, including big data technologies Exposure across the full SDLC, including testing and deployment Expertise in Microsoft Azure is mandatory, including components …
private equity, private credit, and real estate data, including metrics, KPIs, and market trends. Background in data modeling and data management best practices (e.g., dimensional modeling, data governance, metadata management). Ability to provide insights on complex financial data sets and collaborate with relevant stakeholders. WHAT WE OFFER: We …
relationships with key data creators, data owners and data consumers. Ensure data assets are properly defined and maintained within a central data catalogue. Data modelling to transform operational data into analytic/reporting structures such as Kimball-style multi-dimensional models. Take ownership of data issues through to … easily understood and used. Locate and define new data-related process improvement opportunities. Skills and Experience: Essential: • Experience managing/leading a team. • Data modelling, cleansing and enrichment, with experience in conceptual, logical, and physical data modelling. • Familiarity with data warehouses and analytical data structures. • Experience of data quality … software. • Knowledge of orchestration tools and processes (e.g. SSIS, Data Factory, Alteryx) • Power BI development including the data model, DAX, and visualizations. • Relational and dimensional (Kimball) data modelling • Proficiency in SQL (T-SQL, PL/SQL, Databricks SQL) Desirable: • Databricks (or alternative modern data platform such as Snowflake …
BI Developer (SSIS, SSAS, SSRS) Expert in SQL. Should be able to write complex, nested queries and stored procedures. Background in data warehouse design (e.g. dimensional modelling) and data mining In-depth understanding of database management systems, online analytical processing (OLAP) and ETL (extract, transform, load) frameworks AWS awareness …
london, south east england, united kingdom Hybrid / WFH Options
Falcon Smart IT (FalconSmartIT)
london, south east england, united kingdom Hybrid / WFH Options
Ampstek
BI Developer (SSIS, SSAS, SSRS) Expert in SQL. Should be able to write complex, nested queries and stored procedures. Background in data warehouse design (e.g., dimensional modelling) and data mining In-depth understanding of database management systems, online analytical processing (OLAP) and ETL (extract, transform, load) frameworks AWS awareness …
in data architecture and modeling. Proven experience in data modeling, data architecture, and data product design. Proficiency in data modeling standards and techniques (e.g., dimensional models, 3NF, Data Vault 2.0). Experience with analytical and real-time/streaming data solutions. Hands-on experience with data modeling tools (e.g., Erwin …
Data Platform technologies. - On-prem and Azure knowledge - A good understanding of all things data - data integration, warehousing, advanced analytics, etc. - Strong knowledge of dimensional modelling techniques - Familiarity with data engineering tools and automation practices (CI/CD pipelines, DevOps, etc.) This is an excellent role for a …
Collaborating with analytics and data scientists to understand their needs and build models that drive insights and decision making Champion best practices for data modelling, transformation, and governance, in your immediate team and across the business Consult on our evolving data platform, advising on the tooling, infrastructure and practices … efficient and well-structured queries Hands-on experience with dbt, building and maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or …
concepts. Knowledge of agile methodologies and use of the Rally application. Working knowledge of creating Entity Relationship Diagrams, Data Flow Diagrams, and dimensional modelling. Working knowledge of exploratory data analysis techniques. Working knowledge of Quality Center. Working knowledge: understands basic principles and terminology in order to understand …
2+ years of experience in PySpark. Knowledge of GCP Kubernetes would be an advantage. Knowledge of data warehousing concepts with a good understanding of dimensional models. Experience in implementing a metadata framework for data ingestion, data quality, and ETL. Good communication skills and ability to manage IT stakeholders.
designs. Ensure alignment of low-level designs with application architecture, high-level designs, and AA Standards, Frameworks, and Policies. Analyse data sets to identify modelling logic and key attributes required for low-level design, and create and maintain appropriate documentation. Develop and update Physical Data Models (PDMs) and participate … with data warehouse and business intelligence, including delivering low-level ETL design and physical data models. Proficient in Data Warehousing Design Methodologies (e.g., Kimball dimensional models) and data modelling tools (e.g., ER Studio). Strong data analysis skills and hands-on experience with SQL/Python for data …
of translating sophisticated business challenges into effective solution designs for clients. Your Impact Convert existing solutions, spreadsheets, and business problems into advanced Anaplan multi-dimensional models. Adjust existing models to optimize or incorporate new functionality as part of a connected solution. Act as the architectural SME for large-scale …