• Strong Python skills
• Experience with version control for data models (e.g., dbt testing frameworks, data documentation)
• Demonstrated experience with data modelling concepts (dimensional modelling, star schemas)
• Experience writing efficient and optimised SQL for large-scale data transformations
• Understanding of data warehouse design principles and best practices
• Proactive approach to …
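The dimensional-modelling concepts listed above (star schemas, efficient SQL over them) can be illustrated with a minimal sketch: one fact table joined to one dimension table and aggregated. All table and column names are invented for illustration, using Python's built-in sqlite3 rather than a real warehouse.

```python
import sqlite3

# A toy star schema: a central fact table referencing a dimension table.
# Table and column names are illustrative, not from any real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales  VALUES (1, 1, 10, 100.0), (2, 2, 5, 75.0), (3, 1, 2, 20.0);
""")

# A typical star-schema query: join the fact to a dimension and
# aggregate a measure by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
""").fetchall()
print(rows)  # [('Hardware', 195.0)]
```

The shape is the point: narrow integer keys on the fact table, descriptive attributes on the dimension, and analytical queries that join on the key and group by the attribute.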
… order to stay ahead of the curve. Expertise in writing SQL (clean, fast code is a must) and in data-warehousing concepts such as star schemas, slowly changing dimensions, ELT/ETL, and MPP databases. Experience in transforming flawed/changing data into consistent, trustworthy datasets, and in developing …
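One of the warehousing concepts named above, slowly changing dimensions, can be sketched in plain Python. This shows the Type 2 pattern (close the current row, append a new versioned row); the record layout with `valid_from`/`valid_to`/`is_current` is one common convention, assumed here for illustration.

```python
from datetime import date

def scd2_update(history, key, new_attrs, as_of):
    """Apply a Type 2 slowly-changing-dimension update: close the
    current row for `key` and append a new current row. `history` is a
    list of dicts with valid_from/valid_to/is_current fields (an
    illustrative layout, not a standard one)."""
    for row in history:
        if row["key"] == key and row["is_current"]:
            if row["attrs"] == new_attrs:
                return history  # no change, nothing to version
            row["valid_to"] = as_of       # close the old version
            row["is_current"] = False
    history.append({
        "key": key,
        "attrs": new_attrs,
        "valid_from": as_of,
        "valid_to": None,
        "is_current": True,
    })
    return history

history = [{"key": "C1", "attrs": {"city": "Leeds"},
            "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
scd2_update(history, "C1", {"city": "York"}, date(2024, 6, 1))
print([(r["attrs"]["city"], r["is_current"]) for r in history])
# [('Leeds', False), ('York', True)]
```

In a real warehouse the same logic is usually expressed as a MERGE statement or a dbt snapshot rather than row-by-row Python.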
… Power BI reports. Using Power Query, M and DAX to create measures, calculated columns and tables. Modelling the data to produce efficient reports using a star schema, and understanding and creating the appropriate relationships within the model. Monitor and maintain the refresh and usage of the live reports on …
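The measure-building work described above has a simple core idea: a measure is an aggregation evaluated under a filter context. This plain-Python stand-in mimics a DAX-style SUM measure with optional filters; the sales data and column names are invented, and no actual DAX is involved.

```python
# A measure, conceptually: an aggregation evaluated under a filter context.
# Plain-Python stand-in for a DAX-style SUM measure; the data is invented.
sales = [
    {"region": "North", "amount": 120.0},
    {"region": "South", "amount": 80.0},
    {"region": "North", "amount": 50.0},
]

def total_sales(rows, **filters):
    """Sum `amount` over rows matching every column=value filter."""
    return sum(r["amount"] for r in rows
               if all(r[col] == val for col, val in filters.items()))

print(total_sales(sales))                  # 250.0  (no filter context)
print(total_sales(sales, region="North"))  # 170.0  (filtered context)
```

The same measure returning different values under different filters is exactly what a Power BI visual does when a slicer narrows the context.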
Key Knowledge and Skills: ETL/ELT & Data Warehousing: Detailed knowledge of methodologies and best practices, including big data, cloud technology, and unstructured data. Schema Design: Proficiency in star schema structure & design and understanding of Kimball & Inmon hybrid data warehouse design. Cloud Data Products: Experience with Data …
… and maintain conceptual, logical, and physical data models that support both operational and analytical systems. Define and enforce best practices in data modelling (e.g., star schema, snowflake, graph, relational and non-relational structures). Lead the integration of structured and unstructured data across multiple sources, ensuring consistency, integrity …
… to analyse data; Experience with Microsoft databases and analytical tools, including Fabric or Synapse Analytics; Experience in data modelling (creation of E-R diagrams, star schemas) and use of a modelling tool, e.g. Erwin; Knowledge of data cataloguing tools, e.g. Purview. What sets us apart? Global Impact: With offices …
… customers. About you: Expertise in and experience with Apache Spark, PySpark and Python-based pipeline jobs. Solid data lake/data warehouse principles, techniques and technologies: star schema, SQL (AWS EMR, Apache Iceberg, Parquet). Strong database skills and experience are required; we have NoSQL databases as well as relational …
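One data-lake principle behind the technologies named above (Parquet, Iceberg) is partitioned storage, which lets queries read only the slices they need. A toy stdlib sketch of Hive-style date partitioning follows; it writes CSV instead of Parquet purely to stay stdlib-only, and the event fields are invented.

```python
import csv
import tempfile
from pathlib import Path

# Toy sketch of date-partitioned lake storage (Hive-style key=value paths).
# Real pipelines would write Parquet via Spark; CSV keeps this stdlib-only.
events = [
    {"event_date": "2024-06-01", "user": "a", "value": "1"},
    {"event_date": "2024-06-01", "user": "b", "value": "2"},
    {"event_date": "2024-06-02", "user": "a", "value": "3"},
]

root = Path(tempfile.mkdtemp())
for event in events:
    part_dir = root / f"event_date={event['event_date']}"
    part_dir.mkdir(exist_ok=True)
    path = part_dir / "part-0.csv"
    is_new = not path.exists()
    with path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["user", "value"])
        if is_new:
            writer.writeheader()
        writer.writerow({"user": event["user"], "value": event["value"]})

# Partition pruning: read one day's directory instead of scanning everything.
day = root / "event_date=2024-06-01" / "part-0.csv"
with day.open() as f:
    day_rows = list(csv.DictReader(f))
print(len(day_rows))  # 2
```

Engines like Spark and Iceberg do the same pruning automatically from partition metadata, which is much of what makes large-scale scans affordable.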
• Experience with Terraform
• Experience with version control for data models (e.g., dbt testing frameworks, data documentation)
• Demonstrated experience with data modelling concepts (dimensional modelling, star schemas)
• Experience writing efficient and optimised SQL for large-scale data transformations
• Understanding of data warehouse design principles and best practices
• Proactive approach to …
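The dbt testing frameworks mentioned above boil down to assertions over a model's rows; dbt's built-in generic tests include `not_null` and `unique`. A hedged stdlib sketch of those two checks, over an invented orders table:

```python
def check_not_null(rows, column):
    """Return indices of rows where `column` is NULL
    (roughly what dbt's not_null test asserts is empty)."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` appearing more than once
    (roughly what dbt's unique test asserts is empty)."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

orders = [
    {"order_id": 1, "customer_id": "c1"},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": "c3"},
]
print(check_not_null(orders, "customer_id"))  # [1]
print(check_unique(orders, "order_id"))       # [2]
```

A test passes when its returned list of failures is empty; in dbt the same checks are declared in a model's YAML and compiled to SQL rather than written by hand.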