the mobile/telecoms industry would be a bonus!

Key outputs for the role:
• Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies (a minimal dbt sketch follows this listing).
• Translate business requirements from stakeholders into robust, well-documented, and tested dbt models.
• Develop and own workflows within Google Cloud Platform environments, primarily … and implement robust, well-documented, and performant dbt models that serve as a single source of truth for business reporting.
• Implement and champion data quality testing, documentation standards, and modelling best practices within dbt projects.
• Troubleshoot and resolve any issues or errors in the data pipelines.
• Stay updated with the latest cloud technologies and industry best practices to continuously …

… be expecting to see:
• Expert-level proficiency in SQL.
• Deep, practical experience building and architecting data models with dbt.
• A strong understanding of data warehousing concepts and data modelling techniques (e.g., Kimball Dimensional Modelling, One Big Table).
• Solid, hands-on experience within the Google Cloud Platform, especially with BigQuery.
• Proven experience working directly with business …
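For illustration only (not part of the advert): a minimal sketch of the kind of OBT-style dbt model the first bullet describes. The project layout, the staging models `stg_orders` and `stg_customers`, and every column name are assumptions invented for the example.

```sql
-- models/marts/obt_orders.sql (hypothetical dbt project layout)
-- One Big Table (OBT) style: the fact joined to its dimensions in one
-- wide model, so reporting tools read a single source of truth.
-- stg_orders / stg_customers are assumed staging models, not from the advert.
select
    o.order_id,
    o.order_date,
    o.order_total,
    c.customer_id,
    c.customer_name,
    c.customer_region
from {{ ref('stg_orders') }} as o
left join {{ ref('stg_customers') }} as c
    on o.customer_id = c.customer_id
```

In a real project, dbt's built-in generic tests (e.g. `unique` and `not_null` on `order_id`) and model descriptions in the accompanying YAML properties file would cover the "data quality testing and documentation standards" bullet.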
Manchester, North West, United Kingdom (Hybrid / WFH Options)
We Are Dcoded Limited
delivering enterprise-level solutions.

Essential Skills:
• 5 years' experience in Data Engineering.
• Strong expertise in Databricks, Azure Data Factory, Azure SQL, and Azure Synapse/DW.
• Solid understanding of dimensional modelling (Kimball, Star Schema) and EDW solutions (see the star-schema sketch after this listing).
• Experience working with structured and unstructured data.
• Familiarity with cloud and DevOps practices, i.e. Azure, CI/CD pipelines, scaling, cost …
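For illustration only: a minimal Kimball-style star schema sketch in generic ANSI-flavoured DDL (an Azure Synapse dedicated pool would declare constraints differently). All table and column names are invented.

```sql
-- Hypothetical star schema: a fact table carrying foreign keys into a
-- conformed dimension; surrogate keys decouple facts from source IDs.
create table dim_customer (
    customer_key  int primary key,   -- surrogate key
    customer_id   varchar(20),       -- natural/business key from source
    customer_name varchar(100),
    region        varchar(50)
);

create table fact_sales (
    sales_key     bigint primary key,
    customer_key  int references dim_customer (customer_key),
    date_key      int,               -- would join to a dim_date table
    quantity      int,
    net_amount    decimal(18, 2)
);
```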
engagement and communication skills.

Essential technical skills: SQL, Power BI, Power Automate.
Preferred experience: A/B testing, Python, Databricks (especially for the manager role), ETL/ELT pipelines, data modelling (marts, dimensional models; see the mart query sketch after this listing), Git/version control.
Experience in pricing analytics and the Mastercard Test & Learn platform is a bonus.

Details
Start date: ASAP
Duration: Approximately 3 months, with …
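For illustration only: a sketch of the kind of mart-level query a Power BI report might sit on, reusing the hypothetical `fact_sales`/`dim_customer` tables from the previous sketch and assuming an `order_date` column on the fact.

```sql
-- Hypothetical data-mart aggregation: monthly revenue by region.
select
    d.region,
    date_trunc('month', f.order_date) as order_month,
    sum(f.net_amount)                 as revenue,
    count(distinct f.customer_key)    as active_customers
from fact_sales as f
join dim_customer as d
    on f.customer_key = d.customer_key
group by
    d.region,
    date_trunc('month', f.order_date);
```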
engineer or in a similar role, handling large datasets and complex data pipelines.
• Previous experience managing a team.
• Experience with big data processing frameworks and technologies.
• Experience with data modelling and designing efficient data structures.
• Experience with data integration and ETL (Extract, Transform, Load) processes (see the validation sketch after this listing).
• Experience in data cleansing, validation, and enrichment processes.
• Strong programming skills in languages such as Python, Java, or Scala.
• Knowledge of data warehousing concepts and dimensional modelling.
• Understanding of data security, privacy, and compliance requirements.
• Proficiency in data integration and ETL tools.
• Strong analytical skills and the ability to understand complex data structures.
• Capable of identifying data quality issues, troubleshooting problems, and implementing effective solutions.

Disclosure and Barring Service Check
This post is …
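For illustration only: a minimal data-validation sketch of the kind the "data cleansing, validation" requirement points at, surfacing rows that fail basic checks before they are loaded downstream. The `raw_orders` table and all of its columns are invented for the example.

```sql
-- Hypothetical pre-load quality check: report each failing row with
-- the first validation rule it breaks.
select
    order_id,
    case
        when customer_id is null       then 'missing customer_id'
        when order_total < 0           then 'negative order_total'
        when order_date > current_date then 'order_date in the future'
    end as validation_error
from raw_orders
where customer_id is null
   or order_total < 0
   or order_date > current_date;
```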