of a modern data platform solution to handle large and complex data sets. Key tasks:
- Design Data Lake and Data Warehouse solutions
- Design data models using Data Vault and Dimensional Modelling methods
- Implement automated, reusable and efficient batch and streaming data pipelines
- Work closely with Governance and Quality teams to ensure metadata, catalogue, lineage and known …
orchestration and scheduling.
- Proficiency in Python for data manipulation, scripting, and automation.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services.
- Understanding of data warehousing concepts, dimensional modelling, and ELT principles.
- Familiarity with data quality, governance, and security best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in a fast …
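The ELT principle mentioned above means landing raw data first and transforming it afterwards inside the warehouse, rather than cleaning it before load. A minimal sketch using an in-memory SQLite database as a stand-in warehouse; the table, columns and sample rows are all hypothetical:

```python
import sqlite3

# Extract + Load: raw data lands as-is, including messy string amounts
# and inconsistent country codes (hypothetical example data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, country TEXT)")
rows = [(1, "10.50", "uk"), (2, "3.25", "UK"), (3, "7.00", "de")]
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)

# Transform: typing and cleaning happen in SQL, inside the database,
# producing a curated table downstream tools can query.
conn.execute("""
    CREATE TABLE orders AS
    SELECT id,
           CAST(amount AS REAL) AS amount,
           UPPER(country)       AS country
    FROM raw_orders
""")

total_uk = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE country = 'UK'"
).fetchone()[0]
print(total_uk)  # 13.75
```

In a real platform the load target would be a cloud warehouse such as Snowflake and the transform step would typically be orchestrated (e.g. on a schedule), but the pattern is the same: raw layer in, curated layer out.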
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities:
- Develop robust ETL/ELT pipelines and dimensional models for BI tools
- Define and implement data quality, ownership, and security standards
- Empower business teams with intuitive, self-serve data models
- Own data products end-to-end, from design to continuous improvement
- Promote innovation and best practices in data engineering
About You:
- Strong experience with SQL, Python, and BI tools (e.g., Power BI)
- Solid understanding of dimensional modelling and data architecture
- Experience working in governed, decentralised data environments
- Excellent communication and stakeholder engagement skills
- Analytical mindset with a focus on delivering business value
If you are …
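Dimensional modelling, which both listings ask for, organises data into fact tables (foreign keys plus numeric measures) joined to descriptive dimension tables, forming a star schema. A minimal sketch in plain Python with made-up product and sales data:

```python
from collections import defaultdict

# Dimension table: one row per product, carrying descriptive attributes
# (hypothetical example data).
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Hardware"},
    3: {"name": "Licence", "category": "Software"},
}

# Fact table: one row per sale, carrying only a foreign key and measures.
fact_sales = [
    {"product_key": 1, "quantity": 2, "revenue": 20.0},
    {"product_key": 2, "quantity": 1, "revenue": 15.0},
    {"product_key": 3, "quantity": 5, "revenue": 50.0},
]

# Roll up a measure by a dimension attribute via the fact-to-dimension join;
# in a BI tool this is the query the star schema makes cheap and intuitive.
revenue_by_category = defaultdict(float)
for row in fact_sales:
    category = dim_product[row["product_key"]]["category"]
    revenue_by_category[category] += row["revenue"]

print(dict(revenue_by_category))  # {'Hardware': 35.0, 'Software': 50.0}
```

In practice the dimension and fact tables would live in the warehouse (e.g. Snowflake) and the roll-up would be a SQL join surfaced through Power BI, but the separation of measures from attributes is the core of the technique.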