London, South East, England, United Kingdom (Hybrid / WFH Options)
Involved Solutions
engineers and driving adoption of DevOps and CI/CD best practices within the data function
Contribute to the evolution of a modern event-sourcing architecture, enabling efficient data modelling, streaming, and transformation across platforms
Collaborate with cross-functional teams - including Business Analysts, Product Owners, and fellow Senior Data Engineers - to translate business needs into robust technical solutions
Champion … Lake and streaming
Strong programming skills in Python, with experience in software engineering principles, version control, unit testing and CI/CD pipelines
Advanced knowledge of SQL and data modelling (dimensional modelling, fact/dimension structures, slowly changing dimensions)
Managing and querying data lakes or Lakehouses
Excellent communication skills with the ability to explain complex technical …
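As a minimal illustration of the slowly changing dimensions this role calls for, a Type 2 update keeps history by expiring the current dimension row and inserting a new version. The sketch below is in plain Python; the table and column names are invented for the example and are not taken from the advert.

```python
from datetime import date

# Hypothetical Type 2 slowly changing dimension: history is preserved by
# closing the current row and appending a new one when an attribute changes.
dim_customer = [
    {"customer_id": 42, "city": "Leeds", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim_rows, customer_id, new_city, changed_on):
    """Expire the customer's current row and append the new version."""
    for row in dim_rows:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return dim_rows           # attribute unchanged, nothing to do
            row["valid_to"] = changed_on  # close the old version
            row["is_current"] = False
    dim_rows.append({"customer_id": customer_id, "city": new_city,
                     "valid_from": changed_on, "valid_to": None,
                     "is_current": True})
    return dim_rows

apply_scd2(dim_customer, 42, "London", date(2024, 6, 1))
```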
and Warehouse environments
Implement CI/CD processes using Azure DevOps for secure, reliable deployment
Technical Skills:
Strong expertise in:
Power BI and paginated reporting
SQL and data architecture
Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling)
DAX, Visual Studio and data transformation logic
Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies
… for structured and unstructured data
Proficiency in:
PySpark, T-SQL, Notebooks and advanced data manipulation
Performance monitoring and orchestration of Fabric solutions
Power BI semantic models and Fabric data modelling
DevOps deployment using ARM/Bicep templates
End-to-end delivery of enterprise BI/data warehouse solutions
Reasonable Adjustments: Respect and equality are core values to us. We …
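For context on the star-schema and PySpark skills listed above, a typical dimensional query joins a fact table to its dimensions and aggregates for reporting. This is only a hedged sketch assuming a PySpark environment; the table names (fact_sales, dim_date, dim_product) and columns are hypothetical, not part of the advert.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical star-schema query: a sales fact joined to its date and
# product dimensions, then aggregated into a reporting table.
spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

fact_sales  = spark.read.table("fact_sales")    # grain: one row per order line
dim_date    = spark.read.table("dim_date")
dim_product = spark.read.table("dim_product")

monthly_revenue = (
    fact_sales
    .join(dim_date, "date_key")
    .join(dim_product, "product_key")
    .groupBy("calendar_month", "product_category")
    .agg(F.sum("net_amount").alias("revenue"))
)
monthly_revenue.write.mode("overwrite").saveAsTable("rpt_monthly_revenue")
```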
particular focus on enhancing fan engagement through digital platforms.
Key Responsibilities
Design and develop ETL/ELT pipelines in Azure and Databricks, ensuring reliability and performance.
Construct Kimball-style dimensional models to support analytics and reporting.
Implement automated testing for data quality assurance and validation.
Ensure compliance with data governance, legal, and regulatory standards.
Collaborate with the wider …
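As a small illustration of the automated data quality testing mentioned in the responsibilities, checks like the following can run after a pipeline load and fail fast before a table is published. The sketch assumes a PySpark/Databricks setting; the staging table and column names are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical post-load data quality checks: reject the batch if the
# business key is missing or duplicated before publishing downstream.
spark = SparkSession.builder.getOrCreate()
df = spark.read.table("staging.fact_ticket_sales")  # invented table name

null_keys = df.filter(F.col("ticket_id").isNull()).count()
dupes = df.groupBy("ticket_id").count().filter("count > 1").count()

assert null_keys == 0, f"{null_keys} rows have a null ticket_id"
assert dupes == 0, f"{dupes} duplicate ticket_id values found"
```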