London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
engineers and driving adoption of DevOps and CI/CD best practices within the data function
- Contribute to the evolution of a modern event-sourcing architecture, enabling efficient data modelling, streaming, and transformation across platforms
- Collaborate with cross-functional teams - including Business Analysts, Product Owners, and fellow Senior Data Engineers - to translate business needs into robust technical solutions
- Champion … Lake and streaming
- Strong programming skills in Python, with experience in software engineering principles, version control, unit testing and CI/CD pipelines
- Advanced knowledge of SQL and data modelling (dimensional modelling, fact/dimension structures, slowly changing dimensions)
- Managing and querying data lakes or Lakehouses
- Excellent communication skills, with the ability to explain complex technical …
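The event-sourcing architecture mentioned above can be illustrated with a minimal sketch: state is never stored directly, but rebuilt by replaying an append-only event log. The event names and fields below are purely illustrative, not taken from any specific platform:

```python
# Minimal event-sourcing sketch: current state is derived by replaying an
# append-only event log. Event kinds and payload fields are hypothetical.
def apply_event(state, event):
    kind, payload = event
    if kind == "account_opened":
        state[payload["id"]] = 0
    elif kind == "funds_deposited":
        state[payload["id"]] += payload["amount"]
    elif kind == "funds_withdrawn":
        state[payload["id"]] -= payload["amount"]
    return state

log = [
    ("account_opened", {"id": "A1"}),
    ("funds_deposited", {"id": "A1", "amount": 100}),
    ("funds_withdrawn", {"id": "A1", "amount": 30}),
]

# Replay the full log to derive current state; streaming consumers can
# apply the same events incrementally for downstream transformations.
state = {}
for e in log:
    state = apply_event(state, e)
print(state)  # {'A1': 70}
```

Because the log is immutable and ordered, the same events can feed both batch rebuilds and streaming pipelines, which is what makes the pattern attractive for data platforms.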
Demonstrated experience in data engineering within the Azure cloud environment. Proficiency with Azure Data Factory, Synapse Analytics, Databricks, Azure SQL, and Azure Storage. Strong SQL skills and expertise in dimensional modelling (e.g., star/snowflake schemas). Familiarity with Power BI dataflows, DAX, and RLS setup. Hands-on experience with Python, PySpark, or T-SQL for data processing …
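As a minimal sketch of the star-schema pattern these listings refer to, the example below builds one fact table joined to two dimension tables in an in-memory SQLite database. All table and column names are illustrative, not from any specific employer's warehouse:

```python
import sqlite3

# Illustrative star schema: a central fact table with foreign keys into
# dimension tables. Names and values are hypothetical examples only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (20240101, 1, 10.0), (20240101, 1, 5.5);
""")

# Typical BI query shape: aggregate the fact table, slice by dimension
# attributes reached through the star joins.
row = con.execute("""
    SELECT p.product_name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.product_name, d.year
""").fetchone()
print(row)  # ('Widget', 2024, 15.5)
```

A snowflake schema follows the same idea but further normalises the dimension tables (e.g., splitting product category out of `dim_product`), trading simpler joins for reduced redundancy.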
Spotfire Lead Consultant (Spotfire to Power BI Migration) Location: Central London & Remote Type: Contract Spotfire Architect with strong data modelling skills to lead and support the migration of complex Spotfire reports to Power BI. The role requires deep technical expertise in Spotfire's visual and scripting frameworks, alongside the ability to reverse-engineer, redesign and optimise data models for …/dashboards in Spotfire. 8+ years of experience in BI & analytics, with at least 5 years in Spotfire (including architecture, scripting, and advanced configurations). Strong experience with data modelling for BI platforms: dimensional modelling, fact/dimension tables, normalisation/denormalisation. Hands-on expertise with IronPython scripting, custom expressions, data functions (TERR/Python/R …
and Warehouse environments
- Implement CI/CD processes using Azure DevOps for secure, reliable deployment

Technical Skills:
Strong expertise in:
- Power BI and paginated reporting
- SQL and data architecture
- Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling)
- DAX, Visual Studio and data transformation logic
- Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies … for structured and unstructured data

Proficiency in:
- PySpark, T-SQL, Notebooks and advanced data manipulation
- Performance monitoring and orchestration of Fabric solutions
- Power BI semantic models and Fabric data modelling
- DevOps deployment using ARM/Bicep templates
- End-to-end delivery of enterprise BI/data warehouse solutions

Reasonable Adjustments: Respect and equality are core values to us. We …
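The "SCD handling" these listings mention usually means Type 2 slowly changing dimensions: rather than overwriting a changed attribute, the old row is expired and a new version is appended. The sketch below shows the logic in pure Python under hypothetical names; in practice this would be a `MERGE` against a warehouse table:

```python
from datetime import date

# Hypothetical SCD Type 2 merge: expire the current version of a row and
# append a new one when a tracked attribute changes. Names are illustrative.
def scd2_upsert(dim_rows, business_key, new_attrs, as_of):
    """dim_rows: list of dicts with keys 'key', 'attrs',
    'valid_from', 'valid_to', 'is_current'."""
    for row in dim_rows:
        if row["key"] == business_key and row["is_current"]:
            if row["attrs"] == new_attrs:
                return dim_rows        # no change: keep the current version
            row["is_current"] = False  # expire the old version
            row["valid_to"] = as_of
            break
    dim_rows.append({
        "key": business_key,
        "attrs": new_attrs,
        "valid_from": as_of,
        "valid_to": None,              # open-ended current row
        "is_current": True,
    })
    return dim_rows

dim = []
scd2_upsert(dim, "CUST-1", {"city": "London"}, date(2024, 1, 1))
scd2_upsert(dim, "CUST-1", {"city": "Leeds"}, date(2024, 6, 1))
print(len(dim))  # 2: the expired London row plus the current Leeds row
```

Keeping every version lets fact rows join to the dimension as it was at the time of the event, which is what makes Type 2 the default for historical reporting.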
particular focus on enhancing fan engagement through digital platforms.

Key Responsibilities:
- Design and develop ETL/ELT pipelines in Azure and Databricks, ensuring reliability and performance.
- Construct Kimball-style dimensional models to support analytics and reporting.
- Implement automated testing for data quality assurance and validation.
- Ensure compliance with data governance, legal, and regulatory standards.
- Collaborate with the wider …
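The automated data-quality testing listed above typically boils down to a handful of reusable checks run after each load: null checks, uniqueness of dimension keys, and referential integrity between fact and dimension tables. A minimal sketch, with hypothetical column names and plain-Python data standing in for warehouse tables:

```python
# Illustrative post-load data-quality checks of the kind an automated test
# suite would run after an ETL/ELT pipeline. All names are hypothetical.
def check_not_null(rows, column):
    """Every row has a non-null value in `column`."""
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    """No duplicate values in `column` (e.g., a dimension's surrogate key)."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_referential(fact_rows, dim_rows, fk, pk):
    """Every fact foreign key resolves to a dimension primary key."""
    dim_keys = {d[pk] for d in dim_rows}
    return all(f[fk] in dim_keys for f in fact_rows)

dim_product = [{"product_key": 1}, {"product_key": 2}]
fact_sales = [{"product_key": 1, "amount": 10.0},
              {"product_key": 2, "amount": 5.5}]

results = {
    "amount_not_null": check_not_null(fact_sales, "amount"),
    "product_key_unique": check_unique(dim_product, "product_key"),
    "fk_integrity": check_referential(fact_sales, dim_product,
                                      "product_key", "product_key"),
}
print(results)
```

In a real pipeline the same checks would run as assertions in a scheduled job, failing the load (rather than silently publishing bad data) when any check returns False.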