London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Robert Half have partnered with a market-leading property development organisation for an experienced Azure Data Engineer to join their data and analytics team on an initial 3-6 month contract, based out of their London office. The ideal candidate will have a strong background in designing, building, and maintaining scalable data pipelines in the Azure cloud … environment, with hands-on experience in modern data platforms and ETL frameworks.
Key Responsibilities:
• Design, develop, and maintain data pipelines and data integration solutions using Azure Data Factory, Synapse Analytics, and related services.
• Build and optimise data models to support reporting, analytics, and machine learning use cases.
• Implement robust data quality, governance, and security practices across … performance.
• Monitor and troubleshoot data workflows, ensuring timely delivery and high availability of data assets.
Experience:
• 3-5 years of hands-on experience as a Data Engineer working in Azure cloud environments.
• Strong proficiency in Azure Data Factory, Synapse Analytics, Databricks, and Azure SQL.
• Solid understanding of data modelling, ETL/ELT design, and data …
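As an illustration of the data quality and pipeline monitoring work this role describes, below is a minimal PySpark sketch of a basic quality check on a landed table; the path and the orders/order_id names are hypothetical, and the actual checks would follow the platform's own governance rules.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()

# Hypothetical raw table landed by an Azure Data Factory copy activity
orders = spark.read.parquet("/mnt/raw/orders")

# Basic data-quality metrics: volume, null keys, duplicate keys
dq = orders.agg(
    F.count("*").alias("row_count"),
    F.sum(F.col("order_id").isNull().cast("int")).alias("null_order_ids"),
    (F.count("order_id") - F.countDistinct("order_id")).alias("duplicate_order_ids"),
)
dq.show()
```

In practice checks like these would run as a step inside the pipeline and fail the run (or raise an alert) when thresholds are breached.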
East London, London, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
Your new company: One of the largest independent bodies handling complaints regarding financial matters.
Your new role: ETL/SQL/Azure Data Factory Test Lead - SDET - Outside IR35.
What you'll need to succeed: My client is looking for an interim Test Lead with an excellent SDET (Software Development Engineer in Test) background, to join in an immediately … initiatives.
• Strong understanding of ETL processes, including data extraction, transformation, and loading.
• Experience with database migration, validating and testing data movement across heterogeneous systems.
• Hands-on testing experience with Azure Data Factory, including pipelines, triggers, and data flows.
• Familiarity with Azure Synapse Analytics (formerly Azure SQL Data Warehouse) and an understanding of data warehousing concepts and testing …
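To make the ETL-testing remit above concrete, here is a minimal sketch of a source-to-target reconciliation check in PySpark; the paths, table names and the balance column are assumptions, and in practice such checks would normally run inside a test framework after the Data Factory pipeline completes.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-reconciliation").getOrCreate()

# Hypothetical extracts: the source snapshot and the post-load target copy
source = spark.read.parquet("/mnt/staging/customers")
target = spark.read.parquet("/mnt/curated/customers")

# Row-count reconciliation after the load
assert source.count() == target.count(), "Row counts diverge between source and target"

# Column-level reconciliation on a key numeric field
src_total = source.agg(F.sum("balance")).first()[0]
tgt_total = target.agg(F.sum("balance")).first()[0]
assert src_total == tgt_total, "Balance totals diverge after load"
```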
Hybrid
Role Purpose: We are seeking a skilled Data Specialist to join our transformation programme, focusing on oneAdvanced. This role will leverage data science, machine learning, and advanced analytics to drive insights, automation, and optimisation across Finance, Supply Chain, HR, and Operations functions within oneAdvanced. You will be instrumental in building predictive models, data pipelines, and analytics … to Target.
• Design, develop, and deploy predictive and prescriptive models (e.g., demand forecasting, anomaly detection, risk scoring, financial optimisation).
• Build and manage end-to-end data pipelines using Azure Data Factory, Synapse, Dataverse, and Power BI.
• Partner with oneAdvanced functional leads to enhance reporting, analytics, and automation capabilities.
• Cleanse, transform, and integrate data from oneAdvanced, legacy … systems, and external data sources into central data platforms (EDW/Azure).
• Implement machine learning use cases such as cashflow forecasting, working capital optimisation, fraud detection, and HR analytics.
• Ensure compliance with GDPR, SOX, and internal governance when handling sensitive financial and HR data.
• Support the development of a data-driven culture by mentoring analysts and promoting advanced …
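As a sketch of one of the machine-learning use cases listed above (anomaly detection on finance data), the snippet below applies scikit-learn's IsolationForest to made-up monthly spend figures; the data and the contamination setting are purely illustrative.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical monthly spend per cost centre; one value is an obvious outlier
spend = np.array([[12_000], [11_500], [12_300], [95_000], [11_900], [12_100]])

model = IsolationForest(contamination=0.1, random_state=42)
labels = model.fit_predict(spend)  # -1 flags an anomaly, 1 means normal

print(labels)  # the 95,000 entry is expected to be flagged with -1
```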
Engineer
Location: 3 days p/w Central London
Rate: £450 Outside IR35
Duration: 6 months + extensions
This role is responsible for migrating and developing modern data and analytics solutions using Microsoft Fabric and Azure technologies. The position supports enterprise-wide BI initiatives, delivering data models, pipelines, dashboards and reporting assets that enable data-driven decision-making. …
Key Responsibilities:
• Lead and implement data migration into Microsoft Fabric for analytics and reporting use cases.
• Design and build data models from multiple sources to generate actionable insights.
• Maintain and optimise data warehouse platforms; identify and resolve issues.
• Develop data pipelines using Fabric Pipelines, Azure Data Factory, Notebooks and SSIS.
• Produce enterprise-grade Power BI dashboards and paginated reports.
• Translate business requirements into scalable technical BI solutions.
• Write advanced SQL for Fabric Lakehouse and Warehouse environments.
• Implement CI/CD processes using Azure DevOps for secure, reliable deployment.
Technical Skills: Strong expertise in:
• Power BI and paginated reporting
• SQL and data architecture
• Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling)
• DAX, Visual Studio and data transformation …
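To illustrate the dimensional-modelling side of this role, here is a minimal PySpark sketch that joins a fact table to two dimensions in a star schema and aggregates revenue; the Lakehouse table names and columns are assumptions rather than anything specified by the client.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

# Hypothetical star-schema tables registered in a Lakehouse catalogue
fact_sales  = spark.read.table("lakehouse.fact_sales")
dim_date    = spark.read.table("lakehouse.dim_date")
dim_product = spark.read.table("lakehouse.dim_product")

# Classic star-schema query: fact joined to conformed dimensions, then aggregated
monthly_revenue = (
    fact_sales
    .join(dim_date, "date_key")
    .join(dim_product, "product_key")
    .groupBy("year_month", "product_category")
    .agg(F.sum("net_amount").alias("revenue"))
)
monthly_revenue.show()
```

The same shape of query could equally be expressed in T-SQL against the Fabric Warehouse or surfaced as a measure in a Power BI semantic model.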
…the design and implementation of a next-generation data platform within a secure government environment. The role focuses on architecting, migrating, and optimising data platforms using Databricks on AWS (Azure experience also valued). You'll work closely with enterprise architecture, data engineering, and cloud infrastructure teams to design and deliver a scalable, compliant, and high-performing data ecosystem … that supports advanced analytics and AI workloads.
Key Responsibilities:
• Lead the architecture, design and implementation of Databricks-based data platforms in cloud environments (AWS or Azure).
• Define migration strategies from legacy data environments to Databricks.
• Design data ingestion, transformation, and orchestration workflows to meet performance and security standards.
• Provide technical leadership on data lake, data warehouse, and …
• Contribute to architecture documentation, patterns, and reference models for reuse across the programme.
Essential Skills & Experience:
• Proven experience as a Data/Solution Architect specialising in Databricks (AWS preferred; Azure considered).
• Demonstrable experience in end-to-end implementation or migration to Databricks.
• Deep understanding of cloud-native data architectures, particularly in AWS (S3, Glue, EMR, Lambda) or Azure …
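As one possible shape for the ingestion workflows mentioned above, here is a short Databricks-style sketch that uses Auto Loader to land raw files into a bronze table. It assumes a Databricks runtime (where spark is predefined and the cloudFiles source is available); the bucket, checkpoint paths and table name are hypothetical.

```python
# Runs inside a Databricks notebook, where `spark` is provided by the runtime
raw_stream = (
    spark.readStream
    .format("cloudFiles")                       # Databricks Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/checkpoints/events_schema")
    .load("s3://example-bucket/raw/events/")    # hypothetical landing location
)

(
    raw_stream.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .trigger(availableNow=True)                 # process available files, then stop
    .toTable("bronze.events")                   # hypothetical bronze table
)
```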
…seeking a BI Fabric Developer/Engineer to help transition its data platforms into the Microsoft Fabric workspace. This is a unique opportunity to work on cutting-edge BI and analytics solutions that drive strategic decision-making across the business.
Key Responsibilities:
• Lead the migration of data platforms into Microsoft Fabric for enhanced reporting and stakeholder access.
• Design and maintain … data pipelines using Fabric Pipelines, Azure Data Factory, Notebooks and SSIS.
• Develop Power BI dashboards and paginated reports tailored to business needs.
• Model complex datasets into actionable insights using star schemas, snowflakes, and denormalised models.
• Optimise SQL queries and build semantic models within Fabric.
• Collaborate with cross-functional teams to translate business requirements into technical solutions.
• Support CI/… CD deployment processes using Azure DevOps.
Required Skills & Experience:
• Strong expertise in Power BI, DAX, SQL, and data architecture.
• Proven experience with Azure Data Factory, Fabric Lakehouse/Warehouse, and Synapse.
• Proficiency in PySpark, T-SQL, and data transformation using Notebooks.
• Familiarity with DevOps, ARM/Bicep templates, and semantic modelling.
• Experience delivering enterprise-scale BI/data …
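As a sketch of the notebook-based transformation work this role involves, the PySpark snippet below cleanses a hypothetical customer extract (trimming text, casting types, de-duplicating on the business key) before writing it to a curated table consumed by downstream semantic models; all paths, table and column names are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-cleanse").getOrCreate()

# Hypothetical raw extract landed by a Fabric / Data Factory pipeline
raw = spark.read.parquet("/mnt/raw/customers")

cleansed = (
    raw
    .withColumn("customer_name", F.trim(F.col("customer_name")))
    .withColumn("signup_date", F.to_date("signup_date", "yyyy-MM-dd"))
    .withColumn("credit_limit", F.col("credit_limit").cast("decimal(18,2)"))
    .dropDuplicates(["customer_id"])            # de-duplicate on the business key
)

# Overwrite the curated table used by downstream reporting
cleansed.write.mode("overwrite").saveAsTable("lakehouse.dim_customer")
```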
London, South East, England, United Kingdom Hybrid / WFH Options
Certain Advantage
…organisation on a key transformation programme within their trading and supply division. This is an exciting opportunity to play a pivotal role in building modern, scalable data solutions using Azure cloud technologies.
The Role: As a Senior Data Engineer, you’ll be responsible for designing and developing robust data foundations and end-to-end solutions that drive value across … data engineering best practices are consistently applied.
Key Responsibilities:
• Design and build data solutions aligned with business and IT strategy.
• Lead development of scalable data pipelines and models using Azure and Databricks.
• Support data foundation initiatives and ensure effective rollout across business units.
• Act as a bridge between technical and non-technical stakeholders, presenting insights clearly.
• Oversee change management … management, and data quality improvement.
• Contribute to best practice sharing and community-building initiatives within the data engineering space.
Required Skills & Experience:
• Cloud Platforms: Strong expertise in AWS/Azure/SAP
• ETL/ELT Pipelines: Advanced proficiency
• Data Modelling: Expert level
• Data Integration & Ingestion: Skilled
• Databricks, SQL, Synapse, Data Factory and related Azure services
• Version Control …
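For the ELT pipeline work described above, one common pattern is an incremental upsert from a staging extract into a curated table. The sketch below uses the Delta Lake Python API and assumes a Spark session configured with Delta (for example a Databricks cluster); the table names, path and trade_id key are hypothetical.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt-upsert").getOrCreate()

# Hypothetical incremental extract produced by an upstream pipeline run
updates = spark.read.parquet("/mnt/staging/trades_delta")

# Hypothetical curated Delta table maintained by the ELT process
target = DeltaTable.forName(spark, "silver.trades")

# Merge new and changed rows on the business key
(
    target.alias("t")
    .merge(updates.alias("s"), "t.trade_id = s.trade_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```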