London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
governance, and supporting analytics teams with high-quality, reliable data. Key Responsibilities: Develop and maintain scalable data pipelines using Azure Data Factory, Synapse, Databricks, and Microsoft Fabric. Build efficient ETL/ELT processes and data models to support analytics, reporting, and dashboards. Optimise existing pipelines for performance, reliability, and cost efficiency. Implement best practices for data quality, error handling, automation More ❯
standards, models, and frameworks. Design data solutions leveraging Azure services such as Azure Data Lake, Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, and Azure Databricks. Data Integration & ETL Develop and optimize data pipelines for ingestion, transformation, and storage using Azure Data Factory and Databricks. Governance & Security Implement data governance, security, and compliance practices aligned with financial services regulations More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
design, build, and optimise scalable data pipelines and lakehouse architectures on Azure, enabling advanced analytics and data-driven decision making across the business. Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. Build and optimise data lakehouse architectures on Azure Data Lake Storage (ADLS). Develop high-performance data solutions using Azure More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Akkodis
WFH. Duration: 3 months rolling contract. Type of contract: Freelance, Inside IR35. Level: Mid-Senior. Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues. Requirements: Proven More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
and external sources, ensuring data quality, governance, and integrity.* Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management.* Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability.* Troubleshoot and resolve data pipeline issues swiftly and effectively across the data platform.* Work with orchestration tools such as Airflow, ADF, or More ❯
. Significant experience with Qlik Sense and Power BI for dashboards and visualisations. Proven ability to develop insights for diverse audiences, including senior leadership. Strong understanding of data warehousing, ETL processes , and cloud platforms (AWS & Azure). Excellent communication skills to explain technical concepts to non-technical stakeholders. Interested? Apply now to join a team that's unlocking the true More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets Key skills and responsibilities: Design, build, and maintain scalable ETL pipelines for ingesting, transforming, and loading data from APIs, databases, and financial data sources into Azure Databricks. Optimize pipelines for performance, reliability, and cost, incorporating data quality checks. Develop complex More ❯
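Several of the adverts above ask for pipelines that incorporate data quality checks. As a minimal, hypothetical sketch of that pattern (plain Python rather than the Databricks/Spark stack these roles actually use, with illustrative field names and rules), a pipeline stage can validate rows and quarantine failures before loading:

```python
# Minimal data-quality gate for a pipeline stage: rows that fail any
# check are quarantined with a reason rather than loaded. The field
# names ("id", "amount") and rules are illustrative assumptions, not
# taken from any specific role above.

def check_row(row):
    """Return a list of failed-check names for one record."""
    failures = []
    if row.get("id") is None:
        failures.append("missing_id")
    if not isinstance(row.get("amount"), (int, float)):
        failures.append("non_numeric_amount")
    elif row["amount"] < 0:
        failures.append("negative_amount")
    return failures

def run_quality_gate(rows):
    """Split rows into (clean, quarantined-with-reasons)."""
    clean, quarantined = [], []
    for row in rows:
        failures = check_row(row)
        if failures:
            quarantined.append((row, failures))
        else:
            clean.append(row)
    return clean, quarantined

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2.0},
]
clean, quarantined = run_quality_gate(rows)
```

In a real Databricks pipeline the same split would typically be expressed as DataFrame filters, with the quarantined rows written to a separate table for investigation.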
London, South East, England, United Kingdom Hybrid/Remote Options
Sanderson
a Fabric Data/BI Engineer to help create and drive analytics solutions. BI/Data Engineer, key skills: Microsoft Fabric experience Proven data engineering experience - setting up complex ETL processes with ADF pipelines Data visualisation and reporting using PowerBI Data modelling experience - conceptual, logical and physical Proficient in SQL Extensive experience with Microsoft cloud stack Data Engineer, BI Engineer More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
based data processing). Strong knowledge of Azure Data Platform (Data Lake, Synapse, etc.). Proficiency in Python and SQL for data engineering tasks. Understanding of data architecture and ETL processes. Ability to work independently in a remote environment. Nice-to-Have Experience with CI/CD pipelines for data solutions. Familiarity with Delta Lake and ML pipelines. Start More ❯
to ensure cost-efficient, high-performing workloads. Implement data quality, lineage tracking, and access control policies aligned with Databricks Unity Catalogue and governance best practices. Develop PySpark applications for ETL, data transformation, and analytics, following modular and reusable design principles. Create and manage Delta Lake tables with ACID compliance, schema evolution, and time travel for versioned data management. Integrate Databricks More ❯
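The listing above calls for PySpark ETL "following modular and reusable design principles". As an illustration of that principle only, and using plain Python lists of dicts as a stand-in for Spark DataFrames, each transform can be a small pure function and a pipeline simply their composition (in PySpark the equivalent is chaining `df.transform(step)` calls):

```python
# Sketch of the modular-transform pattern: small, single-purpose steps
# composed into a pipeline. The step names and the VAT example are
# hypothetical illustrations, not the employer's actual logic.
from functools import reduce

def drop_nulls(rows):
    """Remove records with any missing value."""
    return [r for r in rows if all(v is not None for v in r.values())]

def add_gross(rows, rate=0.2):
    """Derive a gross amount from a net amount (assumed 20% rate)."""
    return [{**r, "gross": round(r["net"] * (1 + rate), 2)} for r in rows]

def pipeline(rows, steps):
    """Apply each transform step in order."""
    return reduce(lambda acc, step: step(acc), steps, rows)

result = pipeline(
    [{"net": 100.0}, {"net": None}, {"net": 50.0}],
    [drop_nulls, add_gross],
)
```

Because each step is independent, steps can be unit-tested in isolation and reused across pipelines, which is the point the advert is making.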
London, South East, England, United Kingdom Hybrid/Remote Options
Next Best Move
in Microsoft Office skills, including Outlook, Word, Excel and PowerPoint and other applications such as Microsoft 365, SharePoint, Teams and OneDrive. Power Apps and Power Automate experience. Experience in ETL tools, such as SSIS. Business Central experience. Good knowledge and Development experience of MS SQL. Experience in creating UAT scripts. Technical writing experience with the ability to present technical information More ❯
including designs, plans and end-user artefacts Desired Exposure to Local Government, public sector, or other regulated environments Understanding of governance, change control, and assurance processes Knowledge of SSIS, ETL processes, or wider Microsoft data/Power Platform tooling This assignment forms part of a wider modernisation programme aimed at replacing legacy systems with secure, maintainable environments. You will be More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Executive Facilities
in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and Generative AI experience More ❯
and maintain robust data pipelines, ensuring data integrity, scalability, and performance across multiple sources. Data Integration: Work with structured and unstructured data from various internal and external systems, applying ETL best practices. Collaboration: Partner with analysts, data scientists, and business teams to understand requirements and deliver high-quality solutions. Performance Optimisation: Monitor and tune Tableau Server performance, ensuring efficient query … compliance with Home Office security protocols. Essential Skills & Experience Proven experience in Tableau development (Desktop and Server) with strong visualisation and storytelling skills. Solid background in data engineering, including ETL processes and data pipeline development. Proficiency in SQL and experience with relational databases (e.g., Oracle, PostgreSQL, SQL Server). Familiarity with cloud platforms (AWS, Azure, or GCP) and data warehousing More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
ideal candidate will have a strong background in designing, building, and maintaining scalable data pipelines in the Azure cloud environment, with hands-on experience in modern data platforms and ETL frameworks. Key Responsibilities: Design, develop, and maintain data pipelines and data integration solutions using Azure Data Factory, Synapse Analytics, and related services. Build and optimise data models to support reporting … hands-on experience as a Data Engineer working in Azure cloud environments. Strong proficiency in Azure Data Factory, Synapse Analytics, Databricks, and Azure SQL. Solid understanding of data modeling, ETL/ELT design, and data warehousing principles. Proficiency in SQL and Python for data transformation and automation. Experience working with version control (e.g., Git) and CI/CD for data More ❯
knowledge of its structure and capabilities SQL - knowledge/experience of writing scripts, queries, DML & DDL PL/SQL - coding experience for stand-alone scripts, procedures, packages and triggers ETL/Data Migration - experience of data migration projects and working with large data sets Data load Utilities - SQL*Loader plus use of Database Links Performance Tuning - code analysis and execution More ❯
Pay Rates: £500/d - £525/d Contract length: 6 months Base Location: London - 3/d a week Skills required: - Azure/Azure Databricks - ADF/ETL - PySpark/Scala Experience: Experience working on large data sets, and complex data pipelines. Understanding of Data Architecture and Design, and Data pipeline optimisation. Proven expertise with Databricks, including More ❯
Employment Type: Contractor
Rate: £500 - £525 per day, Negotiable, Inc benefits
management skills Proficiency with Jira, DevOps, or similar tools for backlog management and refinement High-level understanding of data concepts, data architecture, and the importance of data integrity SQL & ETL Knowledge Strong organisational and planning capabilities with the ability to pragmatically prioritise work based on business value and interdependencies Experience delivering within an Agile environment #analyst #technicalba #dataanalyst #investmentmanagement #jira More ❯
develop, and manage scalable data solutions on the Microsoft Azure platform. You’ll play a key role in architecting and implementing robust data integration solutions, as well as building ETL pipelines to support data ingestion, transformation, and loading. The Role: As an Azure Data Engineer, you’ll collaborate with cross-functional teams in an agile environment — contributing to sprint planning … as you’ll often translate complex data concepts into clear insights for non-technical stakeholders. Key Responsibilities: Architect, design, and implement scalable Azure-based data solutions. Develop and maintain ETL processes for data ingestion, transformation, and loading. Ensure data governance, integrity, and quality throughout the data lifecycle. Implement robust data security, compliance, and privacy standards. Document data architectures, data flows … practices and modern engineering standards. Key Skills & Experience: Proven background as a Data Engineer or Data Architect working with Microsoft Azure. Strong expertise in data modelling, data warehousing, and ETL development. Hands-on experience with Azure Data Factory, Azure Data Lake, and Azure SQL Database. Exposure to big data technologies such as Hadoop, Spark, and Databricks. Experience with Azure Synapse More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
EMR and S3 to support data ingestion, transformation and storage Work closely with data analysts, architects and business stakeholders to translate requirements into robust technical solutions Implement and optimise ETL/ELT processes, ensuring data integrity, consistency and quality across multiple sources Apply best practices in data modelling, version control, and CI/CD to deliver maintainable and reusable code … services Strong programming skills in Python (including libraries such as PySpark or Pandas) Solid understanding of data modelling, warehousing and architecture design within cloud environments Experience building and managing ETL/ELT workflows and data pipelines at scale Proficiency with SQL and working knowledge of relational and non-relational databases Experience deploying data infrastructure using IaC tools such as Terraform More ❯
Strong expertise with MSSQL Server (schema design, tuning, indexing, profiling). Advanced SQL and dimensional data modelling (SCDs, fact/dim, conformed dimensions). Experience with PostgreSQL optimisation. Advanced Python skills. ETL/ELT Pipelines: Hands-on experience building pipelines using SSIS, dbt, Airflow, or similar. Strong understanding of enterprise ETL frameworks, lineage, and data quality. Cloud & Infrastructure: Experience designing and supporting … build, and maintain OLAP, Tabular, and Multidimensional models used across the business. Develop semantic models and robust data structures for Power BI and Excel cube connectivity. Create and optimise ETL/ELT pipelines integrating data from S3 and diverse source systems. Administer and tune MSSQL Server and PostgreSQL for high performance and reliability. Ensure model scalability, accuracy, consistency, and rapid More ❯
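The listing above asks for dimensional-modelling experience including slowly changing dimensions. As a hedged, language-agnostic sketch of Type-2 SCD handling (in plain Python over a list of dicts; a warehouse would do this with a SQL MERGE), a changed attribute set end-dates the current row and opens a new one, preserving history:

```python
# Hypothetical Type-2 slowly-changing-dimension update. The table
# layout (key, attrs, valid_from, valid_to, is_current) is an
# illustrative convention, not any specific warehouse schema.
from datetime import date

def apply_scd2(dimension, key, new_attrs, today):
    """If attributes changed, close the current row and append a new
    current row; if unchanged, leave the table untouched."""
    for row in dimension:
        if row["key"] == key and row["is_current"]:
            if row["attrs"] == new_attrs:
                return dimension  # no change, keep current row open
            row["is_current"] = False
            row["valid_to"] = today
            break
    dimension.append({
        "key": key,
        "attrs": new_attrs,
        "valid_from": today,
        "valid_to": None,
        "is_current": True,
    })
    return dimension

dim = [{"key": "C1", "attrs": {"city": "London"},
        "valid_from": date(2023, 1, 1), "valid_to": None,
        "is_current": True}]
dim = apply_scd2(dim, "C1", {"city": "Leeds"}, date(2024, 6, 1))
```

Fact tables then join on the key plus a date range (or a surrogate key per version), which is what makes point-in-time reporting possible.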
deliver high-quality solutions in a fast-paced environment. Key Responsibilities Design and implement scalable data pipelines using Azure Data Factory, Databricks, and Synapse Analytics. Develop and optimise ETL processes for structured and semi-structured data. Work with SQL and Python for data transformation and modelling. Integrate data from multiple sources, ensuring accuracy, consistency, and performance. Collaborate with stakeholders … in enterprise environments. Strong hands-on expertise with Azure Data Factory, Databricks, Synapse, and Azure Data Lake. Proficiency in SQL, Python, and PySpark. Experience with data modelling, ETL optimisation, and cloud migration projects. Familiarity with Agile delivery and CI/CD pipelines. Excellent communication skills for working with technical and non-technical teams. Interested? Apply now or More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
as a skilled Salesforce Data Cloud specialist. Delivered two successful end-to-end Salesforce Data Cloud implementations. Strong expertise in designing scalable enterprise-level data architecture solutions. Experienced in ETL tools, data migration, and data cleansing practices. Proficient in writing and optimizing moderate to advanced SQL queries. Preferably a Salesforce Data Cloud Consultant certification holder. What to do next If More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
operations. Your expertise will be crucial as we gear up for an exciting data warehouse migration from New York to London in 2026! Key Responsibilities Analyse and optimise SSIS ETL pipelines and batch jobs. Improve SQL performance through effective indexing and execution plans. Identify and resolve locking and blocking issues to enhance efficiency. Apply best practices to boost overall warehouse … is essential. Solid experience with SSAS and SSRS tools. Deep understanding of execution plans and performance tuning techniques. Strong troubleshooting and problem-solving skills. Proven history of improving complex ETL environments. Nice to Have Experience with C# for SSIS scripts. Proficiency in Python. Exposure to Power BI for data visualisation. Why Join Us? Be part of a vibrant team that More ❯
support a major government programme delivering secure, scalable data solutions. Key Responsibilities Design and implement data pipelines on AWS using services such as Glue, Lambda, S3, and Redshift. Develop ETL processes and optimise data workflows for performance and security. Collaborate with analysts and architects to ensure compliance with government security standards. Troubleshoot and resolve issues in complex cloud environments. Essential … Skills Strong experience with AWS services (Glue, Lambda, S3, Redshift, IAM). Proficiency in Python and SQL for data engineering tasks. Knowledge of data modelling, ETL frameworks, and best practices. Familiarity with security and compliance in government or regulated environments. Excellent communication and problem-solving skills. Active SC clearance (mandatory). Desirable Experience with Terraform or CloudFormation. Exposure to CI More ❯