Core Responsibilities: 1. Data Engineering & Architecture Design and implement end-to-end data pipelines using Azure Data Factory, Synapse Analytics, and Databricks (or Microsoft Fabric where applicable). Develop ETL/ELT workflows to integrate and transform data from diverse sources. Create and maintain data models to support reporting, dashboards, and analytical use cases. 2. Performance & Reliability Audit and enhance …
Main duties of the job To deliver and maintain data transfer pipelines between clinical/operational source systems and the Christie Central Data Repository (CCDR). Ensure that all extract, transform and load (ETL) processes are robust, operate efficiently and are fully documented. Learn and assist in the development of new tools, technologies and methods of working as they are identified … appropriate data warehousing design and software development methodologies agreed for use within the team. Produce detailed technical specifications and documentation for the support, maintenance, monitoring and quality assurance of ETL processes. Investigate, support, correct and prevent issues relating to existing Trust-developed data reporting systems and processes, identifying issues and implementing resolutions in a timely manner. Participate in the sprint … and deployment of Business Intelligence functions in a large/complex organisation. Substantial experience of SQL Server database/data warehouse development, maintenance and support. Substantial experience of handling ETL solutions and issues across a range of systems. Substantial experience of delivering complex data and reporting projects. Experience in task resource estimation for sprint planning. Experience in creating and updating …
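As an illustration of the kind of monitored extract-and-load step this role describes, here is a minimal sketch using pyodbc against SQL Server. The connection strings, schemas, and table names are hypothetical, not the Trust's actual systems:

```python
import logging
import pyodbc

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

# Hypothetical connection strings; real ones would come from secured config.
SRC = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src;DATABASE=Clinical;Trusted_Connection=yes"
DST = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dw;DATABASE=CCDR;Trusted_Connection=yes"

def run_extract_load(batch_size: int = 1000) -> None:
    """Copy rows from a source table into a staging table, logging progress."""
    src = pyodbc.connect(SRC)
    dst = pyodbc.connect(DST)
    try:
        src_cur = src.cursor()
        dst_cur = dst.cursor()
        src_cur.execute("SELECT id, recorded_at, value FROM dbo.Observations")
        total = 0
        while True:
            rows = src_cur.fetchmany(batch_size)
            if not rows:
                break
            dst_cur.executemany(
                "INSERT INTO staging.Observations (id, recorded_at, value) VALUES (?, ?, ?)",
                rows,
            )
            total += len(rows)
        dst.commit()
        log.info("Loaded %d rows into staging.Observations", total)
    finally:
        src.close()
        dst.close()

if __name__ == "__main__":
    run_extract_load()
```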
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
governance, and supporting analytics teams with high-quality, reliable data. Key Responsibilities: Develop and maintain scalable data pipelines using Azure Data Factory, Synapse, Databricks, and Microsoft Fabric. Build efficient ETL/ELT processes and data models to support analytics, reporting, and dashboards. Optimise existing pipelines for performance, reliability, and cost efficiency. Implement best practices for data quality, error handling, automation …
with the Agile. Good English listening and speaking skills for communicating requirements and development tasks/issues. Hands-on experience with lakehouses, dataflows, pipelines, and semantic models. Ability to build ETL workflows. Familiarity with time-series data, market feeds, transactional records, and risk metrics. Familiarity with Git, DevOps pipelines, and automated deployment. Strong communication skills with a collaborative mindset to work …
The goal is to ensure seamless data flow, accuracy, and scalability within the Salesforce environment. Key Responsibilities Design, develop, and maintain data pipelines using Databricks and Apache Spark. Implement ETL/ELT processes for structured and unstructured data. Optimise data workflows for performance and scalability. Collaborate with data scientists and analysts to enable advanced analytics and ML models. Ensure data …
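For illustration, a minimal PySpark sketch of the extract-transform-load pattern this listing asks for, handling one structured and one semi-structured source; the paths and table names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("salesforce-etl").getOrCreate()

# Extract: a structured CSV export alongside raw JSON event records.
accounts = spark.read.option("header", True).csv("/mnt/raw/salesforce/accounts.csv")
events = spark.read.json("/mnt/raw/salesforce/events/")

# Transform: standardise types, deduplicate, and join the two sources.
accounts_clean = (
    accounts.withColumn("created_date", F.to_date("created_date"))
            .dropDuplicates(["account_id"])
)
enriched = events.join(accounts_clean, on="account_id", how="left")

# Load: persist the curated result as a Delta table for analysts and ML models.
enriched.write.format("delta").mode("overwrite").saveAsTable("curated.account_events")
```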
standards, models, and frameworks. Design data solutions leveraging Azure services such as Azure Data Lake, Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, and Azure Databricks. Data Integration & ETL Develop and optimize data pipelines for ingestion, transformation, and storage using Azure Data Factory and Databricks. Governance & Security Implement data governance, security, and compliance practices aligned with financial services regulations …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
design, build, and optimise scalable data pipelines and lakehouse architectures on Azure, enabling advanced analytics and data-driven decision making across the business. Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. Build and optimise data lakehouse architectures on Azure Data Lake Storage (ADLS). Develop high-performance data solutions using Azure …
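A minimal sketch of what writing to a lakehouse layer on ADLS with PySpark and Delta Lake can look like; the storage account, container, and column names below are assumptions for illustration:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical ADLS Gen2 paths for two medallion layers.
bronze = "abfss://lake@examplestorage.dfs.core.windows.net/bronze/orders"
silver = "abfss://lake@examplestorage.dfs.core.windows.net/silver/orders"

# Read raw JSON from bronze, drop records missing the business key, and
# write a partitioned Delta table to silver for fast date-range queries.
df = spark.read.json(bronze)
(
    df.dropna(subset=["order_id"])
      .write.format("delta")
      .mode("append")
      .partitionBy("order_date")
      .save(silver)
)
```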
London, South East, England, United Kingdom Hybrid/Remote Options
Akkodis
WFH. Duration: 3 months rolling contract. Type of contract: Freelance, Inside IR35. Level: Mid-Senior. Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues. Requirements: Proven …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
and external sources, ensuring data quality, governance, and integrity. Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management. Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability. Troubleshoot and resolve data pipeline issues swiftly and effectively across the data platform. Work with orchestration tools such as Airflow, ADF, or …
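As a sketch of the orchestration side of such a role, a minimal Airflow DAG that loads raw data into Snowflake and then runs DBT models and tests; the scripts, paths, and schedule are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load fresh data into the Snowflake raw schema (hypothetical script).
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_raw_to_snowflake.py",
    )
    # Build the DBT models, then validate them with dbt's built-in tests.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )

    load_raw >> dbt_run >> dbt_test
```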
Java exposure beneficial. Delta Lake/Delta table optimisation experience. Git/GitLab, CI/CD pipelines, DevOps practices. Strong troubleshooting and problem-solving ability. Experience with lakehouse architectures, ETL workflows, and distributed computing. Familiarity with time-series, market data, transactional data or risk metrics. Nice to Have: Power BI dataset preparation. OneLake, Azure Data Lake, Kubernetes, Docker. Knowledge of …
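The Delta table optimisation mentioned here typically means compacting small files and clustering data; a short sketch using Databricks Delta SQL commands, with illustrative table and column names:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows on commonly filtered columns.
# OPTIMIZE ... ZORDER BY is a Databricks Delta Lake feature.
spark.sql("OPTIMIZE trades.executions ZORDER BY (trade_date, instrument_id)")

# Remove data files no longer referenced by the table (default retention applies).
spark.sql("VACUUM trades.executions")
```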
on experience with Data Testing with either Python or PySpark. Strong hands-on experience with Python. Deep understanding of data test concepts, primarily focused on ETL. Hands-on with ETL/DWH testing and SQL (any RDBMS). Hands-on experience with any DB (preferably Oracle), including the ability to read/understand/alter stored procedures. Strong experience in BDD …
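To illustrate the kind of ETL-focused data testing described, a minimal pytest sketch with PySpark; the table paths and checks are hypothetical examples, not a specific framework:

```python
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()

def test_no_null_keys(spark):
    # Primary-key column must be fully populated after the load.
    df = spark.read.parquet("/tmp/staging/customers")
    assert df.filter(df["customer_id"].isNull()).count() == 0

def test_row_counts_reconcile(spark):
    # Classic ETL reconciliation: target row count must match the source.
    source = spark.read.parquet("/tmp/source/customers")
    target = spark.read.parquet("/tmp/staging/customers")
    assert source.count() == target.count()
```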
Familiarity with GitLab, unit testing, and CI/CD pipelines. Strong troubleshooting ability and experience working in Agile environments. Excellent communication skills with stakeholder-facing experience. Practical experience building ETL workflows, lakehouse architectures, dataflows, and semantic models. Exposure to time-series data, financial market feeds, transactional records, and risk-related datasets …
. Significant experience with Qlik Sense and Power BI for dashboards and visualisations. Proven ability to develop insights for diverse audiences, including senior leadership. Strong understanding of data warehousing, ETL processes, and cloud platforms (AWS & Azure). Excellent communication skills to explain technical concepts to non-technical stakeholders. Interested? Apply now to join a team that's unlocking the true …
Wellington, Shropshire, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Skills & Experience Strong expertise in SAS 9.4 (DI) and SAS Viya 3.x (SAS Studio, VA, VI). Familiarity with Platform LSF, Jira, and Git. Hands-on experience with ETL tools: Pentaho, Talend. Data virtualization experience with Denodo. Proficiency in SQL and data modeling. Knowledge of Oracle (nice to have). Solid understanding of Agile/Scrum frameworks.
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key skills and responsibilities: Design, build, and maintain scalable ETL pipelines for ingesting, transforming, and loading data from APIs, databases, and financial data sources into Azure Databricks. Optimize pipelines for performance, reliability, and cost, incorporating data quality checks. Develop complex …
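A minimal sketch of API ingestion with inline data quality checks of the sort this listing mentions; the endpoint, fields, and rules are assumptions for illustration:

```python
import requests
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical endpoint; a real feed would need auth, paging, and retries.
resp = requests.get("https://api.example.com/v1/prices", timeout=30)
resp.raise_for_status()
records = resp.json()

df = spark.createDataFrame([Row(**r) for r in records])

# Quality gates before anything lands in the lake.
assert df.count() > 0, "empty payload from source API"
assert df.filter(df["price"] < 0).count() == 0, "negative prices rejected"

df.write.format("delta").mode("append").saveAsTable("bronze.prices")
```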
metadata management, and data governance. Experience with modern data platforms such as Azure Data Lake, Databricks, Power BI, and SAP BTP. Solid grasp of enterprise integration patterns (APIs, streaming, ETL/ELT, event-driven architectures). Ability to translate complex data concepts into clear, value-focused business outcomes. Excellent stakeholder management and communication skills across technical and non-technical audiences.
caching. Run performance tests and ensure workloads are cost-efficient and reliable. Apply data quality, lineage, and access controls using Unity Catalog and governance best practices. Develop reusable PySpark ETL and data transformation code. Manage Delta Lake tables with ACID transactions, schema evolution, and time-travel capabilities. Integrate Databricks with Azure ADLS, Key Vault, Azure Functions, and related services. Work …
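Two of the Delta Lake capabilities named here, schema evolution and time travel, in a short sketch; the paths and date are illustrative assumptions:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
path = "abfss://lake@examplestorage.dfs.core.windows.net/silver/positions"  # hypothetical

# Schema evolution: let new upstream columns merge into the table schema.
new_batch = spark.read.parquet("/mnt/landing/positions")
(
    new_batch.write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .save(path)
)

# Time travel: read the table as it stood at an earlier point for audit/debugging.
snapshot = (
    spark.read.format("delta")
    .option("timestampAsOf", "2024-06-01")
    .load(path)
)
snapshot.show()
```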
Wellington, Shropshire, United Kingdom Hybrid/Remote Options
Experis
cross-functional teams to deliver robust BI solutions using Microsoft technologies. Technical Expertise: Extensive experience with Microsoft SQL Server and related database technologies for data extraction, transformation, and loading (ETL). Solution Development: Skilled in designing, developing, and implementing business intelligence solutions, including dashboards and reports using Power BI and SSRS. Data Integration: Proficiency in integrating data from multiple sources …
Ability to write Spark code for large-scale data processing, including RDDs, DataFrames, and Spark SQL. Hands-on experience with lakehouses, dataflows, pipelines, and semantic models. Ability to build ETL workflows. Familiarity with time-series data, market feeds, transactional records, and risk metrics. Familiarity with Git, DevOps pipelines, and automated deployment. Strong communication skills with a collaborative mindset to work …
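For illustration, the three Spark APIs this listing names (RDDs, DataFrames, Spark SQL) performing the same small aggregation; the tickers are made-up sample data:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-basics").getOrCreate()

# RDD API: low-level key-value transformation.
rdd = spark.sparkContext.parallelize([("AAPL", 189.0), ("MSFT", 410.5), ("AAPL", 190.2)])
print(rdd.reduceByKey(max).collect())

# DataFrame API: the same aggregation with a declarative, optimised plan.
df = spark.createDataFrame(rdd, ["ticker", "price"])
df.groupBy("ticker").max("price").show()

# Spark SQL: register a view and express the query in plain SQL.
df.createOrReplaceTempView("quotes")
spark.sql("SELECT ticker, MAX(price) AS max_price FROM quotes GROUP BY ticker").show()
```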
/R) and Spotfire APIs. Working knowledge of Power BI report development and differences between Spotfire and Power BI capabilities. Proficient in SQL, data integration (flat files, APIs, databases), ETL logic interpretation. Understanding of functional and visual parity considerations between BI tools. Strong analytical, debugging, and communication skills to interface with stakeholders and migration engineers. The Role Act as the technical …
Central London, London, England, United Kingdom Hybrid/Remote Options
E-Solutions IT Services UK Ltd
methodologies (Kimball, dimensional modelling, Data Vault 2.0) Experience in Azure – one or more of the following: Data Factory, Databricks, Synapse Analytics, ADLS Gen2 Experience in building robust and performant ETL processes Build and maintain Analysis Services databases and cubes (both multidimensional and tabular) Experience in using source control & ADO Understanding and experience of deployment pipelines Excellent analytical and problem-solving …
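As a sketch of the Kimball-style dimensional modelling mentioned, a fact-table load that swaps natural keys for dimension surrogate keys; all table and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Staged transactions plus two conformed dimensions (illustrative names).
sales = spark.table("staging.sales")
dim_customer = spark.table("dw.dim_customer")
dim_date = spark.table("dw.dim_date")

# Resolve surrogate keys, then keep only surrogates and measures in the fact.
fact_sales = (
    sales
    .join(dim_customer, on="customer_natural_key", how="inner")
    .join(dim_date, sales["sale_date"] == dim_date["calendar_date"], how="inner")
    .select(
        "customer_sk",                         # surrogate key from dim_customer
        "date_sk",                             # surrogate key from dim_date
        F.col("amount").alias("sale_amount"),  # additive measure
    )
)
fact_sales.write.format("delta").mode("append").saveAsTable("dw.fact_sales")
```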