data storage systems, including databases, data warehouses, and data lakes, to efficiently handle large volumes of structured and unstructured data. Develop and maintain scalable ETL pipelines to extract, transform, and load data from various sources into the target data systems. Create and manage data schemas, data models, and data dictionaries to facilitate data governance and ensure data consistency and standardization. … Experience handling large datasets and complex data pipelines. Experience with big data processing frameworks and technologies. Experience with data modelling and designing efficient data structures. Experience with data integration and ETL (Extract, Transform, Load) processes. Experience in data cleansing, validation, and enrichment processes. Strong programming skills in languages such as Python, Java, or Scala. Knowledge of data warehousing concepts and … dimensional modelling. Understanding of data security, privacy, and compliance requirements. Proficiency in data integration and ETL tools. Strong analytical skills and the ability to understand complex data structures. Capable of identifying data quality issues, troubleshooting problems, and implementing effective solutions. Experience using Microsoft Azure. Software engineering, specifically Python. Disclosure and Barring Service Check: this post is subject to the Rehabilitation …
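The extract-transform-load responsibility described above can be sketched minimally in Python. This is an illustrative example only: the source rows, table name, and schema are invented, not taken from any employer's actual stack.

```python
import sqlite3

def extract():
    # Stand-in for reading from source systems (APIs, files, databases).
    return [
        {"id": 1, "name": " Alice ", "spend": "120.50"},
        {"id": 2, "name": "Bob", "spend": "80"},
    ]

def transform(rows):
    # Cleanse and standardise: trim whitespace, cast strings to numbers.
    return [
        {"id": r["id"], "name": r["name"].strip(), "spend": float(r["spend"])}
        for r in rows
    ]

def load(rows, conn):
    # Load the cleaned rows into a target table, idempotently.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers "
        "(id INTEGER PRIMARY KEY, name TEXT, spend REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO customers (id, name, spend) "
        "VALUES (:id, :name, :spend)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT SUM(spend) FROM customers").fetchone()[0])  # 200.5
```

Real pipelines add incremental extraction, error handling, and scheduling on top of this same three-stage shape.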
AWS, Azure, GCP) Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks) Expertise with multiple data analytics tools (e.g. Power BI) Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling Proficiency in advanced programming languages (Python/PySpark, SQL) Experience in data pipeline orchestration (e.g. Airflow, Data Factory) Familiarity with DevOps and CI/CD …
work as part of a collaborative team to solve problems and assist other colleagues. - Ability to learn new technologies, programs and procedures. Technical Essentials: - Expertise across data warehouse and ETL/ELT development, AWS preferred, with experience in the following: - Strong experience in some of the AWS services such as Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation …
Core Responsibilities: 1. Data Engineering & Architecture Design and implement end-to-end data pipelines using Azure Data Factory, Synapse Analytics, and Databricks (or Microsoft Fabric where applicable). Develop ETL/ELT workflows to integrate and transform data from diverse sources. Create and maintain data models to support reporting, dashboards, and analytical use cases. 2. Performance & Reliability Audit and enhance …
Main duties of the job To deliver and maintain data transfer pipelines between clinical/operational source systems and the Christie Central Data Repository (CCDR). Ensure that all extract, transform and load (ETL) processes are robust, operate efficiently and are fully documented. Learn and assist in the development of new tools, technologies and methods of working as they are identified … appropriate data warehousing design and software development methodologies agreed for use within the team. Produce detailed technical specifications and documentation for the support, maintenance, monitoring and quality assurance of ETL processes. Investigate, support, correct and prevent issues relating to existing Trust-developed data reporting systems and processes, identifying issues and implementing resolutions in a timely manner. Participate in the sprint … and deployment of Business Intelligence functions in a large/complex organisation. Substantial experience of SQL Server database/data warehouse development, maintenance and support. Substantial experience of handling ETL solutions and issues across a range of systems. Substantial experience of delivering complex data and reporting projects. Experience in task resource estimation for sprint planning. Experience in creating and updating …
Provide technical support and training to end-users on business intelligence tools. Identify opportunities to improve data processes and implement solutions. Integrate various data sources to provide comprehensive insights. ETL Process - Knowledge of Extract, Transform, Load (ETL) techniques for cleaning and consolidating data from different sources using SQL and other tools. Maintain documentation for all analytics tools and processes. Profile …/Developer - SAP CO-PA/FI & Power BI should have: Expert user of Power BI. SQL, DAX and Databases - Ability to query databases and structure financial data efficiently. ETL Process - Knowledge of Extract, Transform, Load (ETL) techniques for cleaning and consolidating data from different sources. Strong analytical and problem-solving skills. Experience in the industrial/manufacturing industry is …
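The "cleaning and consolidating data from different sources using SQL" skill named above can be illustrated with a small self-contained sketch. The source tables, column names, and values are invented for the example; only the pattern (trim, normalise, cast, union, aggregate) is the point.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Two hypothetical source extracts with inconsistent formatting.
conn.executescript("""
CREATE TABLE src_erp (cust TEXT, amount TEXT);
CREATE TABLE src_crm (cust TEXT, amount TEXT);
INSERT INTO src_erp VALUES ('ACME ', '100.0'), ('Globex', '50');
INSERT INTO src_crm VALUES (' acme', '25.5');
""")

# Transform and consolidate in SQL: trim whitespace, normalise case,
# cast text amounts to numbers, union the sources, and aggregate.
conn.executescript("""
CREATE TABLE sales AS
SELECT UPPER(TRIM(cust)) AS customer,
       SUM(CAST(amount AS REAL)) AS total
FROM (
    SELECT cust, amount FROM src_erp
    UNION ALL
    SELECT cust, amount FROM src_crm
)
GROUP BY UPPER(TRIM(cust));
""")

for row in conn.execute("SELECT customer, total FROM sales ORDER BY customer"):
    print(row)
# ('ACME', 125.5)
# ('GLOBEX', 50.0)
```

The same shape carries over to warehouse SQL dialects (T-SQL, Snowflake, etc.), where the cleaning logic usually lives in staging views rather than ad hoc scripts.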
Milton Keynes, Buckinghamshire, England, United Kingdom
Michael Page Technology
Provide technical support and training to end-users on business intelligence tools. Identify opportunities to improve data processes and implement solutions. Integrate various data sources to provide comprehensive insights. ETL Process - Knowledge of Extract, Transform, Load (ETL) techniques for cleaning and consolidating data from different sources using SQL and other tools. Maintain documentation for all analytics tools and processes. Profile … PA/FI & Power BI should have: Proficiency in SAP CO-PA, FI, and Power BI. SQL, DAX and Databases - Ability to query databases and structure financial data efficiently. ETL Process - Knowledge of Extract, Transform, Load (ETL) techniques for cleaning and consolidating data from different sources. Strong analytical and problem-solving skills. Experience in the industrial/manufacturing industry is …
Provide technical support and training to end-users on business intelligence tools. Identify opportunities to improve data processes and implement solutions. Integrate various data sources to provide comprehensive insights. ETL Process - Knowledge of Extract, Transform, Load (ETL) techniques for cleaning and consolidating data from different sources using SQL and other tools. Maintain documentation for all analytics tools and processes. Profile …/FI & Power BI should have: Proficiency in SAP SAC, CO-PA, FI, and Power BI. SQL, DAX and Databases - Ability to query databases and structure financial data efficiently. ETL Process - Knowledge of Extract, Transform, Load (ETL) techniques for cleaning and consolidating data from different sources. Strong analytical and problem-solving skills. Experience in the industrial/manufacturing industry is …
to define requirements and deliver solutions Requirements: Hands-on experience with Azure Data Factory, Databricks, SQL, Data Lake, Power BI Strong skills in T-SQL and DAX Experience in ETL development, data modelling, and data governance Familiarity with CI/CD pipelines and Azure DevOps Excellent communication and problem-solving skills
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
governance, and supporting analytics teams with high-quality, reliable data. Key Responsibilities: Develop and maintain scalable data pipelines using Azure Data Factory, Synapse, Databricks, and Microsoft Fabric. Build efficient ETL/ELT processes and data models to support analytics, reporting, and dashboards. Optimise existing pipelines for performance, reliability, and cost efficiency. Implement best practices for data quality, error handling, automation …
City of London, London, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
from the warehouse. Maintaining and improving the existing Python-based dynamic pricing model and web scraping tools. What we're looking for: Experience across a modern data stack for ETL/ELT processes and data warehousing. Strong SQL and Python skills, with an understanding of Kimball-style data modelling. Experience with dbt, Dagster, or Airflow for transformation and orchestration.
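The Kimball-style data modelling mentioned above splits raw records into deduplicated dimension tables (keyed by surrogate keys) and fact tables that reference them. A toy sketch under invented data, just to show the shape:

```python
# Toy Kimball-style star schema: raw order records become a deduplicated
# customer dimension plus a fact table of measures referencing it.
raw_orders = [
    {"customer": "Alice", "city": "Leeds", "amount": 30.0},
    {"customer": "Bob", "city": "York", "amount": 12.5},
    {"customer": "Alice", "city": "Leeds", "amount": 7.5},
]

def build_star(rows):
    dim_customer, fact_sales, keys = [], [], {}
    for r in rows:
        natural_key = (r["customer"], r["city"])
        if natural_key not in keys:
            # Assign a surrogate key the first time a customer appears.
            keys[natural_key] = len(dim_customer) + 1
            dim_customer.append({
                "customer_key": keys[natural_key],
                "customer": r["customer"],
                "city": r["city"],
            })
        fact_sales.append({
            "customer_key": keys[natural_key],
            "amount": r["amount"],
        })
    return dim_customer, fact_sales

dim_customer, fact_sales = build_star(raw_orders)
print(len(dim_customer), len(fact_sales))  # 2 3
```

In practice this lives as dbt SQL models rather than Python, but the split (descriptive attributes into dimensions, measures into facts joined by surrogate keys) is the same.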
engineering. Mandatory Skills: Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge etc. Industrial Data Integration: Familiarity with OT data schemas originating from OSIsoft PI, SCADA, MES, and Historian …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid/Remote Options
Fruition Group
lakehouse. Develop semantic models and dimensional structures optimised for BI, dashboarding, and machine learning. Ensure documentation, governance, and data consistency across domains. Collaborate with data engineers to support robust ETL/ELT pipelines and maintain end-to-end data lineage. Deploy analytics engineering solutions using dbt, SQL transformations, and CI/CD best practice. Partner with analysts and data scientists …
Greater Manchester, Lancashire, England, United Kingdom
Sagacity
and reliable flow of data, building foundational skills in a supportive environment. Key Responsibilities Data Pipeline Development • Contribute to the design, building, and maintenance of scalable data pipelines • Implement ETL (Extract, Transform, Load) processes to ingest data from various sources • Ensure data quality, integrity, and reliability through thorough testing and validation Database Management • Help manage and optimise data storage solutions …
Portsmouth, Hampshire, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
making. In this role, you will be responsible for: Building and managing data pipelines using Azure Data Factory and related services. Building and maintaining data lakes, data warehouses, and ETL/ELT processes. Designing scalable data solutions and models for reporting in Power BI. Supporting data migration from legacy systems into the new platform. Ensuring data models are optimised for …
with the Agile methodology. Good English listening and speaking skills for communicating requirements and development tasks/issues. Hands-on experience with lakehouses, dataflows, pipelines, and semantic models. Ability to build ETL workflows. Familiarity with time-series data, market feeds, transactional records, and risk metrics. Familiarity with Git, DevOps pipelines, and automated deployment. Strong communication skills with a collaborative mindset to work …
suited to someone who's confident communicating with data, product, and engineering teams, not a 'heads-down coder' type. Top 4 Core Skills Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing …
models in Snowflake to support analytics and reporting needs. Architecture Implementation: Apply defined data architecture standards to ingestion, transformation, storage, and optimisation processes. Pipeline Development: Develop robust ELT/ETL workflows using dbt and orchestration tools, ensuring reliability and maintainability. Performance & Cost Optimisation: Configure Snowflake warehouses and implement query optimisation techniques for efficiency. Data Quality & Governance: Apply data quality checks …
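The data quality checks referenced above typically mean completeness, uniqueness, and range validation run before a table is published. A hedged, platform-agnostic sketch (column names and the price threshold are invented; platforms like dbt express the same checks declaratively as tests):

```python
# Illustrative batch data quality checks: null check, uniqueness check,
# and range check, returning a list of failure descriptions.
def check_quality(rows):
    failures = []
    ids = [r.get("id") for r in rows]
    if any(i is None for i in ids):
        failures.append("null id")
    if len(ids) != len(set(ids)):
        failures.append("duplicate id")
    if any(not (0 <= r.get("price", 0) <= 10_000) for r in rows):
        failures.append("price out of range")
    return failures

good = [{"id": 1, "price": 9.99}, {"id": 2, "price": 100.0}]
bad = [{"id": 1, "price": -5}, {"id": 1, "price": 20.0}]
print(check_quality(good))  # []
print(check_quality(bad))   # ['duplicate id', 'price out of range']
```

A pipeline would gate the load step on `check_quality` returning an empty list, routing failing batches to a quarantine table for investigation.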
and shape the direction of the platform as it evolves, pushing the boundaries of what's possible with data and AI. What You'll Do Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks). Lead CI/CD automation, environment versioning, and production deployments for data products. Integrate AI …
The goal is to ensure seamless data flow, accuracy, and scalability within the Salesforce environment. Key Responsibilities Design, develop, and maintain data pipelines using Databricks and Apache Spark. Implement ETL/ELT processes for structured and unstructured data. Optimise data workflows for performance and scalability. Collaborate with data scientists and analysts to enable advanced analytics and ML models. Ensure data …
this is your chance to make an impact. What You'll Work With Azure Data Services: Data Factory, Data Lake, SQL Databricks: Spark, Delta Lake Power BI: Advanced dashboards ETL & Data Modelling: T-SQL, metadata-driven pipelines DevOps: CI/CD Bonus: Python What you'll do Design and implement scalable Azure-based data solutions Build and optimize data pipelines …