London (City of London), South East England, United Kingdom
HCLTech
Exp: 10+ yrs Job Summary The Senior Test Specialist/Architect in Cloud Automation Testing focuses on automating testing processes in ETL (Extract, Transform, Load) and Data Warehousing environments to ensure the quality and reliability of data pipelines. The role involves designing, developing, and implementing automation scripts for testing data transformations, data loading processes, and data quality verification. Key Responsibilities 1. Develop and execute automated test scripts for ETL processes and data pipelines. 2. Collaborate with cross-functional teams to design and implement automated testing frameworks. 3. Create test plans and test cases for ETL and data warehouse testing. 4. Identify and troubleshoot issues in data transformations and data loading processes. 5. Conduct performance testing and ensure scalability of data pipelines. 6. Implement best practices for data quality assurance in ETL environments. Skill Requirements 1. Proficiency in cloud-based automation testing tools like Selenium, Appium, or similar. 2. Experience in automation testing of ETL (Extract, Transform, Load) processes and data warehousing. 3. Strong understanding of SQL for data querying and validation. 4. Knowledge of big data technologies such as More ❯
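The SQL-driven validation this listing asks for usually takes the form of automated reconciliation checks between source and warehouse tables. The sketch below is purely illustrative — the in-memory SQLite database, table names, and columns are invented stand-ins, not details from the ad — of the kind of check such an automated test suite runs.

```python
import sqlite3

# Illustrative only: stand-in source/target tables in an in-memory SQLite DB.
# In a real ETL test suite these would be connections to the source system
# and the warehouse (e.g. via a DB-API driver or SQLAlchemy).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def check_row_counts(conn, source, target):
    """Row-count reconciliation: every extracted row should arrive in the target."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    assert src == tgt, f"Row count mismatch: {source}={src}, {target}={tgt}"

def check_amount_totals(conn, source, target, tolerance=0.01):
    """Aggregate reconciliation: monetary totals should match within a small tolerance."""
    src = conn.execute(f"SELECT SUM(amount) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT SUM(amount) FROM {target}").fetchone()[0]
    assert abs(src - tgt) <= tolerance, f"Total mismatch: {src} vs {tgt}"

check_row_counts(conn, "src_orders", "dw_orders")
check_amount_totals(conn, "src_orders", "dw_orders")
print("ETL reconciliation checks passed")
```

In practice, assertions like these would sit inside a pytest suite or a CI stage that runs after each pipeline execution, with the connections pointed at the real source system and warehouse.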
Sutton Coldfield, Birmingham, West Midlands (County), United Kingdom
SF Recruitment
Responsibilities: Design, build, and maintain Azure data pipelines using Azure Data Factory, Synapse, or Fabric. Implement a data lakehouse architecture (Bronze/Silver/Gold) and establish best-practice ETL/ELT frameworks. Ingest and integrate data from multiple core systems, including ERP, finance, supply chain, and CRM platforms. Develop and optimise SQL data models and support the creation of More ❯
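As a rough illustration of the Bronze/Silver/Gold (medallion) layering this listing refers to — not the employer's actual design — the PySpark sketch below stages data through the three layers. Paths, column names, and the Parquet output are assumptions; an Azure lakehouse would more typically write Delta tables in ADLS via Data Factory, Synapse, or Fabric.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lakehouse-demo").getOrCreate()

# Bronze: land raw source data as-is (hypothetical CSV drop from an ERP extract).
bronze = spark.read.option("header", True).csv("/lake/raw/erp_orders/")
bronze.write.mode("overwrite").parquet("/lake/bronze/erp_orders/")

# Silver: cleanse and conform - typed columns, deduplication, valid rows only.
silver = (
    spark.read.parquet("/lake/bronze/erp_orders/")
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount").isNotNull())
)
silver.write.mode("overwrite").parquet("/lake/silver/orders/")

# Gold: business-level aggregates ready for reporting and BI.
gold = silver.groupBy("customer_id").agg(
    F.sum("amount").alias("total_spend"),
    F.countDistinct("order_id").alias("order_count"),
)
gold.write.mode("overwrite").parquet("/lake/gold/customer_spend/")
```

The point of the layering is that raw data is preserved untouched in Bronze, cleansed and conformed in Silver, and only Gold carries business-ready aggregates for reporting.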
work as part of a collaborative team to solve problems and assist other colleagues. - Ability to learn new technologies, programs and procedures. Technical Essentials: - Expertise across data warehouse and ETL/ELT development in AWS preferred, with experience in the following: - Strong experience in some of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation More ❯
in large-scale survey data. Integrating diverse data sources (APIs, databases, external datasets) into a unified analytics ecosystem. Automating data ingestion and transformation workflows using modern ELT/ETL best practices. Implementing monitoring and alerting systems to ensure high data quality and reliability. Mentoring a small team of data engineers, driving excellence and continuous learning. Partnering with Data Science More ❯
Slough, South East England, United Kingdom Hybrid / WFH Options
Oscar
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities Data Engineering & Architecture Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and More ❯
London (City of London), South East England, United Kingdom Hybrid / WFH Options
Oscar
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities Data Engineering & Architecture Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Oscar Technology
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities Data Engineering & Architecture Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and More ❯
tangible difference, we want to hear from you! Please note, this is a fixed-term contract for a period of 12 months. What you will be doing. Design and develop Extract, Transform and Load (ETL) pipeline processes to automate the access to source data, quality check and increase the usage. Assist with the development of new data analysis and reporting … Factory Experience using GIS in a commercial or public sector environment and building mapping-based data analysis solutions Programming and scripting experience (e.g. FME, Python, DAX, JavaScript etc.) to extract, transform and manage data Experience of data capture, analysis and management of complex datasets across a variety of applications from a variety of sources (APIs etc.) Experience of collecting and More ❯
documentation Skills To Create Thrills Strong SQL skills, able to write complex and performant queries with ease. Solid experience in Python development for data workflows Experience building and maintaining ETL pipelines, ideally with Apache Airflow or a similar orchestration tool Hands-on experience with Google Cloud Platform (BigQuery, GCS, etc.) or another major cloud provider Good understanding of data modelling More ❯
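For the Airflow orchestration mentioned in the listing above, a minimal DAG might be structured as below. This is a generic sketch assuming Airflow 2.x — the task bodies, data, and schedule are placeholders rather than anything from the ad, and a GCP pipeline would typically swap the print-style load for a BigQuery operator or client call.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw records from a source API or a GCS landing bucket.
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

def transform(ti):
    # Placeholder: clean/enrich the extracted rows (fetched from XCom).
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "value_doubled": r["value"] * 2} for r in rows]

def load(ti):
    # Placeholder: write transformed rows to the warehouse (e.g. BigQuery).
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")

with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Orchestration: extract runs first, then transform, then load.
    extract_task >> transform_task >> load_task
```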
error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake More ❯
City, Cardiff, United Kingdom Hybrid / WFH Options
VIQU IT
modernisation of the data estate, migrating legacy SQL Server warehouses into Azure. You will play a key role in shaping the new cloud data platform. Responsibilities: Build and optimise ETL/ELT pipelines with Azure Data Factory, Synapse, and SQL Database. Lead the migration of on-premises SQL Server/SSIS workloads into Azure. Design data lakes, marts, and models More ❯
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
VIQU IT
modernisation of the data estate, migrating legacy SQL Server warehouses into Azure. You will play a key role in shaping the new cloud data platform. Responsibilities: Build and optimise ETL/ELT pipelines with Azure Data Factory, Synapse, and SQL Database. Lead the migration of on-premises SQL Server/SSIS workloads into Azure. Design data lakes, marts, and models More ❯
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
You'll Be Doing Delivering full lifecycle data solutions: acquisition, engineering, modelling, analysis, and visualisation Leading client workshops and translating business needs into technical solutions Designing and implementing scalable ETL/ELT pipelines using Azure tools (Fabric, Databricks, Synapse, Data Factory) Building data lakes with medallion architecture Migrating legacy on-prem data systems to the cloud Creating impactful dashboards and More ❯
Leicester, Leicestershire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
You'll Be Doing Delivering full lifecycle data solutions: acquisition, engineering, modelling, analysis, and visualisation Leading client workshops and translating business needs into technical solutions Designing and implementing scalable ETL/ELT pipelines using Azure tools (Fabric, Databricks, Synapse, Data Factory) Building data lakes with medallion architecture Migrating legacy on-prem data systems to the cloud Creating impactful dashboards and More ❯
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
You'll Be Doing Delivering full lifecycle data solutions: acquisition, engineering, modelling, analysis, and visualisation Leading client workshops and translating business needs into technical solutions Designing and implementing scalable ETL/ELT pipelines using Azure tools (Fabric, Databricks, Synapse, Data Factory) Building data lakes with medallion architecture Migrating legacy on-prem data systems to the cloud Creating impactful dashboards and More ❯
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
You'll Be Doing Delivering full lifecycle data solutions: acquisition, engineering, modelling, analysis, and visualisation Leading client workshops and translating business needs into technical solutions Designing and implementing scalable ETL/ELT pipelines using Azure tools (Fabric, Databricks, Synapse, Data Factory) Building data lakes with medallion architecture Migrating legacy on-prem data systems to the cloud Creating impactful dashboards and More ❯
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
You'll Be Doing Delivering full lifecycle data solutions: acquisition, engineering, modelling, analysis, and visualisation Leading client workshops and translating business needs into technical solutions Designing and implementing scalable ETL/ELT pipelines using Azure tools (Fabric, Databricks, Synapse, Data Factory) Building data lakes with medallion architecture Migrating legacy on-prem data systems to the cloud Creating impactful dashboards and More ❯
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid / WFH Options
Reed
staff. Required Skills & Qualifications: Experience designing cloud data platforms in Azure/AWS or significant on-premise design experience. 5+ years in data engineering or business intelligence roles. Extensive ETL and data pipeline design experience, technology agnostic. Proficiency in SQL and experience with data engineering coding languages such as Python, R, or Spark. Understanding of data warehouse and data lake More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
management position. Financial services experience is essential, ideally covering lending, servicing, or securitisation data. Deep technical expertise in Microsoft and Azure data technologies. Strong knowledge of data modelling (Kimball), ETL/ELT, and hybrid cloud architectures. Proven ability to drive quality, governance, and best practices within engineering teams. Excellent communication, stakeholder management, and leadership skills. If you're interested More ❯
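For context on the Kimball-style modelling the role above calls for, the sketch below shows the basic split of a source extract into a dimension with surrogate keys and a fact table keyed against it. It is illustrative only — the data is invented, and a production implementation would normally do this in SQL/dbt or Spark rather than pandas.

```python
import pandas as pd

# Invented source extract: servicing transactions with embedded customer attributes.
source = pd.DataFrame({
    "loan_id": [101, 102, 103, 104],
    "customer_name": ["Acme Ltd", "Acme Ltd", "Bravo plc", "Bravo plc"],
    "customer_region": ["London", "London", "Leeds", "Leeds"],
    "payment_amount": [250.0, 300.0, 125.5, 90.0],
    "payment_date": pd.to_datetime(["2024-01-05", "2024-02-05", "2024-01-12", "2024-02-12"]),
})

# Dimension: one row per distinct customer, assigned a surrogate key.
dim_customer = (
    source[["customer_name", "customer_region"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus foreign keys back to the dimension.
fact_payments = source.merge(dim_customer, on=["customer_name", "customer_region"])[
    ["customer_key", "loan_id", "payment_date", "payment_amount"]
]

print(dim_customer)
print(fact_payments)
```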
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Tenth Revolution Group
end-to-end data solutions, including acquisition, engineering, modelling, analysis, and visualisation. Lead client workshops to gather requirements and communicate effectively across technical and business stakeholders. Design and implement ETL/ELT pipelines using Microsoft Fabric, Azure Synapse, or Databricks - experience with at least one of these platforms is essential. Build scalable data lake solutions using medallion architecture principles. Migrate More ❯
with analytics, product, and engineering teams to support advanced analytics and machine learning initiatives. The skills you'll need: Extensive experience designing and building large-scale data pipelines and ETL processes. Strong proficiency in SQL and Python. Deep understanding of data modelling, warehousing, and performance optimisation. Proven experience with cloud platforms (AWS, Azure, or GCP) and their data services. Hands More ❯