Exp: 10+ yrs

Job Summary
The Senior Test Specialist/Architect in Cloud Automation Testing focuses on automating testing processes in ETL (Extract, Transform, Load) and Data Warehousing environments to ensure the quality and reliability of data pipelines. The role involves designing, developing, and implementing automation scripts for testing data transformations, data loading processes, and data quality verification.

Key Responsibilities
1. Develop and execute automated test scripts for ETL processes and data pipelines.
2. Collaborate with cross-functional teams to design and implement automated testing frameworks.
3. Create test plans and test cases for ETL and data warehouse testing.
4. Identify and troubleshoot issues in data transformations and data loading processes.
5. Conduct performance testing and ensure scalability of data pipelines.
6. Implement best practices for data quality assurance in ETL environments.

Skill Requirements
1. Proficiency in cloud-based automation testing tools such as Selenium, Appium, or similar.
2. Experience in automation testing of ETL (Extract, Transform, Load) processes and data warehousing.
3. Strong understanding of SQL for data querying and validation.
4. Knowledge of big data technologies such as …
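The data-quality verification this role describes can be sketched as an automated post-load check; a minimal illustration using Python's built-in sqlite3, where the staging and warehouse table names (`stg_orders`, `dw_orders`) are hypothetical stand-ins for a real ETL environment:

```python
import sqlite3

def validate_load(conn, source_table, target_table, key_column):
    """Compare row counts between source and target, and flag NULL keys."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {target_table} WHERE {key_column} IS NULL"
    ).fetchone()[0]
    return {"source_rows": src, "target_rows": tgt,
            "counts_match": src == tgt, "null_keys": nulls}

# Demo with an in-memory database (hypothetical staging/warehouse tables)
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")
result = validate_load(conn, "stg_orders", "dw_orders", "order_id")
```

In practice checks like this would run inside a test framework (pytest, for example) after each pipeline run, with the connection pointing at the real warehouse rather than an in-memory database.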
solutions within Microsoft Fabric (including Data Factory, Synapse, and OneLake). Advanced proficiency in Power BI, including DAX, Power Query (M), and data modelling. Deep understanding of data warehousing, ETL, and data lakehouse concepts. Strong working knowledge of Databricks, including Delta Lake and notebooks. Strong interpersonal skills with the ability to influence and communicate complex data topics clearly. Excellent analytical …
best practices. Participate in Agile delivery using Azure DevOps for backlog management, sprint planning, and CI/CD. Technical Skills Azure Data Factory: Expert in building, automating, and optimising ETL pipelines. Azure Synapse Analytics: Strong experience with dedicated SQL pools, data warehousing concepts, and performance tuning. Power BI: Advanced experience managing enterprise models, datasets, and governance processes. SQL: Expert-level …
Slough, South East England, United Kingdom Hybrid / WFH Options
Metropolitan Police
what we’d like you to bring: You’re confident with SQL - writing queries, building stored procedures, and designing databases is second nature to you. You’ve worked with ETL tools like Alteryx, SSIS, FME or Python, and you know how to move and transform data efficiently. You’ve got experience building solutions in cloud platforms like Azure Data Factory …
in large-scale survey data. Integrating diverse data sources (APIs, databases, external datasets) into a unified analytics ecosystem. Automating data ingestion and transformation workflows using modern ELT/ETL best practices. Implementing monitoring and alerting systems to ensure high data quality and reliability. Mentoring a small team of data engineers, driving excellence and continuous learning. Partnering with Data Science …
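The monitoring and alerting responsibility above usually amounts to a few automated quality gates; a minimal sketch of two common ones, freshness and field completeness, in plain Python (the thresholds and field names are illustrative assumptions, not part of the listing):

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded_at, max_lag=timedelta(hours=24)):
    """True if the dataset was loaded within the allowed lag window."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag

def check_completeness(rows, required_fields):
    """Fraction of rows in which every required field is populated."""
    if not rows:
        return 0.0
    ok = sum(1 for r in rows
             if all(r.get(f) not in (None, "") for f in required_fields))
    return ok / len(rows)

# Hypothetical sample: one complete row, one with a missing email
rows = [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": None}]
completeness = check_completeness(rows, ["id", "email"])
fresh = check_freshness(datetime.now(timezone.utc) - timedelta(hours=1))
```

In a real pipeline these checks would feed an alerting channel (PagerDuty, Slack, email) when completeness drops below a threshold or a load misses its freshness window.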
Slough, South East England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
environments and enjoys working on varied, impactful projects. Key Responsibilities Design, build, and maintain scalable data pipelines using Azure Data Factory, Databricks, and SQL-based solutions. Develop and optimise ETL/ELT workflows to support analytics, reporting, and machine learning use cases. Work closely with clients to understand data requirements and translate them into robust technical solutions. Implement best practices …
Slough, South East England, United Kingdom Hybrid / WFH Options
Oscar
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities Data Engineering & Architecture Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and …
compliance requirements. Team Management: Recruit, mentor, and develop a high-performing data team, promoting a culture of continuous learning and professional growth. Management of a hybrid team, comprising internal ETL specialists and third-party resources, to ensure the Colt DCS Data Platform. Data Governance: Establish and maintain data governance frameworks, policies, and standards to ensure data quality, security, and compliance …
Slough, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
and shape the direction of the platform as it evolves, pushing the boundaries of what’s possible with data and AI. What You’ll Do Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks). Lead CI/CD automation, environment versioning, and production deployments for data products. Integrate AI …
experience designing and implementing modern data architectures in cloud environments. Strong understanding of data modelling (conceptual, logical, and physical), including relational, dimensional, and NoSQL approaches. Expertise in data integration, ETL/ELT, and data pipeline design. Hands-on experience with data lakehouse, warehouse, and streaming data architectures. Working knowledge of SQL, Python, and relevant data engineering frameworks (e.g. Databricks, Synapse …
error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake …
work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration - Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement More ❯
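The responsibilities above follow the classic extract-transform-load shape; a minimal sketch in plain Python with stdlib sqlite3 standing in for the warehouse (the role itself uses PySpark on AWS EMR and Snowflake, so everything here — the CSV source, table, and field names — is a hypothetical illustration):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: one record has a missing amount, one a
# lowercase currency code that needs normalising.
RAW = """order_id,amount,currency
1,10.50,GBP
2,,GBP
3,7.25,usd
"""

def extract(text):
    """Read delimited text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop incomplete records, cast types, normalise currency casing."""
    out = []
    for r in rows:
        if not r["amount"]:          # drop records with no amount
            continue
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "currency": r["currency"].upper()})
    return out

def load(conn, rows):
    """Load cleaned rows into the target table; return loaded row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :currency)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(conn, transform(extract(RAW)))
```

On EMR the same three stages would be expressed as PySpark DataFrame operations over distributed storage, but the ingest/clean/load decomposition is identical.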
Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/ELT orchestration for structured and unstructured data Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation Performance monitoring and orchestration of Fabric solutions Power BI semantic models and …
Banbury, Oxfordshire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
solutions. Develop and maintain data models, schemas, and documentation. Lead by example - setting standards for coding, design, and delivery in Databricks. Design, build, and maintain scalable data pipelines and ETL processes across cloud platforms, databases, and APIs. Optimise data systems for performance, reliability, and scalability. Collaborate with the Data Architect to shape and deliver the data architecture roadmap. Maintain strong …
clearly to stakeholders. Hard Skills/Technical Requirements: Must-Have: Microsoft PL-300 Certification Minimum 1 year of experience in Data Analytics, Data Engineering, or Power BI Knowledge of ETL processes within Microsoft Fabric Experience designing and implementing Dataflows and Dashboards in Power BI Service Strong understanding of REST API principles Advanced DAX skills Advanced Power Query skills Nice-to-Have: …
Salesforce Administrator) are a plus. Preferred Skills Strong knowledge of integration patterns and authentication protocols. Knowledge of DevOps tools. Familiarity with the finance industry is a plus. Experience with ETL tools and data visualization platforms (e.g., Tableau, Power BI). Knowledge of programming languages (e.g., Python, Apex) for data manipulation and automation. Familiarity with cloud computing concepts and technologies.
Slough, South East England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
for scalable, reliable, and governed data solutions. Strong leadership and mentorship capabilities, guiding teams through complex deliveries, fostering collaboration, and ensuring adoption of best practices. Skilled in orchestrating complex ETL workflows, integrating hybrid cloud environments, and delivering high-quality data for advanced analytics and reporting. Experience with Power BI, and building dynamic dashboards to uncover actionable insights. Excellent communication and …
Power BI Data Analyst) Technical Skills Strong expertise in Power BI (Power Query, Power BI Desktop, Power BI Service, DAX, and M Query) Experience designing data models and ETL processes within Power BI Proficiency in SQL for querying and transforming data from various sources Understanding of Azure services (e.g., Azure Synapse, Azure Data Factory, Azure SQL) Knowledge of Power Automate …
Slough, South East England, United Kingdom Hybrid / WFH Options
Tata Consultancy Services
Apache Spark Proven experience in Snowflake data engineering, including: Snowflake SQL, Snowpipe, Streams & Tasks, and performance optimization Integration with AWS services and orchestration tools Expertise in data integration patterns, ETL/ELT, and data pipeline orchestration Experience with data quality frameworks, metadata management, and data lineage Hands-on experience with machine learning pipelines and generative AI engineering Familiarity with DevOps …
Slough, South East England, United Kingdom Hybrid / WFH Options
Harrington Starr
transforming raw data into actionable intelligence, working closely with data scientists, quants, and business stakeholders to shape cutting-edge betting products. Key Responsibilities Build and optimise data pipelines and ETL workflows in AWS using Python and SQL. Partner with analysts and quants to deliver reliable datasets for predictive modelling and pricing. Design and maintain data models supporting trading, risk, and …
Slough, South East England, United Kingdom Hybrid / WFH Options
Montash
Build sophisticated reporting solutions and drive the adoption of generative AI to enable intuitive self-service analytics. Enhance data engineering processes through automation, CI/CD pipelines and optimized ETL/ELT workflows. Experiment with new tools, technologies and approaches to support the organisation’s analytics and insights strategy. Provide ongoing support for data products, including incident management and defining …
cross-functional teams to gather and understand business requirements, ensuring the design of scalable and effective data solutions. Optimize database performance and troubleshoot any issues related to data warehousing, ETL, and reporting tools. Provide ongoing support for the data pipeline and reporting systems, ensuring high availability and performance. Continuously evaluate and implement best practices for data management, reporting, and analytics.
JavaScript is a plus but not required. Experience implementing development best practices including writing automated testing and CI/CD deployment. Responsibilities: Build and maintain reliable data pipelines and ETL processes for data ingestion and transformation. Support the development and maintenance of data models and data warehouses used for reporting and analytics. Collaborate with senior engineers, analysts, and product teams …