Exp: 10+ yrs

Job Summary
The Senior Test Specialist/Architect in Cloud Automation Testing focuses on automating testing processes in ETL (Extract, Transform, Load) and Data Warehousing environments to ensure the quality and reliability of data pipelines. The role involves designing, developing, and implementing automation scripts for testing data transformations, data loading processes, and data quality verification.

Key Responsibilities
1. Develop and execute automated test scripts for ETL processes and data pipelines.
2. Collaborate with cross-functional teams to design and implement automated testing frameworks.
3. Create test plans and test cases for ETL and data warehouse testing.
4. Identify and troubleshoot issues in data transformations and data loading processes.
5. Conduct performance testing and ensure scalability of data pipelines.
6. Implement best practices for data quality assurance in ETL environments.

Skill Requirements
1. Proficiency in cloud-based automation testing tools such as Selenium, Appium, or similar.
2. Experience in automation testing of ETL (Extract, Transform, Load) processes and data warehousing.
3. Strong understanding of SQL for data querying and validation.
4. Knowledge of big data technologies such as …
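The SQL-driven validation work this role describes can be illustrated with a small automated check. This is a hedged sketch only: sqlite3 stands in for the warehouse, and the `staging_orders`/`dw_orders` tables and `order_id` column are hypothetical names, not part of any listed stack.

```python
import sqlite3

def validate_etl(conn):
    """Run basic post-load data-quality checks; return a dict of pass/fail results."""
    cur = conn.cursor()
    checks = {}
    # Row-count reconciliation: the target must account for every source row.
    src = cur.execute("SELECT COUNT(*) FROM staging_orders").fetchone()[0]
    tgt = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
    checks["row_count_match"] = (src == tgt)
    # Null check on a mandatory business key.
    nulls = cur.execute(
        "SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL").fetchone()[0]
    checks["no_null_keys"] = (nulls == 0)
    # Duplicate check on the primary key.
    dupes = cur.execute(
        "SELECT COUNT(*) FROM (SELECT order_id FROM dw_orders "
        "GROUP BY order_id HAVING COUNT(*) > 1)").fetchone()[0]
    checks["no_duplicate_keys"] = (dupes == 0)
    return checks

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE staging_orders (order_id INTEGER, amount REAL);
        CREATE TABLE dw_orders (order_id INTEGER, amount REAL);
        INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0);
        INSERT INTO dw_orders VALUES (1, 10.0), (2, 20.0);
    """)
    print(validate_etl(conn))
```

In practice the same checks would run against the real warehouse connection inside a test framework, with each check asserted individually so failures are reported per rule.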
best practices. Participate in Agile delivery using Azure DevOps for backlog management, sprint planning, and CI/CD. Technical Skills Azure Data Factory: Expert in building, automating, and optimising ETL pipelines. Azure Synapse Analytics: Strong experience with dedicated SQL pools, data warehousing concepts, and performance tuning. Power BI: Advanced experience managing enterprise models, datasets, and governance processes. SQL: Expert-level …
Azure Databricks (or Microsoft Fabric). Strong problem-solving skills and the ability to logically analyse complex requirements, processes, and systems to deliver solutions. Solid understanding of data modelling, ETL/ELT processes, and data warehousing. Proficiency in SQL and Python (especially PySpark), as well as other relevant programming languages. Passion for using data to drive key business decisions. Skills …
in large-scale survey data. Integrating diverse data sources (APIs, databases, external datasets) into a unified analytics ecosystem. Automating data ingestion and transformation workflows using modern ELT/ETL best practices. Implementing monitoring and alerting systems to ensure high data quality and reliability. Mentoring a small team of data engineers, driving excellence and continuous learning. Partnering with Data Science …
Slough, South East England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
environments and enjoys working on varied, impactful projects. Key Responsibilities Design, build, and maintain scalable data pipelines using Azure Data Factory, Databricks, and SQL-based solutions. Develop and optimise ETL/ELT workflows to support analytics, reporting, and machine learning use cases. Work closely with clients to understand data requirements and translate them into robust technical solutions. Implement best practices …
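Stripped of the Azure Data Factory and Databricks specifics, an ETL/ELT workflow of the kind described reduces to composable extract, transform, and load steps. A minimal pure-Python sketch, where every record field (`customer_id`, `country`, `spend`) is invented for illustration:

```python
def extract(rows):
    """Extract step: yield raw records (an in-memory stand-in for a source query)."""
    yield from rows

def transform(records):
    """Transform step: standardise fields and drop rows that fail basic validation."""
    for rec in records:
        if rec.get("customer_id") is None:
            continue  # a real pipeline would quarantine these rows for review
        yield {
            "customer_id": rec["customer_id"],
            "country": rec.get("country", "unknown").strip().upper(),
            "spend": round(float(rec.get("spend", 0)), 2),
        }

def load(records, target):
    """Load step: append transformed rows to the target store."""
    target.extend(records)
    return len(target)

raw = [
    {"customer_id": 1, "country": " uk ", "spend": "19.994"},
    {"customer_id": None, "country": "fr", "spend": "5"},
]
warehouse = []
load(transform(extract(raw)), warehouse)
print(warehouse)
# [{'customer_id': 1, 'country': 'UK', 'spend': 19.99}]
```

Keeping the three stages as separate functions is what makes the workflow testable and orchestratable, whichever tool runs it.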
Slough, South East England, United Kingdom Hybrid / WFH Options
Oscar
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities Data Engineering & Architecture Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and …
experience designing and implementing modern data architectures in cloud environments. Strong understanding of data modelling (conceptual, logical, and physical), including relational, dimensional, and NoSQL approaches. Expertise in data integration, ETL/ELT, and data pipeline design. Hands-on experience with data lakehouse, warehouse, and streaming data architectures. Working knowledge of SQL, Python, and relevant data engineering frameworks (e.g. Databricks, Synapse, …)
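The dimensional side of the data modelling listed above can be pictured as a star schema: a fact table keyed to conformed dimensions, queried by joining and aggregating. A toy example in plain Python, with the tables, keys, and amounts made up:

```python
# Toy star schema: a sales fact table keyed to a date dimension.
dim_date = {
    20240101: {"year": 2024, "quarter": "Q1"},
    20240401: {"year": 2024, "quarter": "Q2"},
}
fact_sales = [
    {"date_key": 20240101, "amount": 100.0},
    {"date_key": 20240101, "amount": 50.0},
    {"date_key": 20240401, "amount": 75.0},
]

def sales_by_quarter(facts, dates):
    """Join facts to the date dimension and aggregate amounts by quarter."""
    totals = {}
    for row in facts:
        quarter = dates[row["date_key"]]["quarter"]
        totals[quarter] = totals.get(quarter, 0.0) + row["amount"]
    return totals

print(sales_by_quarter(fact_sales, dim_date))
# {'Q1': 150.0, 'Q2': 75.0}
```

In a warehouse this is a GROUP BY over a fact-to-dimension join; the in-memory version just makes the star-shaped dependency explicit.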
error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake, …)
work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration - Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement …
Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/ELT orchestration for structured and unstructured data Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation Performance monitoring and orchestration of Fabric solutions Power BI semantic models and …
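SCD handling in a dimensional model usually means Type 2 history: when a tracked attribute changes, the current dimension row is expired and a new current row is appended. A minimal sketch of that logic; the row layout (`key`, `attrs`, validity dates) is an assumption for illustration, not a Fabric or Databricks API:

```python
from datetime import date

def scd2_upsert(dimension, key, new_attrs, as_of):
    """Type 2 SCD: expire the current row when attributes change, append a new one."""
    current = next((r for r in dimension
                    if r["key"] == key and r["is_current"]), None)
    if current and current["attrs"] == new_attrs:
        return dimension  # no change, keep history as-is
    if current:
        current["is_current"] = False
        current["valid_to"] = as_of
    dimension.append({"key": key, "attrs": new_attrs,
                      "valid_from": as_of, "valid_to": None,
                      "is_current": True})
    return dimension

dim = []
scd2_upsert(dim, "cust-1", {"segment": "retail"}, date(2024, 1, 1))
scd2_upsert(dim, "cust-1", {"segment": "corporate"}, date(2024, 6, 1))
print(len(dim), dim[-1]["attrs"])
# 2 {'segment': 'corporate'}
```

The payoff is that point-in-time queries remain possible: filtering on the validity dates reconstructs the dimension as it looked on any given day.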
Banbury, Oxfordshire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
solutions. Develop and maintain data models, schemas, and documentation. Lead by example - setting standards for coding, design, and delivery in Databricks. Design, build, and maintain scalable data pipelines and ETL processes across cloud platforms, databases, and APIs. Optimise data systems for performance, reliability, and scalability. Collaborate with the Data Architect to shape and deliver the data architecture roadmap. Maintain strong …
Salesforce Administrator) are a plus. Preferred Skills Strong knowledge of integration patterns and authentication protocols. Knowledge of DevOps tools. Familiarity with the finance industry is a plus. Experience with ETL tools and data visualization platforms (e.g., Tableau, Power BI). Knowledge of programming languages (e.g., Python, Apex) for data manipulation and automation. Familiarity with cloud computing concepts and technologies.
Slough, South East England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
for scalable, reliable, and governed data solutions. Strong leadership and mentorship capabilities, guiding teams through complex deliveries, fostering collaboration, and ensuring adoption of best practices. Skilled in orchestrating complex ETL workflows, integrating hybrid cloud environments, and delivering high-quality data for advanced analytics and reporting. Experience with Power BI and building dynamic dashboards to uncover actionable insights. Excellent communication and …
Power BI Data Analyst) Technical Skills Strong expertise in Power BI (Power Query, Power BI Desktop, Power BI Service, DAX, and M Query) Experience designing data models and ETL processes within Power BI Proficiency in SQL for querying and transforming data from various sources Understanding of Azure services (e.g., Azure Synapse, Azure Data Factory, Azure SQL) Knowledge of Power Automate …
Slough, South East England, United Kingdom Hybrid / WFH Options
Harrington Starr
transforming raw data into actionable intelligence, working closely with data scientists, quants, and business stakeholders to shape cutting-edge betting products. Key Responsibilities Build and optimise data pipelines and ETL workflows in AWS using Python and SQL. Partner with analysts and quants to deliver reliable datasets for predictive modelling and pricing. Design and maintain data models supporting trading, risk, and …
to detail, a high degree of intellectual curiosity, and the ability to manage multiple priorities. Advanced Excel skills, including pivot tables, formulas, and data modeling. Hands-on experience with ETL tools (e.g., Alteryx, KNIME, Tableau Prep). Intermediate to advanced proficiency in data visualization platforms such as Tableau, Power BI, QlikView, or Domo. Preferred Qualifications Experience in sales operations or …
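The pivot-table analysis mentioned above is straightforward to reproduce in code: group rows by one column, spread a second column across the keys, and aggregate a value. A stdlib-only sketch with invented sales data (the `region`/`stage`/`value` fields are hypothetical):

```python
from collections import defaultdict

def pivot(rows, index, column, value):
    """Group rows by `index`, spread `column` values across keys, summing `value`."""
    table = defaultdict(lambda: defaultdict(float))
    for row in rows:
        table[row[index]][row[column]] += row[value]
    return {k: dict(v) for k, v in table.items()}

deals = [
    {"region": "EMEA", "stage": "won", "value": 10.0},
    {"region": "EMEA", "stage": "lost", "value": 4.0},
    {"region": "APAC", "stage": "won", "value": 7.0},
    {"region": "EMEA", "stage": "won", "value": 3.0},
]
print(pivot(deals, "region", "stage", "value"))
# {'EMEA': {'won': 13.0, 'lost': 4.0}, 'APAC': {'won': 7.0}}
```

Tools like Alteryx or Tableau Prep apply the same group-spread-aggregate pattern visually; the function above is its smallest programmatic form.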
Slough, South East England, United Kingdom Hybrid / WFH Options
Montash
Build sophisticated reporting solutions and drive the adoption of generative AI to enable intuitive self-service analytics. Enhance data engineering processes through automation, CI/CD pipelines and optimised ETL/ELT workflows. Experiment with new tools, technologies and approaches to support the organisation’s analytics and insights strategy. Provide ongoing support for data products, including incident management and defining …
JavaScript is a plus but not required. Experience implementing development best practices, including writing automated tests and CI/CD deployment. Responsibilities: Build and maintain reliable data pipelines and ETL processes for data ingestion and transformation. Support the development and maintenance of data models and data warehouses used for reporting and analytics. Collaborate with senior engineers, analysts, and product teams …
next generation of their data platform. What you’ll be doing Designing, developing, and optimising scalable data pipelines that power advanced analytics and machine learning. Building and maintaining robust ETL/ELT workflows using Python, SQL, dbt, and cloud-native tools. Implementing data quality, observability, and governance frameworks to ensure reliability and compliance. Collaborating with data scientists, analysts, and …
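Data-quality frameworks of the kind described often start as simple column checks in the style of dbt's generic `not_null` and `unique` tests. A hedged stand-alone sketch of that idea, not dbt's actual implementation; the rows and rules are invented:

```python
def run_quality_tests(rows, rules):
    """Apply not_null/unique column tests; return a list of failure messages."""
    failures = []
    for column, tests in rules.items():
        values = [r.get(column) for r in rows]
        if "not_null" in tests and any(v is None for v in values):
            failures.append(f"{column}: null values present")
        if "unique" in tests:
            non_null = [v for v in values if v is not None]
            if len(non_null) != len(set(non_null)):
                failures.append(f"{column}: duplicate values present")
    return failures

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@x.com"},
]
rules = {"id": ["not_null", "unique"], "email": ["not_null"]}
print(run_quality_tests(rows, rules))
# ['id: duplicate values present', 'email: null values present']
```

An observability layer then becomes a matter of running such checks on a schedule and alerting when the failure list is non-empty.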
office. What will you be doing? Deploy and manage ML models from dev to production at scale Build and maintain cloud-based data science environments Automate pipelines and services (ETL, storage, databases) Collaborate with data scientists and engineers Explore new tools to boost ML performance and reliability What are we looking for? Solid MLOps or ML Engineering experience Strong Python …
years of experience in data engineering or a similar role Strong SQL skills and proficiency in at least one programming language (ideally Python) Understanding of data warehousing concepts and ETL/ELT patterns Experience with version control (Git), testing, and code review practices Familiarity with cloud-based data environments (e.g. AWS, GCP, or Azure) Exposure to modern data tools such as …
Slough, South East England, United Kingdom Hybrid / WFH Options
Staffworx
/R) and Spotfire APIs. Working knowledge of Power BI report development and differences between Spotfire and Power BI capabilities. Proficient in SQL, data integration (flat files, APIs, databases), ETL logic interpretation. Understanding of functional and visual parity considerations between BI tools. Strong analytical, debugging, and communication skills to interface with stakeholders and migration engineers. The Role Act as the technical …
Slough, South East England, United Kingdom Hybrid / WFH Options
Hays
Excellent communication and stakeholder management skills Flexible approach to hybrid working (attending workshops, possibly across different locations if needed) Nice to have: Demonstrable background in Data Engineering Design processes (ETL, ELT), with good knowledge of Data Vault, Inmon and Kimball Insurance industry experience, with a good understanding of what a claim is, claims process(es), etc. Previous work on Data strategies …