target state from current DWH estate towards a data products/marketplace model on AWS/Snowflake. Review AWS infrastructure component design and usage, and implement enhancements. Design and implement an ETL (Extract, Transform, Load) engine using AWS EMR (Elastic MapReduce) for efficient data processing. Design, review, and implement reporting solutions integrating Tableau with AWS services for seamless data visualization. Design and … tools, and practices. Troubleshoot and resolve infrastructure-related issues, providing technical support and guidance. Your Profile Essential Skills/Knowledge/Experience: Extensive AWS service knowledge, Lambda, Avaloq experience, ETL (Extract, Transform, Load), integrating Tableau with AWS services, Amazon EKS (Elastic Kubernetes Service), Infrastructure as Code, scripting (Python/Bash), Helm charts, Docker, Kubernetes, and tools such as Terraform, Ansible, and Jenkins
be someone who can take end-to-end ownership of data engineering initiatives and deliver solutions aligned with enterprise data architecture as well as governance principles. Responsibilities Build complex ETL/ELT pipelines from scratch, integrating multiple data sources while implementing advanced transformations using mapping data flows Optimise performance and manage dependencies, ensuring robust monitoring/debugging of pipelines Architect
implement error handling strategies. • Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: • Proficiency in Java and SQL. • Experience with C# and Scala is a plus. • Experience with ETL tools and big data platforms. • Knowledge of data modeling, replication, and query optimization. • Hands-on experience with SQL and NoSQL databases is desirable. • Familiarity with data warehousing solutions (e.g., Snowflake
into the reporting layer, enabling analysts and stakeholders to uncover insight. Location: London Contract: 6 Months (Outside IR35) Rate: £325-400 p/day Key Responsibilities: Designing and maintaining scalable ETL/ELT pipelines. Integrating data from multiple sources into a centralised warehouse (SQL Server, PostgreSQL, or Snowflake). Working with Azure Data Factory and cloud-native tooling for data
City, Cardiff, United Kingdom Hybrid / WFH Options
VIQU IT
modernisation of the data estate, migrating legacy SQL Server warehouses into Azure. You will play a key role in shaping the new cloud data platform. Responsibilities: Build and optimise ETL/ELT pipelines with Azure Data Factory, Synapse, and SQL Database. Lead the migration of on-premises SQL Server/SSIS workloads into Azure. Design data lakes, marts, and models
London, South East, England, United Kingdom Hybrid / WFH Options
Akkodis
WFH Duration: 3 months rolling contract Type of contract: Freelance, Inside IR35 Level: Mid-Senior Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues. Requirements: Proven
Databricks Engineer London - hybrid - 3 days per week on-site 6 Months + UMBRELLA only - Inside IR35 Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in DBT running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with analytics
City of London, London, United Kingdom Hybrid / WFH Options
ECS
Databricks and native Azure services Requirements: 10+ years in cloud data engineering, with a strong focus on building scalable data pipelines Expertise in Azure Databricks, including building and managing ETL pipelines using PySpark or Scala Solid understanding of Apache Spark, Delta Lake, and distributed data processing concepts Hands-on experience with Azure Data Lake Storage, Azure Data Factory, and Azure
West Midlands, United Kingdom Hybrid / WFH Options
Synergize Consulting Ltd
skills in designing and deploying data-driven solutions based on SAS Viya, and SAS Data Integration software. - Management of dependencies utilising Platform LSF and Jira - Strong skills in using ETL tools such as Pentaho and Talend - Experience with SAS for data analysis purposes, and previous data visualisation experience would be desirable - Skills in DevOps would be beneficial, particularly related to
London, South East, England, United Kingdom Hybrid / WFH Options
Certain Advantage
quality improvement. Contribute to best practice sharing and community-building initiatives within the data engineering space. Required Skills & Experience Cloud Platforms: Strong expertise in AWS/Azure/SAP ETL/ELT Pipelines: Advanced proficiency Data Modelling: Expert level Data Integration & Ingestion: Skilled Databricks, SQL, Synapse, Data Factory and related Azure services Version Control/DevOps tools: GITHUB, Azure DevOps
SSIS Developer London - Hybrid - 3 days on-site per week 6 months + UMBRELLA only - Inside IR35 Experienced SSIS Developer to design, develop, and maintain robust ETL processes using SQL Server Integration Services (SSIS). The ideal candidate will work closely with business analysts … data architects, and stakeholders to ensure reliable data integration, transformation, and delivery across enterprise systems. Key Responsibilities Design, develop, test, and deploy ETL packages using SSIS to extract, transform, and load data from multiple sources into data warehouses, data marts, or operational systems. Optimize and troubleshoot existing SSIS packages for performance, scalability, and reliability. Develop and maintain SQL queries, stored … procedures, functions, and views to support ETL and reporting processes. Implement error handling, logging, and auditing within SSIS packages. Schedule and automate ETL jobs using SQL Server Agent or equivalent job schedulers. Collaborate with data architects and business analysts to understand business requirements and translate them into technical solutions. Perform data validation and quality checks to ensure integrity and accuracy
Telford, Shropshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
and resolve complex data issues and performance bottlenecks. Key Skills: SAS 9.4 (DI), SAS Viya 3.x (SAS Studio, VA, VI). Platform LSF, Jira, Platform Support. Strong expertise in ETL tools: Pentaho, Talend. Experience with data virtualization using Denodo. Proficiency in SAS for data analytics and reporting. Oracle (good to have). Solid understanding of Agile and Scrum frameworks. Hands
ASAP. Key Requirements: Proven expertise in Data Architecture working on large, complex Data Programmes Strong Data Modelling experience (especially Conceptual Data Models) Demonstrable background in Data Engineering Design processes (ETL, ELT), with good knowledge of Data Vault, Inmon and Kimball Strong knowledge of MS Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI) Proven
Employment Type: Contract, Work From Home
Rate: £600.0 - £800.0 per day + market rate (Inside IR35)
City of London, London, South Bank, United Kingdom
Experis
DevOps team to deliver data solutions and maintain live systems. Apply best practices for data integration, transformation, and delivery in a secure and scalable environment. Support data pipelines and ETL processes across various technologies and cloud platforms Maintain, monitor, and troubleshoot systems using a wide array of tools and platforms. Contribute to automation and continuous integration/deployment (CI/
London, South East, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
Key Requirements: Proven expertise in Data Solution Architecture working on large, complex Data Programmes Strong Data Modelling experience (especially Conceptual Data Models) Demonstrable background in Data Engineering Design processes (ETL, ELT), with good knowledge of Data Vault, Inmon and Kimball Strong knowledge of MS Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI) Proven
/R) and Spotfire APIs. Working knowledge of Power BI report development and differences between Spotfire and Power BI capabilities. Proficient in SQL, data integration (flat files, APIs, databases), ETL logic interpretation. Understanding of functional and visual parity considerations between BI tools. Strong analytical, debugging, communication skills to interface with stakeholders and migration engineers. The Role Act as the technical
City of London, London, United Kingdom Hybrid / WFH Options
Randstad Technologies Recruitment
Microsoft Azure Data Services (Fabric desirable). Knowledge of Data Lake, Power BI, SQL. Business Entity Mastering, Data Taxonomy and Data Dictionary. Knowledge of CI/CD. Familiarity with ETL processes and best practices. Key Competencies Motivated to learn and grow in a dynamic environment. Strong organisational and technical skills with excellent attention to detail. Resilient, solution-focused, and able
to detail, a high degree of intellectual curiosity, and the ability to manage multiple priorities. Advanced Excel skills, including pivot tables, formulas, and data modeling. Hands-on experience with ETL tools (e.g., Alteryx, KNIME, Tableau Prep). Intermediate to advanced proficiency in data visualization platforms such as Tableau, Power BI, QlikView, or Domo. Preferred Qualifications Experience in sales operations or
Telford, Shropshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
resolve complex data issues and performance bottlenecks. Key Skills: SAS 9.4 (DI), SAS Viya 3.x (SAS Studio, VA, VI). Platform LSF, Jira, Platform Support. GIT. Strong expertise in ETL tools: Pentaho, Talend. Experience with data virtualization using Denodo. Proficiency in SAS for data analytics and reporting. Oracle (good to have). Solid understanding of Agile and Scrum frameworks. Hands