operations. Your expertise will be crucial as we gear up for an exciting data warehouse migration from New York to London in 2026! Key Responsibilities Analyse and optimise SSIS ETL pipelines and batch jobs. Improve SQL performance through effective indexing and execution plans. Identify and resolve locking and blocking issues to enhance efficiency. Apply best practices to boost overall warehouse … is essential. Solid experience with SSAS and SSRS tools. Deep understanding of execution plans and performance tuning techniques. Strong troubleshooting and problem-solving skills. Proven history of improving complex ETL environments. Nice to Have Experience with C# for SSIS scripts. Proficiency in Python. Exposure to Power BI for data visualisation. Why Join Us? Be part of a vibrant team that …
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
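The indexing and execution-plan work this role describes can be illustrated outside SQL Server. Below is a minimal sketch using Python's stdlib `sqlite3` (SQL Server would use SHOWPLAN output and DMVs instead, and the `orders` table and its columns are hypothetical, not taken from the listing):

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, total REAL)")
conn.executemany("INSERT INTO orders (region, total) VALUES (?, ?)",
                 [("EMEA", 10.0), ("APAC", 20.0)] * 500)

def plan(sql):
    # EXPLAIN QUERY PLAN is SQLite's rough analogue of a SQL Server execution plan.
    return " ".join(str(row) for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE region = 'EMEA'"
print(plan(query))  # reports a full table SCAN before indexing

conn.execute("CREATE INDEX idx_orders_region ON orders (region)")
print(plan(query))  # now a SEARCH ... USING INDEX idx_orders_region
```

The same before/after comparison is how index candidates are usually validated during tuning: confirm in the plan that the optimiser actually chose the new index rather than assuming it will.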
data strategy, improve how teams access and use data, and ensure our platforms remain secure, resilient, and future-ready. design, build, and maintain scalable, high-quality data pipelines and ETL/ELT workflows that support analytics, reporting, and product development. assess, recommend, and implement modern data technologies and tooling to meet organisational requirements. develop and optimise data models, warehouses, and … primary development language, and the ability to write clean, scalable, production-ready code, then we want to hear from you. You should also have: demonstrable experience designing and building ETL/ELT data pipelines and integrating data from multiple upstream sources. proven cloud experience (Azure (preferred), AWS, or GCP), including deploying, managing, and supporting cloud-hosted data services and applications. … support them. We are seeking an experienced Data Engineer who will play a central role in building and supporting the new platform. The role will focus on developing the ETL data pipeline framework to integrate upstream master data from parliamentary business systems, as well as engineering new data services, including APIs and web-client applications. The Data Engineer will be …
support a major government programme delivering secure, scalable data solutions. Key Responsibilities Design and implement data pipelines on AWS using services such as Glue, Lambda, S3, and Redshift. Develop ETL processes and optimise data workflows for performance and security. Collaborate with analysts and architects to ensure compliance with government security standards. Troubleshoot and resolve issues in complex cloud environments. Essential … Skills Strong experience with AWS services (Glue, Lambda, S3, Redshift, IAM). Proficiency in Python and SQL for data engineering tasks. Knowledge of data modelling, ETL frameworks, and best practices. Familiarity with security and compliance in government or regulated environments. Excellent communication and problem-solving skills. Active SC clearance (mandatory). Desirable Experience with Terraform or CloudFormation. Exposure to CI …
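The Glue/Lambda/S3/Redshift pattern above typically starts with a Lambda function reacting to an S3 object landing. A minimal sketch, assuming a hypothetical S3 put event (the bucket name, key, and record shape are illustrative; a real handler would call `boto3`, e.g. to start a Glue job or issue a Redshift COPY):

```python
import json
import urllib.parse

def handler(event, context=None):
    """Hypothetical Lambda entry point: pull bucket/key pairs out of an S3
    put event so a downstream Glue job or Redshift load could be triggered."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys are URL-encoded; '+' stands for a space.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # Real code would call boto3 here (e.g. glue.start_job_run);
        # returning the parsed values keeps the logic testable locally.
        results.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(results)}

# Truncated shape of an S3 put event, following AWS's documented structure.
sample_event = {"Records": [{"s3": {"bucket": {"name": "raw-zone"},
                                    "object": {"key": "landing/2024/file+1.csv"}}}]}
print(handler(sample_event))
```

Keeping the parsing separate from the AWS calls, as here, is also what makes the function unit-testable without deploying it.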
solutions while collaborating across teams to ensure reliable, secure, and efficient data infrastructure for the organisation. What you'll be doing: Design, develop, and maintain scalable data architectures and ETL pipelines Build and manage data models and data warehouse solutions (Airflow, dbt, and Redshift) Write clean, efficient Python and SQL code for data processing and transformation Integrate data from internal … codebases Contribute to the continuous improvement of data processes and tooling across the organisation Experience required: Proven experience in data engineering and building scalable data solutions Strong experience with ETL processes, data modelling, and data warehousing Proficiency in Python and SQL Expertise in relational (SQL) and NoSQL database technologies Hands-on experience with AWS Solid understanding of data security, privacy …
and turning data into decisions, this is the role for you. What you'll do Design and develop end-to-end Azure Data Warehouse solutions Build and maintain robust ETL/ELT pipelines using Azure Data Factory. Implement and maintain efficient data models and star/snowflake schemas. Optimize queries, improve performance, and ensure data quality and integrity. Develop and … strong Azure expertise. Solid understanding of SQL, T-SQL, and performance tuning. Hands-on experience with Azure Synapse Analytics, Data Factory. Strong background in data modelling, dimensional design, and ETL development. Proficiency in Power BI for data visualisation and reporting. Experience with Power Automate for process automation and workflow integration. Excellent problem-solving skills and the ability to work independently …
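The star schema mentioned above is a fact table joined to denormalised dimension tables. A minimal sketch using stdlib `sqlite3` (the warehouse proper would be Synapse; all table and column names here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One fact table keyed to two dimensions -- the classic star layout.
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (1, 2026, 1), (2, 2026, 2);
INSERT INTO dim_product VALUES (10, 'Widgets'), (11, 'Gadgets');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 25.0);
""")

# A typical BI query: slice the fact table by dimension attributes.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [(2026, 'Gadgets', 50.0), (2026, 'Widgets', 125.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalised (e.g. `dim_product` referencing a separate `dim_category`), trading simpler joins for less redundancy.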
designs. Ensure compliance with SDLC, governance, and risk policies. Skills & Experience - Essential Strong SQL Server or Snowflake skills. Advanced knowledge of low-code/no-code data engineering/ETL tools - ideally Markit EDM (v19.2+) or similar (e.g. Informatica). Proven delivery experience in Financial Services/Banking sector. Deep understanding of SDLC, systems integration, and data warehousing. Ability to … to join a global top-5 bank with long-term stability, world-class resources, and clear career progression routes. Enterprise Data Architect, EDM Developer, Data Engineering Lead, Data Architect, ETL Developer, Data Solutions Architect, Senior Data Engineer (Financial Services). Apply today for full details. Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For …
Stockport, Cheshire, England, United Kingdom Hybrid/Remote Options
Robert Walters
insightful dashboards and visualisations (Power BI preferred, but any strong data visualisation experience considered). Use SQL to manage and analyse large datasets efficiently. Apply your strong understanding of ETL processes to improve data workflows and ensure reliable data pipelines. Perform robust data cleaning and preparation to support high-quality analytics. Tell compelling stories with data, translating complex insights into … and discovery. What We're Looking For Strong experience in data visualisation (Power BI ideal). Solid SQL skills and comfort working with large, complex datasets. Strong grasp of ETL concepts and data pipeline processes. Proficiency in data cleaning, transformation, and preparation. Excellent communication and data storytelling ability. Ability to collaborate with stakeholders and translate requirements into analytical solutions. Curiosity …
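The data cleaning and preparation step described above usually means deduplication, type coercion, date normalisation, and handling of missing values. A minimal stdlib-only sketch (the field names and imputation rule are hypothetical, not from the listing):

```python
from datetime import datetime

raw_rows = [  # messy extract with hypothetical fields
    {"customer_id": " 001", "joined": "2024-01-05", "spend": "12.50"},
    {"customer_id": "002",  "joined": "05/01/2024", "spend": ""},
    {"customer_id": " 001", "joined": "2024-01-05", "spend": "12.50"},  # duplicate
]

def clean(rows):
    seen, out = set(), []
    for r in rows:
        cid = r["customer_id"].strip()
        if cid in seen:  # drop duplicates on the business key
            continue
        seen.add(cid)
        # Normalise two common date layouts into ISO-8601.
        for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
            try:
                joined = datetime.strptime(r["joined"], fmt).date().isoformat()
                break
            except ValueError:
                continue
        # Impute missing spend as zero (an assumption; policy varies by use case).
        spend = float(r["spend"]) if r["spend"] else 0.0
        out.append({"customer_id": cid, "joined": joined, "spend": spend})
    return out

print(clean(raw_rows))
```

In practice the same steps would be expressed in SQL or a dataframe library, but the logic (key-based dedupe, format normalisation, explicit imputation policy) is the same.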
Your work will be vital to ensuring data is accurate, structured, and available, powering both real-time business intelligence and future advanced analytics. Key Responsibilities Design and develop automated ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas … Support groundwork for future data science and machine learning initiatives The successful applicant will be proficient in SQL and Python, with a proven track record of building and maintaining ETL/ELT pipelines. Experience working with Microsoft Fabric, Azure Data Factory, and modern Lakehouse or data warehouse architecture is essential. You’ll demonstrate a strong focus on data quality and …
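An SQL-and-Python pipeline of the kind described above reduces to three steps: extract from a source, transform in Python, load idempotently into a warehouse table. A minimal sketch using stdlib `sqlite3` (the payload, `stock` table, and `bulk` rule are hypothetical; a real extract would call an API with something like `requests`):

```python
import json
import sqlite3

# Extract: stand-in for an API response body.
payload = json.loads('[{"sku": "A1", "qty": "3"}, {"sku": "B2", "qty": "7"}]')

# Transform: coerce types and derive a flag column.
rows = [(item["sku"], int(item["qty"]), int(item["qty"]) > 5) for item in payload]

# Load: idempotent upsert so re-running the pipeline does not duplicate rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stock (sku TEXT PRIMARY KEY, qty INTEGER, bulk BOOLEAN)")
conn.executemany("INSERT INTO stock VALUES (?, ?, ?) "
                 "ON CONFLICT(sku) DO UPDATE SET qty = excluded.qty", rows)
total = conn.execute("SELECT SUM(qty) FROM stock").fetchone()[0]
print(total)  # 10
```

The upsert on the primary key is the detail worth copying: it is what makes the load step safe to re-run after a partial failure.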
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
candidate will be well experienced in collaborating with cross-functional teams to deliver digital products. Key skills and responsibilities: Design, build, and maintain scalable ETL pipelines to ingest, transform, and load data from diverse sources (APIs, databases, files) into Azure Databricks. Implement data cleaning, validation, and enrichment using Spark (PySpark/Scala) and related tools to ensure quality and consistency.
tools. Key Responsibilities Data Platform Management: o Utilize Teradata Vantage for data warehousing and advanced analytics. o Optimize queries and data structures for efficient risk data aggregation. Data Integration & ETL Development: o Design, develop, and optimize ETL workflows using Informatica PowerCenter and related tools. o Manage large-scale data integration projects across multiple platforms, ensuring high performance and scalability. o … and timeliness for regulatory reporting. o Support governance and lineage documentation for risk data flows. Required Skills & Experience Strong experience in Teradata and Informatica PowerCenter for data integration and ETL development. Strong experience in writing and understanding complex SQL queries and data warehousing concepts. Knowledge of BCBS239 principles and regulatory risk data aggregation requirements. Good if you have experience … with risk modelling tools (SAS, Python) and ETL frameworks. Familiarity with data governance, lineage, and metadata management. Excellent problem-solving and communication skills.
Expertise in Pentaho Data Integration for ETL processes. Strong understanding of data warehousing concepts, data modelling, and schema design. Experience with other ETL tools (Informatica, Talend, SSIS) is a plus. Advanced SQL for querying and data manipulation. Familiarity with Java, Python, or VBA for scripting and automation. Knowledge of RESTful APIs and various database systems. Skills in data profiling, data … analysis, and ensuring data quality. Experience 6-12 years in Data Integration and ETL development. Minimum 3-5 years of hands-on Pentaho experience. Leading complex ETL projects and working with cross-functional teams. Exposure to Banking/Financial Services domain is a plus. Preferred Qualifications Certifications in Pentaho, Big Data, or Cloud Platforms (AWS/GCP/Azure). Experience with …
Managed Instance Azure Cosmos DB Azure Event Hub/Service Bus Azure Functions Azure Purview/Microsoft Fabric (desirable) Ensure best practices for data governance, lineage, quality, and cataloguing. ETL/ELT & Pipelines Architect and optimise ETL/ELT workflows using Data Factory, Databricks, or Synapse pipelines. Implement CI/CD for data pipelines using Azure DevOps or GitHub Actions. … Architect in large-scale Azure environments. Strong knowledge of data modelling, architecture principles, and design patterns. Hands-on experience with Azure data services and modern data platforms. Expertise in ETL/ELT processes and pipeline optimisation. Familiarity with CI/CD practices for data solutions. Strong understanding of data security, compliance, and governance frameworks. Desirable Experience with Microsoft Fabric and …
Bath, Avon, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
completed on a fully remote basis, however occasional trips to the office in Bath are welcomed. Key Requirements Demonstrated expertise in Python programming, Data Science methodologies, and GIS Develop ETL pipelines of large-scale Raster, Vector and Tabular Data. Collaborate with developers, designers, analysts to build and improve applications. Research, design, test, and maintain software programs and system modifications. Ensure …
London, South East, England, United Kingdom Hybrid/Remote Options
Noir
designs. A background as a Business Analyst is preferred. Experience in workforce management, scheduling, HR tech, optimisation domains, AI/ML productisation, LLM integration, MLOps, or enterprise integration standards (ETL, REST APIs, webhooks, event streaming) is a bonus. At the centre of the company's culture is freedom and openness which takes a lot of people by surprise. But the …
playbook” for future acquisitions Assess legacy repositories, analyse document structures and metadata, and map them to the enterprise taxonomy Plan and execute large-scale data extraction, transformation, and loading (ETL) activities while maintaining full data integrity and compliance Run test migrations, reconciliation checks, and quality assurance cycles Troubleshoot issues in real time and implement rapid fixes with minimal business disruption …
reporting and data solutions that enhance efficiency, automate processes, and drive strategic decision-making. You will be responsible for gathering, cleaning, and integrating data from diverse sources, developing efficient Extract, Transform, Load (ETL) procedures, and utilizing BI tools such as Tableau and Power BI to create clear, impactful dashboards and reports. Advanced skills in Excel, Power Query, and VBA will …
blueprints for hybrid and cloud-native solutions Essential leveraging AWS, Azure, or GCP Desirable Extensive experience in developing architectural strategies, blueprints for hybrid and cloud-native solutions ELT/ETL Frameworks & Pipelines Essential Develop robust ELT/ETL pipelines using tools like Apache Airflow, DBT, AWS Glue, Azure Data Factory, or Kafka Connect. Desirable Optimize data transformations for performance, reusability …
an award-winning business in a phase of growth, in a newly created role to lead a team and be hands on in replacing old tech systems and creating ETL pipelines. We’re a small high-impact team but Jollyes is growing! Still small enough that you can launch a new product within days, but big enough to see the … your key responsibilities: Backend Development : Design, build, and maintain serverless and traditional backend architectures using best practices. (We build on AWS using Lambdas and ECS images – largely in NodeJS.) ETL Pipeline Management: Develop and optimise data pipelines to enable seamless data flow and transformation. (We currently use a mix of SSIS, ETL Works, Airflow, Snowflake and are moving to Airflow … solving and wants to grow in a fast-paced retail environment. Share our values of being: Wise, Focused, Genuine, Eager, Together Proficient in SQL and Python, with experience of ETL workflows Experience with cloud-based data environments (AWS: S3, Lambda, ECS) Bachelor’s degree or equivalent qualification or equivalent experience in Data Science, Computer Science, Statistics, or a related field. …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
proficiency in SQL for data querying, processing and performance tuning. Familiarity with Financial Services, preferably trading and the ability to ensure compliance throughout. Understanding of data warehousing, ELT/ETL processes, and data modelling. Experience using scheduling tools and in supporting applications running on Windows. A proactive mindset with a focus on service and customer impact. Performs consistently in time … to work different shift patterns/be on an on-call rota Desirable Skills Exposure to DevOps practices and tools (CI/CD, automation). Familiarity with Data/ETL tools such as Microsoft Azure and Microsoft Fabric. Knowledge of Cloud-native development: Azure, Snowflake, ADF (Azure Data Factory) and DBT (Data Build Tool) for modular SQL development. Autosys scheduling and … Ab Initio for ETL and batch processing as well as Markit EDM. Understanding of MQ (IBM MQ, ACE) for message-based integrations or equivalent Understanding of Active Directory, Networks and Powershell Scripting or equivalent. We are proud to be a Disability Confident Committed employer. If you have a disability and would like to apply to one of our UK roles …
Belfast, City of Belfast, County Antrim, United Kingdom Hybrid/Remote Options
Aspire Personnel Ltd
defence and security, energy and utilities, financial services, government and public services, health and life sciences, and transport. The Data Engineer will have experience in AWS cloud technologies for ETL pipeline, data warehouse and data lake design/building and data movement. You will join the business at a period of huge growth. JOB DESCRIPTION Tech stack While the client … is a significant growth area for the business with a diverse and growing capability, and we are looking for a Data Engineer with experience in AWS cloud technologies for ETL pipeline, data warehouse and data lake design/building and data movement. AWS data and analytics services (or open-source equivalent) such as EMR, Glue, RedShift, Kinesis, Lambda, DynamoDB. What …
Preston, Lancashire, North West, United Kingdom Hybrid/Remote Options
Circle Group
Senior or Lead data engineer Experience handling large datasets, complex data pipelines, big data processing frameworks and technologies AWS or Azure cloud experience Experience with data modelling, data integration ETL processes and designing efficient data structures Strong programming skills in Python, Java, or Scala Data warehousing concepts and dimensional modelling experience Any data engineering skills in Azure Databricks and Microsoft … robust, scalable, and aligned data solutions for delivering high-quality care. The ideal candidate will lead the design and execution of cloud-based, scalable data storage solutions, oversee ETL pipeline development and optimisation, establish and manage data schemes and dictionaries, develop data integration solutions, lead data cleansing, validation, and enrichment processes. Duties include: Working closely with analysts and software … days per week in the office). To apply, press apply now or send your CV to Keywords: Lead Data Engineer/Azure Databricks/AWS/ETL/Python/data modelling Flexible working - Preston - Blackpool - Manchester - Warrington - Liverpool - Bolton - Blackburn Circle Recruitment is acting as an Employment Agency in relation to this vacancy. Earn yourself a referral bonus …