Role: Technical Data Architect Location: UK (remote is fine; candidate must be in the UK and available for any IMP meeting) Contract: 3 months (will likely extend) Data Feed Factory - for Telco Network Function Expectations on the role: Hands-on experience in design and architecture of the Data Platform, technical components and Data Engineering activities supporting the … key capabilities: Technical Data Architecture; Data Ingestion framework (batch/micro-batch processing); Data Contract & Data Management (including DQ, metadata & lineage); Delivery, Scaling & Op Model. Discovery/assessment along with the Data Platform requirements: identify and document the relevant data sources to be used for collection and ingestion, cataloguing, and defining data management. … This includes assessing data quality, accessibility, data contracts and lineage, including any other potential capability evaluation. Analyse existing data pipelines and data workflows to determine the integration points and any design considerations/frameworks and patterns for the data factory, as per the need for TSA and non-TSA batch business/technical requirements for …
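For context on the data contract and DQ capabilities this listing asks for, here is a minimal sketch of a contract-style validation step in Python using pydantic. It is not taken from the listing itself; the record fields and function names are hypothetical.

```python
# Minimal sketch of a data-contract check for a micro-batch, assuming hypothetical
# field names and the pydantic library; not part of the original listing.
from datetime import datetime
from pydantic import BaseModel, ValidationError


class NetworkEventRecord(BaseModel):
    """Contract for one ingested telco network-function event (illustrative)."""
    cell_id: str
    event_time: datetime
    throughput_mbps: float


def validate_batch(rows: list[dict]) -> tuple[list[NetworkEventRecord], list[str]]:
    """Split a micro-batch into records that meet the contract and error messages."""
    valid, errors = [], []
    for row in rows:
        try:
            valid.append(NetworkEventRecord(**row))
        except ValidationError as exc:
            errors.append(str(exc))
    return valid, errors


if __name__ == "__main__":
    good, bad = validate_batch([
        {"cell_id": "C-001", "event_time": "2024-05-01T10:00:00", "throughput_mbps": 112.4},
        {"cell_id": "C-002", "event_time": "not-a-date", "throughput_mbps": "n/a"},
    ])
    print(len(good), "valid records,", len(bad), "contract violations")
```

In practice the rejected rows and error messages would feed the DQ, metadata and lineage tooling the role covers, rather than being printed.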
IT services company is hiring for a Contract Data Architect to work in Telford for 2 days per week. This is a 6-month Contract with a likely extension, paying between £540 - £570 per day Inside IR35. Due to the nature of the role, you will need to hold active SC Clearance. We are seeking an experienced Data Architect … to join a critical government project. This role supports the development of a robust data platform to combat fraud. You will be embedded in a newly formed Scrum team on the Minerva Platform, contributing to the ingestion of data within the SAS Platform, including IDP (Intelligent Data Platform). Responsibilities: - Design, develop, and maintain data architecture models … aligned with enterprise standards. - Support the ingestion and integration of data using the SAS platform. - Lead the upgrade, decommissioning, and archiving of data in accordance with data policies. - Collaborate with multidisciplinary teams to translate business problems into effective data solutions. - Maintain data dictionaries and metadata repositories to ensure information remains accurate and compliant. - Provide …
Data Engineer £500 - £560 per day London - 1 day per week in office We're working with a leading global healthcare technology company who are building out their next-generation data platform, with a strong emphasis on automation, testing, and cloud-native engineering, and are looking for an experienced Data Engineer to join their team. The Role … You'll be part of a modern data engineering function that's implementing best-in-class data practices across ingestion, transformation, and orchestration layers. The environment is highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include: Designing and developing … DBT models and Airflow pipelines within a modern data stack. Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs. Implementing automated testing and CI/CD pipelines for data workflows. Performing data extraction and enrichment, including web scraping and parsing of unstructured text (e.g., scanned forms and documents). …
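To illustrate the DBT-plus-Airflow pattern this listing describes, a minimal Airflow DAG that lands a feed and then runs the downstream dbt models might look like the sketch below. The DAG id, script path, dbt project directory and tag are all hypothetical, not from the listing.

```python
# Illustrative Airflow DAG: ingest a partner feed, then build DBT models on top of it.
# Paths, dataset names and the dbt tag are assumptions for the sketch.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="partner_feed_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Land raw partner data (placeholder command standing in for a real ingestion task).
    ingest = BashOperator(
        task_id="ingest_partner_feed",
        bash_command="python /opt/pipelines/ingest_partner_feed.py --date {{ ds }}",
    )

    # Build the downstream DBT models once the raw layer is refreshed.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt --select tag:partner_feed",
    )

    ingest >> run_dbt
```

A production setup would usually swap the BashOperator for a dedicated dbt or ingestion operator and add tests, but the dependency shape (ingest, then transform) is the same.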
Join us as a Data Platform Engineer for our client. At Peregrine, we're always seeking Specialist Talent with the ideal mix of skills, experience, and attitude to place with our vast array of clients. From Business Analysts in large government organisations to Software Developers in the private sector, we are always in search of the best talent … benefits, you will be deployed across our portfolio of clients as a specialist consultant, working on a wide array of complex projects across multiple industries. The Role: As a Data Platform Engineer in a highly regulated environment, you will be responsible for designing, building, and maintaining secure and scalable data infrastructure that supports both cloud and on-premises platforms. … You will play a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams. A strong command of Apache NiFi is essential for this role. You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data …
Test Engineer – Data Platforms (FTC, 6 months, Hybrid | London or Manchester) My client is looking for an experienced Test Engineer to take ownership of testing across modern data platforms. This is a hands-on role where you'll design, implement, and execute test strategies for data pipelines, models, and reporting solutions built on Azure technologies. Key Responsibilities Define … and own the overall testing strategy for data pipelines, data models, and BI solutions. Develop and implement automated testing frameworks integrated with CI/CD pipelines. Design and execute test cases across data ingestion, transformation, storage, and presentation layers. Validate data accuracy, performance, and usability within Azure Databricks, Azure Data Factory, Azure Data Lake Storage, and Power BI. Establish and monitor data quality checks, reconciliation, and compliance processes. Collaborate closely with data engineers, BI developers, architects, and product owners to embed testing best practices across delivery. Essential Skills & Experience Proven experience as a Test Engineer within data engineering or analytics environments. Strong knowledge of Azure data services – Databricks …
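As a rough illustration of the automated data-quality testing this role covers, the sketch below uses pytest and pandas to validate a transformed dataset. The table, column names and rules are hypothetical; in the Azure stack described, the fixture would read from Databricks or the lake rather than an in-memory frame.

```python
# Sketch of automated data-quality tests for a transformation layer, using pytest
# and pandas; dataset and rules are assumptions made for the example.
import pandas as pd
import pytest


@pytest.fixture
def transformed_orders() -> pd.DataFrame:
    # A real suite would load the transformed layer; a small frame stands in here.
    return pd.DataFrame(
        {
            "order_id": [1, 2, 3],
            "order_total": [19.99, 5.00, 42.50],
            "currency": ["GBP", "GBP", "GBP"],
        }
    )


def test_order_id_is_unique(transformed_orders):
    assert transformed_orders["order_id"].is_unique


def test_order_total_is_non_negative(transformed_orders):
    assert (transformed_orders["order_total"] >= 0).all()


def test_currency_is_whitelisted(transformed_orders):
    assert transformed_orders["currency"].isin({"GBP", "EUR", "USD"}).all()
```

Tests like these are straightforward to wire into a CI/CD pipeline so every pipeline change is checked before release.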
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Robert Half Technology are assisting a market leading financial services organisation to recruit a Data Engineer on a contract basis - hybrid working, London based. Looking for a highly skilled Data Engineer to join our growing data platform team. You will be responsible for designing, building, and optimising large-scale data pipelines and services that power analytics … reporting, and data-driven decision-making across the business. This role is heavily cloud-focused (AWS) and requires strong expertise in Python, Spark, and relational databases. Role Design, build, and maintain scalable, reliable, and high-performance data pipelines and workflows. Develop clean, maintainable, and testable code in Python for data ingestion, transformation, and processing. Optimise and … fine-tune query performance on Aurora Postgres and other relational databases. Architect and manage data solutions on AWS using serverless technologies such as Lambda, Glue, Glue Data Catalog, EMR Serverless, and API Gateway. Implement and manage large-scale data processing with Spark (Iceberg tables in S3, Gold layer in Aurora Postgres). Collaborate with data scientists …
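To give a flavour of the serverless AWS pattern mentioned here, the sketch below shows an S3-triggered Lambda handler that starts a Glue job for each newly landed file via boto3. The Glue job name, bucket layout and argument names are hypothetical, not details of the client's platform.

```python
# Hedged sketch of a serverless ingestion trigger: an S3-event Lambda that starts a
# Glue job run for each arriving object. Job and argument names are made up.
import boto3

glue = boto3.client("glue")


def handler(event, context):
    """Kick off a Glue job run for each object landed in the raw bucket."""
    run_ids = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName="raw_to_iceberg_ingest",            # hypothetical Glue job
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        run_ids.append(response["JobRunId"])
    return {"started_runs": run_ids}
```

The Glue job itself would typically be a Spark script writing Iceberg tables in S3, with the curated Gold layer loaded into Aurora Postgres as the listing describes.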
seeking to recruit a DataOps Engineer on an initial 6 month contract based in London. It is hybrid and will require 2/3 days onsite per week. A DataOps Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizing biomedical and scientific data engineering, with demonstrable experience across … the following areas: They are a full-stack shop consisting of product and portfolio leadership, data engineering, infrastructure and DevOps, data/metadata/knowledge platforms, and AI/ML and analysis platforms, all geared toward: - Building a next-generation, metadata- and automation-driven data experience for scientists, engineers, and decision-makers, increasing productivity and reducing time … spent on "data mechanics" - Providing best-in-class AI/ML and data analysis environments to accelerate our predictive capabilities and attract top-tier talent - Aggressively engineering our data at scale, as one unified asset, to unlock the value of our unique collection of data and predictions in real-time - Automation of end-to-end data …
London, South East, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
Palantir Foundry Data Engineer - DV Cleared NEW CONTRACT OPPORTUNITY FOR A PALANTIR FOUNDRY DATA ENGINEER TO WORK ON A NATIONAL SECURITY PROJECT IN LONDON WITH DV CLEARANCE - Contract role in London for a Palantir Foundry Data Engineer - Must hold DV Security Clearance - Central London based - Daily rate up to £800 - Hybrid position - To apply, email: or call … Who we are... We are seeking an experienced Palantir Foundry Data Engineer with current DV clearance to join a high-profile programme. This is a contract position offering hybrid working and a daily rate of up to £800. In this role, you will be responsible for designing, developing, and optimising data pipelines and integrations within Palantir Foundry … ensuring data is efficiently processed, transformed, and made available for analysis and operational use. You will collaborate closely with analysts, data scientists, and business stakeholders to deliver robust, secure, and scalable data solutions. What we're looking for... Key Responsibilities: Develop and maintain data pipelines and workflows in Palantir Foundry Integrate diverse data sources, ensuring …
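For readers unfamiliar with Foundry pipeline development, a transform step authored in a Foundry code repository commonly follows the publicly documented transforms API pattern sketched below. The dataset paths, column names and cleaning logic are hypothetical and are not taken from the programme.

```python
# Sketch of a Foundry pipeline step using the transforms API pattern; dataset paths
# and columns are assumptions for illustration only.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output


@transform_df(
    Output("/Programme/pipelines/clean_events"),   # hypothetical output dataset
    raw=Input("/Programme/sources/raw_events"),    # hypothetical input dataset
)
def clean_events(raw):
    """Standardise and de-duplicate the raw feed before analysts pick it up."""
    return (
        raw.withColumn("event_date", F.to_date("event_timestamp"))
           .dropDuplicates(["event_id"])
           .filter(F.col("event_id").isNotNull())
    )
```

Foundry schedules and lineage-tracks transforms like this automatically, which is what makes the pipeline work described in the listing largely a matter of writing and reviewing Spark logic.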
Project Manager (Data Lakes projects) Our client is building a Data Lake in Azure and needs an experienced Project Manager with a solid, proven background in managing and running a Databricks and data ingestion frameworks project. Must be experienced in dealing with difficult customers and able to hit the ground running. If you are an experienced PM in running … large Data projects with Azure, please get in touch for full details. The role is fully remote - YOU MUST LIVE IN THE UK as there may be occasional site visits. Contract until May 2026, could go on longer. I will only reply to candidates who are Project Managers with Databricks and data ingestion frameworks project experience. …
MI & Data Integration Analyst (Azure/Databricks) Hybrid - Central London | £43,000 pro rata | 3-month FTC (potential to extend) We're looking for a technically strong and commercially aware MI & Data Integration Analyst to join a leading UK retail and technology business on an initial 3-month fixed-term contract. This is a great opportunity for … someone who enjoys combining hands-on data analysis, reporting, and Azure integration work - helping build robust data foundations and self-serve reporting capability for an evolving business area. The Opportunity You'll be supporting a small, forward-thinking team to improve how they use data - helping them access accurate and timely MI, streamline processes, and migrate to … a new Azure-based data environment. You'll take ownership of reporting improvements, work closely with third-party partners on offshoring data processes, and support the move of key datasets into the Unity Catalog environment. Key Responsibilities Produce and enhance accurate, timely, and insightful MI reports to support business decision-making. Develop and maintain data tables and …
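As a rough sketch of the Unity Catalog migration work this role supports, moving a legacy dataset into a governed catalog in Databricks can be as simple as the PySpark snippet below. The catalog, schema and table names are invented for the example.

```python
# Rough Databricks-style sketch of migrating a legacy table into Unity Catalog and
# reconciling row counts; all object names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

legacy_df = spark.table("hive_metastore.legacy_reporting.monthly_sales")

(
    legacy_df
    .write
    .mode("overwrite")
    .saveAsTable("retail_catalog.reporting.monthly_sales")  # hypothetical UC table
)

# Quick reconciliation check between the source and the migrated copy.
source_count = legacy_df.count()
target_count = spark.table("retail_catalog.reporting.monthly_sales").count()
assert source_count == target_count, f"Row mismatch: {source_count} vs {target_count}"
```

In practice the migration would also carry over grants, comments and lineage metadata, which is where the governance side of the role comes in.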
Data Engineer £700/day outside IR35 3-month initial contract Remote working - occasional site visits Working with a leading financial services client who are looking for a Data Engineering Consultant. Looking for someone who can act independently and deliver complex data products - work with stakeholders to understand requirements, then engineer and deliver products. Data Engineering Consultant … key responsibilities: Work cross-functionally with non-technical stakeholders to understand data requirements Expert knowledge of SQL to query, analyse and model data Experience with Snowflake Using DBT for data transforms and modelling Data ingestion using Python Build foundations for business functions to be able to access the correct data and insights Experience working …
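To illustrate the "Python ingestion into Snowflake ahead of DBT transforms" pattern in this listing, a minimal loader using the snowflake-connector-python package might look like the sketch below. The account, credentials, warehouse and table are all hypothetical placeholders.

```python
# Illustrative Python load into Snowflake prior to DBT modelling; connection details
# and table names are assumptions, not client specifics.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Source data would normally come from an API or file drop; a small frame stands in.
df = pd.DataFrame({"TRADE_ID": [101, 102], "NOTIONAL": [1_000_000.0, 250_000.0]})

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="INGEST_SVC",
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="TRADES",
)

try:
    success, _, nrows, _ = write_pandas(conn, df, table_name="RAW_TRADES", auto_create_table=True)
    print(f"Loaded {nrows} rows, success={success}")
finally:
    conn.close()
```

The raw table landed this way would then be modelled and tested downstream with DBT, keeping transformation logic in SQL rather than in the ingestion script.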
Wokingham, Berkshire, South East, United Kingdom Hybrid / WFH Options
Stackstudio Digital Ltd
enable environmental impact assessments, economic modelling, and the integration of community preferences. The SME will be responsible for consolidating and migrating diverse spatial datasets into a unified system, ensuring data accuracy, quality, and accessibility, while supporting secure, role-based access, automation, and interoperability with internal and external systems. Your Responsibilities Platform Design & Implementation Architect and deploy a scalable ESRI … based GIS platform (ArcGIS Enterprise, ArcGIS Online, etc.). Define system architecture, data flows, and integration points with enterprise systems. Lead the migration and consolidation of spatial datasets from legacy systems. Spatial Analysis & Planning Develop spatial models for energy planning, environmental impact assessments, and economic forecasting. Integrate community feedback and preferences into spatial decision-making tools. Create interactive dashboards … and maps for stakeholder engagement. Data Management & Quality Assurance Establish data governance protocols for spatial data accuracy, completeness, and consistency. Implement ETL workflows using ArcGIS Data Interoperability tools or third-party solutions. Manage metadata standards and spatial data cataloging. Security & Access Control Configure role-based access controls and user permissions. Ensure compliance with data …
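Since the listing allows ETL via third-party tooling alongside ArcGIS Data Interoperability, here is a lightweight example of that kind of spatial ETL step using the open-source geopandas library. The file paths, CRS choice and quality flag are hypothetical.

```python
# Hedged example of a simple spatial ETL step with geopandas (a third-party
# alternative to ArcGIS tooling); paths and attribute names are assumptions.
import geopandas as gpd

# Extract: read a legacy shapefile of candidate energy sites.
sites = gpd.read_file("data/legacy/energy_sites.shp")

# Transform: reproject to British National Grid and keep only quality-checked rows.
sites = sites.to_crs(epsg=27700)
sites = sites[sites["qa_status"] == "approved"]

# Load: write to GeoPackage for consolidation into the unified platform.
sites.to_file("data/unified/energy_sites.gpkg", layer="sites", driver="GPKG")
```

In the ESRI-centric platform described, the consolidated layers would then be published to ArcGIS Enterprise or ArcGIS Online with role-based access applied there.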
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
as a group Assisting with the maintenance and iteration of a FastAPI-based web API hosted on AWS. Enhancing the existing product in line with the roadmap, focusing on data ingestion, UI updates (with a Full-stack Developer), and potential agentic features. Supporting the design and delivery of a new capability converting audience data and insights into conversational … design, testing strategies, and technical direction, including Generative AI solutions involving RAG and prompt engineering. Maintaining documentation and Agile project processes. Assisting with new feature development in sentiment analysis, data ingestion, and synthetic audience creation. As a member of the Disability Confident Scheme, Circle and our Client guarantee to interview all candidates who have a disability and who …
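For orientation, a FastAPI endpoint of the general shape described here (turning audience data into a conversational response) might start from something like the minimal sketch below. The route, models and stub logic are hypothetical; a real version would add retrieval over the ingested audience data and a generative model call.

```python
# Minimal FastAPI sketch of a conversational insights endpoint; names and logic are
# assumptions for illustration, not the client's actual API.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Audience Insights API")


class AudienceQuery(BaseModel):
    segment: str
    question: str


@app.post("/insights/ask")
def ask_insight(query: AudienceQuery) -> dict:
    """Placeholder for a retrieval-augmented answer built from ingested audience data."""
    # A real implementation would retrieve segment data and call a generative model.
    return {
        "segment": query.segment,
        "answer": f"Stub response about '{query.question}' for segment '{query.segment}'.",
    }
```

Run locally with `uvicorn module_name:app --reload` to exercise the endpoint before deploying behind the AWS-hosted service.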
Our client, a high-profile deep-tech organisation, urgently requires an experienced AI/Data Developer to undertake a contract assignment. In order to be successful, you will have the following experience: Extensive AI & Data Development background Experience with Python (including data libraries such as Pandas, NumPy, and PySpark) and Apache Spark (PySpark preferred) Strong experience with … data management and processing pipelines Algorithm development and knowledge of graphs will be beneficial SC Clearance is essential Within this role, you will be responsible for: Supporting the development and delivery of an AI solution to a Government customer Design, develop, and maintain data processing pipelines using Apache Spark Implement ETL/ELT workflows to extract, transform and load … large-scale datasets efficiently Develop and optimize Python-based applications for data ingestion Collaborate on development of machine learning models Ensure data quality, integrity, and performance across distributed environments Contribute to the design of data architectures, storage strategies, and processing frameworks Work with cloud data platforms (e.g., AWS, Azure, or GCP) to deploy scalable solutions …
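As a rough sketch of the Spark ETL work listed above, the PySpark snippet below reads raw JSON events, cleans and de-duplicates them, and writes partitioned Parquet for downstream modelling. The paths, columns and session settings are assumptions for the example.

```python
# Sketch of a PySpark ETL step: extract raw JSON, clean and de-duplicate, load
# partitioned Parquet. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: read raw JSON events from the landing zone.
raw = spark.read.json("s3a://landing-zone/events/")    # hypothetical path

# Transform: fix types, drop duplicates, derive a partition column.
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream model training.
clean.write.mode("overwrite").partitionBy("event_date").parquet("s3a://curated/events/")
```

The same skeleton ports across AWS, Azure or GCP by swapping the storage URIs and cluster configuration rather than the transformation logic.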
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Atrium Workforce Solutions Ltd
ELK SME Extension Professional experience in the design, maintenance and management of Elastic stacks (Elasticsearch, Logstash, Kibana) Experience of configuring and maintaining large Elastic clusters Experience working with large data sets and Elastic indexing best practices. Good understanding of visualisation components and techniques in Elasticsearch. Proven experience in performance management and tuning of Elasticsearch environments. Strong experience in writing … data ingestion pipelines using Logstash and other big data technologies. Please feel free to contact me, Daisy Nguyen at Gibbs Consulting/Atrium UK, for a confidential chat to find out more about the role. Please also note: due to the volume of applications received for positions, it will not be possible to respond to all applications …
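The ingestion work here centres on Logstash pipelines; purely as a Python-side illustration of the same idea, the sketch below bulk-indexes documents into Elasticsearch with the official client. The cluster endpoint, index name and documents are hypothetical.

```python
# Hedged sketch of bulk indexing into Elasticsearch with the official Python client,
# shown as an analogue of a Logstash ingestion pipeline; all names are assumptions.
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")   # hypothetical cluster endpoint

docs = [
    {"service": "checkout", "level": "ERROR", "message": "payment timeout"},
    {"service": "checkout", "level": "INFO", "message": "order accepted"},
]

actions = (
    {"_index": "app-logs-2024.05", "_source": doc}   # daily index naming convention
    for doc in docs
)

ok, errors = helpers.bulk(es, actions)
print(f"Indexed {ok} documents; errors: {errors}")
```

At the cluster sizes mentioned in these listings, the equivalent Logstash pipelines would also need index lifecycle management and shard sizing decisions, which is where the tuning experience comes in.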
site. Key Requirements Professional experience in the design, maintenance and management of Elastic stacks (Elasticsearch, Logstash, Kibana) Experience of configuring and maintaining large Elastic clusters Experience working with large data sets and Elastic indexing best practices. Good understanding of visualisation components and techniques in Elasticsearch. Proven experience in performance management and tuning of Elasticsearch environments. Strong experience in writing … data ingestion pipelines using Logstash and other big data technologies. Are you interested in this position? If so, then please respond with your CV and I will be in touch ASAP.
Work mode: Hybrid, 3 days working from client office Contract duration: Location: Birmingham, UK --------------------------------------- JOB DETAILS Role Title: Splunk SRE Engineer Responsible for leading and executing the migration of data, dashboards, alerts, and configurations from Splunk systems to Elasticsearch. This role involves deep technical expertise in Splunk architecture, data ingestion, and observability tools, along with strong project … communication skills - strong project management. Responsibilities: Minimum number of relevant years of experience: 5 Detailed Job Description: - Ability to deploy and configure Elasticsearch, Logstash, Kibana for centralized logging/data analytics; setting up ELK clusters with high availability/optimization - Proficiency in containerization using Docker and orchestration with Kubernetes, ensuring effective management and scaling of containerized applications in all …
I am working with a client in the education sector who are looking for a data engineer with experience across architecture and strategy to join on a part-time 12 month contract. 1-2 days per week Fully remote Outside IR35 Immediate start 12 month contract Essential: Been to school in the UK Data ingestion of APIs GCP based (Google Cloud Platform) Snowflake …