Role: Technical Data Architect. Location: UK (remote is fine; the candidate must be in the UK and available for any IMP meeting). Contract: 3 months (will likely extend). Data Feed Factory - for Telco Network Function. Expectations on the role: hands-on experience in the design and architecture of the data platform, technical components, and data engineering activities supporting the … key capabilities: Technical Data Architecture; Data Ingestion framework (batch/micro-batch processing); Data Contract & Data Management (including DQ, metadata & lineage); Delivery, Scaling & Op Model; Discovery/assessment along with the data platform requirements. Identify and document the relevant data sources to be used for collection & ingestion, cataloguing, and defining data management. … This includes assessing data quality, accessibility, data contracts, and lineage, including any other potential capability evaluation. Analyse existing data pipelines and data workflows to determine the integration points and any design considerations/frameworks and patterns for the data factory, as per the need for TSA and non-TSA batch business/technical requirements …
IT services company is hiring for a Contract Data Architect to work in Telford for 2 days per week. This is a 6-month contract with a likely extension, paying between £540 - £570 per day, Inside IR35. Due to the nature of the role, you will need to hold active SC Clearance. We are seeking an experienced Data Architect … to join a critical government project. This role supports the development of a robust data platform to combat fraud. You will be embedded in a newly formed Scrum team on the Minerva Platform, contributing to the ingestion of data within the SAS Platform, including IDP (Intelligent Data Platform). Responsibilities: - Design, develop, and maintain data architecture models … aligned with enterprise standards. - Support the ingestion and integration of data using the SAS platform. - Lead the upgrade, decommissioning, and archiving of data in accordance with data policies. - Collaborate with multidisciplinary teams to translate business problems into effective data solutions. - Maintain data dictionaries and metadata repositories to ensure information remains accurate and compliant. - Provide …
Job Title: Data Quality Analyst. Location: London (2 days per week on-site at Liverpool Street). Contract: 6 months (via umbrella). Rate: Competitive. Are you passionate about driving data accuracy and integrity in a fast-paced financial services environment? This is a fantastic opportunity to join a leading international bank at the forefront of data-led transformation. … You'll be part of a growing Data Office that is shaping strategy, governance, and innovation across EMEA - making a real impact from day one. The Role: As a Data Quality Analyst, you will play a pivotal role in ensuring the accuracy, completeness, and integrity of data across AML and Sanctions screening platforms. You will work closely … with Financial Crime, Data Governance, Technology, and Risk teams to design and embed effective data quality controls, strengthen governance, and support regulatory compliance. This role offers the chance to directly contribute to critical transformation programmes within financial crime compliance. Key Responsibilities: Design, build, and monitor Data Quality Rules within Collibra Data Quality (CDQ) for AML and …
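The completeness checks this role describes can be illustrated with a minimal sketch in plain Python. Collibra Data Quality defines rules inside the platform itself; the functions, field names, and sample records below are invented stand-ins that only mirror the logic of a completeness rule on screening records:

```python
# Illustrative data-quality rule, independent of any specific DQ tool.
# Field names and the 95% default threshold are assumptions for the sketch.

def completeness_rate(records, field):
    """Fraction of records where `field` is present and non-empty."""
    if not records:
        return 1.0
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return populated / len(records)

def completeness_rule(records, field, threshold=0.95):
    """Return (passed, rate) for a completeness rule at the given threshold."""
    rate = completeness_rate(records, field)
    return rate >= threshold, rate

records = [
    {"customer_id": "C1", "country": "GB"},
    {"customer_id": "C2", "country": ""},
    {"customer_id": "C3", "country": "FR"},
    {"customer_id": "C4", "country": "DE"},
]
passed, rate = completeness_rule(records, "country", threshold=0.9)
print(passed, rate)  # one of four records fails the check -> False 0.75
```

The same shape extends naturally to conformity or uniqueness rules by swapping the predicate inside the sum.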
Data Engineer. £500 - £560 per day. London - 1 day per week in office. We're working with a leading global healthcare technology company who are building out their next-generation data platform, with a strong emphasis on automation, testing, and cloud-native engineering, and are looking for an experienced Data Engineer to join their team. The Role … You'll be part of a modern data engineering function that's implementing best-in-class data practices across ingestion, transformation, and orchestration layers. The environment is highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include: Designing and developing … DBT models and Airflow pipelines within a modern data stack. Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs. Implementing automated testing and CI/CD pipelines for data workflows. Performing data extraction and enrichment, including web scraping and parsing of unstructured text (e.g., scanned forms and documents). …
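One of the ingestion concerns listed above — consolidating records from external partners, internal platforms, and APIs — often reduces to keeping the newest version of each entity across feeds. A minimal sketch, assuming hypothetical `id` and `updated_at` fields (nothing below is from the actual platform):

```python
# Last-write-wins merge across several source feeds.
# ISO-8601 date strings compare correctly as plain strings.

def merge_latest(*sources):
    """Merge record batches, keeping the newest record per id."""
    latest = {}
    for source in sources:
        for record in source:
            key = record["id"]
            if key not in latest or record["updated_at"] > latest[key]["updated_at"]:
                latest[key] = record
    return sorted(latest.values(), key=lambda r: r["id"])

partner_feed = [{"id": 1, "updated_at": "2024-01-02", "name": "Acme"}]
internal_feed = [
    {"id": 1, "updated_at": "2024-01-05", "name": "Acme Ltd"},
    {"id": 2, "updated_at": "2024-01-03", "name": "Globex"},
]
merged = merge_latest(partner_feed, internal_feed)
print(merged)  # id 1 resolves to the newer "Acme Ltd" record
```

In a DBT/Airflow stack the same idea would typically live in an incremental model rather than application code.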
Join us as a Data Platform Engineer for our client. At Peregrine, we're always seeking Specialist Talent with the ideal mix of skills, experience, and attitude to place with our vast array of clients. From Business Analysts in large government organisations to Software Developers in the private sector, we are always in search of the best talent … benefits, you will be deployed across our portfolio of clients as a specialist consultant, working on a wide array of complex projects across multiple industries. The Role: As a Data Platform Engineer in a highly regulated environment, you will be responsible for designing, building, and maintaining secure and scalable data infrastructure that supports both cloud and on-premises platforms. … You will play a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams. A strong command of Apache NiFi is essential for this role. You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data …
Test Engineer - Data Platforms (FTC, 6 months, Hybrid | London or Manchester). My client is looking for an experienced Test Engineer to take ownership of testing across modern data platforms. This is a hands-on role where you'll design, implement, and execute test strategies for data pipelines, models, and reporting solutions built on Azure technologies. Key Responsibilities: Define … and own the overall testing strategy for data pipelines, data models, and BI solutions. Develop and implement automated testing frameworks integrated with CI/CD pipelines. Design and execute test cases across data ingestion, transformation, storage, and presentation layers. Validate data accuracy, performance, and usability within Azure Databricks, Azure Data Factory, Azure Data Lake Storage, and Power BI. Establish and monitor data quality checks, reconciliation, and compliance processes. Collaborate closely with data engineers, BI developers, architects, and product owners to embed testing best practices across delivery. Essential Skills & Experience: Proven experience as a Test Engineer within data engineering or analytics environments. Strong knowledge of Azure data services – Databricks …
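A reconciliation check of the kind this testing strategy would automate can be sketched in plain Python: compare row counts and an order-insensitive content checksum between a source extract and its loaded target. In the stack described it would run against Databricks or Data Lake outputs; the sample rows below are invented:

```python
# Order-insensitive dataset reconciliation via per-row SHA-256 digests.
import hashlib
import json

def dataset_fingerprint(rows):
    """Row count plus an order-insensitive checksum for a list of dict rows."""
    digests = sorted(
        hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
        for r in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

def reconcile(source_rows, target_rows):
    """True when target matches source in both count and content."""
    return dataset_fingerprint(source_rows) == dataset_fingerprint(target_rows)

source = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
target = [{"id": 2, "amount": 20}, {"id": 1, "amount": 10}]  # order differs
print(reconcile(source, target))  # True: same content, different order
```

Wired into a CI/CD pipeline, a failed reconciliation becomes a blocking test rather than a manual spot check.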
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Robert Half Technology are assisting a market-leading financial services organisation to recruit a Data Engineer on a contract basis - hybrid working, London based. We are looking for a highly skilled Data Engineer to join our growing data platform team. You will be responsible for designing, building, and optimising large-scale data pipelines and services that power analytics … reporting, and data-driven decision-making across the business. This role is heavily cloud-focused (AWS) and requires strong expertise in Python, Spark, and relational databases. Role: Design, build, and maintain scalable, reliable, and high-performance data pipelines and workflows. Develop clean, maintainable, and testable code in Python for data ingestion, transformation, and processing. Optimise and … fine-tune query performance on Aurora Postgres and other relational databases. Architect and manage data solutions on AWS using serverless technologies such as Lambda, Glue, Glue Data Catalog, EMR Serverless, and API Gateway. Implement and manage large-scale data processing with Spark (Iceberg tables in S3, Gold layer in Aurora Postgres). Collaborate with data scientists …
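The serverless transformation work described above can be sketched as a Lambda-style handler. The `handler(event, context)` signature is the standard AWS Lambda convention for Python functions, but the event shape and field names below are purely illustrative assumptions:

```python
# Minimal Lambda-style record cleaner: drop invalid rows, normalise fields.
import json

def handler(event, context=None):
    """Transform raw ingested records from a hypothetical event payload."""
    records = event.get("records", [])
    cleaned = []
    for r in records:
        if r.get("id") is None:
            continue  # skip rows that fail basic validation
        cleaned.append({"id": r["id"], "symbol": str(r.get("symbol", "")).upper()})
    return {"statusCode": 200, "body": json.dumps(cleaned)}

result = handler({"records": [{"id": 1, "symbol": "abc"}, {"symbol": "bad"}]})
print(result["body"])  # only the valid record survives, symbol upper-cased
```

Keeping the transformation pure like this makes the function unit-testable locally before it is ever deployed behind API Gateway.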
Seeking to recruit a DataOps Engineer on an initial 6-month contract based in London. It is hybrid and will require 2-3 days onsite per week. A DataOps Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardising and templatising biomedical and scientific data engineering, with demonstrable experience across … the following areas: They are a full-stack shop consisting of product and portfolio leadership, data engineering, infrastructure and DevOps, data/metadata/knowledge platforms, and AI/ML and analysis platforms, all geared toward: - Building a next-generation, metadata- and automation-driven data experience for scientists, engineers, and decision-makers, increasing productivity and reducing time … spent on "data mechanics" - Providing best-in-class AI/ML and data analysis environments to accelerate our predictive capabilities and attract top-tier talent - Aggressively engineering our data at scale, as one unified asset, to unlock the value of our unique collection of data and predictions in real time - Automation of end-to-end data …
London, South East, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
Palantir Foundry Data Engineer - DV Cleared. NEW CONTRACT OPPORTUNITY FOR A PALANTIR FOUNDRY DATA ENGINEER TO WORK ON A NATIONAL SECURITY PROJECT IN LONDON WITH DV CLEARANCE - Contract role in London for a Palantir Foundry Data Engineer - Must hold DV Security Clearance - Central London based - Daily rate up to £800 - Hybrid position - To apply, email: or call … Who we are... We are seeking an experienced Palantir Foundry Data Engineer with current DV clearance to join a high-profile programme. This is a contract position offering hybrid working and a daily rate of up to £800. In this role, you will be responsible for designing, developing, and optimising data pipelines and integrations within Palantir Foundry … ensuring data is efficiently processed, transformed, and made available for analysis and operational use. You will collaborate closely with analysts, data scientists, and business stakeholders to deliver robust, secure, and scalable data solutions. What we're looking for... Key Responsibilities: Develop and maintain data pipelines and workflows in Palantir Foundry. Integrate diverse data sources, ensuring …
Project Manager (Data Lakes projects). Our client is building a Data Lake in Azure and needs an experienced Project Manager with a solid, proven background in managing and running a Databricks and data ingestion frameworks project. Must be experienced in dealing with difficult customers and able to hit the ground running. If you are an experienced PM in running … large data projects with Azure, please get in touch for full details. The role is fully remote - YOU MUST LIVE IN THE UK, as there may be occasional site visits. Contract until May 2026, and it could go on longer. I will only reply to candidates who are Project Managers with Databricks and data ingestion frameworks project experience. …
MI & Data Integration Analyst (Azure/Databricks). Hybrid - Central London | £43,000 pro rata | 3-month FTC (potential to extend). We're looking for a technically strong and commercially aware MI & Data Integration Analyst to join a leading UK retail and technology business on an initial 3-month fixed-term contract. This is a great opportunity for … someone who enjoys combining hands-on data analysis, reporting, and Azure integration work - helping build robust data foundations and self-serve reporting capability for an evolving business area. The Opportunity: You'll be supporting a small, forward-thinking team to improve how they use data - helping them access accurate and timely MI, streamline processes, and migrate to … a new Azure-based data environment. You'll take ownership of reporting improvements, work closely with third-party partners on offshoring data processes, and support the move of key datasets into the Unity Catalog environment. Key Responsibilities: Produce and enhance accurate, timely, and insightful MI reports to support business decision-making. Develop and maintain data tables and …
Join our MarTech team at Aviva. In this role, you'll take full ownership of building, managing, and optimising audiences within Adobe Experience Platform (AEP) using the real-time Customer Data Platform (rtCDP). Your work will enable personalised, data-driven experiences across paid media, website, app, and email channels, ensuring the right message reaches the right customer at the … right time. This is a high-impact role suited to someone passionate about data-driven marketing and audience strategy. If you thrive in a fast-paced environment and enjoy blending technical capability with strategic thinking, we'd love to hear from you. A bit about the job: This is a hands-on role where technical expertise meets marketing activation. You'll … digital channels. Collaboration will be key, as you'll work closely with product and platform owners, go-to-market managers, and channel specialists to maximise the value of Aviva's customer data and Adobe MarTech ecosystem. You'll be instrumental in managing customer profiles, identities, and the identity graph to maintain unified customer views. This includes overseeing data ingestion pipelines, Edge …
month contract (3 days per week/24 hours per week). Company Introduction: We are currently recruiting an experienced Microsoft Power Platform Application Designer/Developer for a global Data Analytics client in London. This is a 3-day-per-week role - 24 working hours per week. Required Skills/Experience: Proven expertise with Power Automate, Power Apps, Power … BI, and Copilot Studio. Experience creating and maintaining Fabric data agents and integrating them into Power Platform and Copilot solutions. Power Apps: Experience developing complex Canvas Apps with multiple data connections. Proficiency in advanced expressions and formulas. Understanding of Model-Driven Apps and Dataverse schema design. Power Automate: Experience building large, multi-branch flows with complex logic and … varied data connectors. Skilled in error handling, performance optimisation, and flow governance. Strong SQL skills for data modelling, querying, and optimisation within Fabric and Power BI. Knowledge of Python for data manipulation, API automation, and AI model integration. Job Responsibilities/Objectives: Develop solutions using Power Automate, Power Apps, Power BI, and Copilot Studio. Create and manage …
Data Engineer. £700/day, outside IR35. 3-month initial contract. Remote working - occasional site visits. Working with a leading financial services client who are looking for a Data Engineering Consultant. Looking for someone who can act independently and deliver complex data products - work with stakeholders to understand requirements, then engineer and deliver products. Data Engineering Consultant … key responsibilities: Work cross-functionally with non-technical stakeholders to understand data requirements. Expert knowledge of SQL to query, analyse, and model data. Experience with Snowflake. Using DBT for data transforms and modelling. Data ingestion using Python. Build foundations for business functions to be able to access the correct data and insights. Experience working …
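The batched-loading side of "data ingestion using Python" can be sketched with sqlite3 as a local stand-in for the target warehouse; a real pipeline would apply the same pattern through the Snowflake connector. The table and column names below are hypothetical:

```python
# Batched insert loader against an in-memory SQLite database.
import sqlite3

def load_batches(conn, rows, batch_size=2):
    """Insert (id, price) rows in fixed-size batches; return rows loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS trades (id INTEGER, price REAL)")
    total = 0
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        conn.executemany("INSERT INTO trades VALUES (?, ?)", batch)
        total += len(batch)
    conn.commit()
    return total

conn = sqlite3.connect(":memory:")
loaded = load_batches(conn, [(1, 9.5), (2, 10.1), (3, 8.7)])
print(loaded)  # 3
```

Batching keeps memory bounded and lets the loader commit or retry at predictable checkpoints.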
Wokingham, Berkshire, South East, United Kingdom Hybrid / WFH Options
Stackstudio Digital Ltd
enable environmental impact assessments, economic modelling, and the integration of community preferences. The SME will be responsible for consolidating and migrating diverse spatial datasets into a unified system, ensuring data accuracy, quality, and accessibility, while supporting secure, role-based access, automation, and interoperability with internal and external systems. Your Responsibilities: Platform Design & Implementation: Architect and deploy a scalable ESRI … based GIS platform (ArcGIS Enterprise, ArcGIS Online, etc.). Define system architecture, data flows, and integration points with enterprise systems. Lead the migration and consolidation of spatial datasets from legacy systems. Spatial Analysis & Planning: Develop spatial models for energy planning, environmental impact assessments, and economic forecasting. Integrate community feedback and preferences into spatial decision-making tools. Create interactive dashboards … and maps for stakeholder engagement. Data Management & Quality Assurance: Establish data governance protocols for spatial data accuracy, completeness, and consistency. Implement ETL workflows using ArcGIS Data Interoperability tools or third-party solutions. Manage metadata standards and spatial data cataloguing. Security & Access Control: Configure role-based access controls and user permissions. Ensure compliance with data …
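One spatial data-quality rule of the kind those governance protocols might enforce is a bounding-box check on record coordinates. The sketch below is a stdlib illustration only: the UK bounding box is approximate, the record fields are invented, and real checks would run against the ESRI platform's own datasets:

```python
# Flag records whose coordinates fall outside an expected region.
# The bounding box is a rough UK extent, used purely for demonstration.

UK_BBOX = {"min_lon": -8.65, "max_lon": 1.77, "min_lat": 49.86, "max_lat": 60.86}

def in_bbox(lon, lat, bbox=UK_BBOX):
    """True when (lon, lat) lies inside the bounding box."""
    return (bbox["min_lon"] <= lon <= bbox["max_lon"]
            and bbox["min_lat"] <= lat <= bbox["max_lat"])

def flag_outliers(records):
    """Return ids of records whose coordinates fall outside the bbox."""
    return [r["id"] for r in records if not in_bbox(r["lon"], r["lat"])]

sites = [
    {"id": "S1", "lon": -0.12, "lat": 51.5},   # London: inside
    {"id": "S2", "lon": 2.35, "lat": 48.86},   # Paris: outside
]
print(flag_outliers(sites))  # ['S2']
```

A bounding box is deliberately coarse; production checks would typically test against real boundary polygons instead.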
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
as a group. Assisting with the maintenance and iteration of a FastAPI-based web API hosted on AWS. Enhancing the existing product in line with the roadmap, focusing on data ingestion, UI updates (with a Fullstack Developer), and potential agentic features. Supporting the design and delivery of a new capability converting audience data and insights into conversational … design, testing strategies, and technical direction, including Generative AI solutions involving RAG and prompt engineering. Maintaining documentation and Agile project processes. Assisting with new feature development in sentiment analysis, data ingestion, and synthetic audience creation. As a member of the Disability Confident Scheme, Circle and our Client guarantee to interview all candidates who have a disability and who …
AI/Data Developer - Contract - SC Cleared - £500 - £600 p/d (outside IR35) - Hybrid working. Our client, a leading deep-tech organisation, is seeking an experienced AI/Data Developer for an urgent contract assignment. Key Requirements: Proven background in AI and data development. Strong proficiency in Python, including data-focused libraries such as Pandas, NumPy, and … PySpark. Hands-on experience with Apache Spark (PySpark preferred). Solid understanding of data management and processing pipelines. Experience in algorithm development and graph data structures is advantageous. Active SC Clearance is mandatory. Role Overview: You will play a key role in developing and delivering advanced AI solutions for a Government client. Responsibilities include: Designing, building, and maintaining … data processing pipelines using Apache Spark. Implementing ETL/ELT workflows for large-scale data sets. Developing and optimising Python-based data ingestion tools. Collaborating on the design and deployment of machine learning models. Ensuring data quality, integrity, and performance across distributed systems. Contributing to data architecture and storage strategy design. Working with cloud …
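The ETL/ELT responsibilities above follow a common extract-transform-load shape, which can be sketched in pure Python with generator stages. In the role itself this would be built with PySpark, so treat the stages and the invented record format below as illustrative only:

```python
# Generator-based ETL skeleton: each stage streams records to the next.

def extract(raw_lines):
    """Parse raw comma-separated lines into dicts (extract stage)."""
    for line in raw_lines:
        ident, value = line.strip().split(",")
        yield {"id": int(ident), "value": float(value)}

def transform(records, scale=2.0):
    """Apply a simple enrichment to each record (transform stage)."""
    for r in records:
        yield {**r, "scaled": r["value"] * scale}

def load(records):
    """Materialise the stream into the target store (load stage)."""
    return list(records)

raw = ["1,10.0", "2,4.5"]
out = load(transform(extract(raw)))
print(out)  # [{'id': 1, 'value': 10.0, 'scaled': 20.0}, {'id': 2, 'value': 4.5, 'scaled': 9.0}]
```

Generators keep each stage lazy, which is the same streaming discipline Spark enforces at cluster scale.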
Our client, a high-profile deep-tech organisation, urgently requires an experienced AI/Data Developer to undertake a contract assignment. In order to be successful, you will have the following experience: Extensive AI and data development background. Experience with Python (including data libraries such as Pandas, NumPy, and PySpark) and Apache Spark (PySpark preferred). Strong experience with … data management and processing pipelines. Algorithm development and knowledge of graphs will be beneficial. SC Clearance is essential. Within this role, you will be responsible for: Supporting the development and delivery of an AI solution to a Government customer. Design, develop, and maintain data processing pipelines using Apache Spark. Implement ETL/ELT workflows to extract, transform, and load … large-scale datasets efficiently. Develop and optimise Python-based applications for data ingestion. Collaborate on development of machine learning models. Ensure data quality, integrity, and performance across distributed environments. Contribute to the design of data architectures, storage strategies, and processing frameworks. Work with cloud data platforms (e.g., AWS, Azure, or GCP) to deploy scalable solutions …
between rapid AI development and the stringent security and compliance requirements of the financial industry. You will design and implement robust, scalable, and secure MLOps platforms that enable our data scientists to innovate safely and at speed, ensuring the integrity, confidentiality, and availability of our models and data. Key Responsibilities: Secure MLOps Platform Engineering: Design, implement, and manage secure … automated CI/CD pipelines specifically for machine learning models (MLOps), integrating security checks (SAST, DAST, SCA) and data validation gates. AI/ML Infrastructure Security: Harden and secure the underlying cloud infrastructure for AI/ML workloads, including GPU clusters, container orchestration (Kubernetes), and managed services (e.g., AWS SageMaker, Azure ML). Security by Design: Embed security controls … into every stage of the ML lifecycle (data ingestion, feature store, model training, deployment, monitoring). Implement secrets management, network security (firewalls, VPCs), and identity and access management (IAM) for data and model assets. Compliance & Governance: Ensure the MLOps platform adheres to stringent financial industry regulations (e.g., GDPR, SOX, PCI-DSS, SWIFT CSCF) and internal policies (Model …
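One concrete "security by design" control from the list above — gating deployment on model-artifact integrity — can be sketched with stdlib hashing. The model name, registry dict, and byte payload below are hypothetical:

```python
# Verify a model artifact against a digest pinned at training time,
# so a tampered artifact can never pass the deployment gate.
import hashlib

def sha256_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_artifact(artifact_bytes, expected_digest):
    """Gate deployment on artifact integrity; True only on an exact match."""
    return sha256_digest(artifact_bytes) == expected_digest

model_bytes = b"serialized-model-weights"
registry = {"fraud-model-v3": sha256_digest(model_bytes)}  # pinned at training time

print(verify_artifact(model_bytes, registry["fraud-model-v3"]))  # True
```

In practice the pinned digest would live in a model registry and the check would run as a blocking step inside the CI/CD pipeline, alongside SAST/SCA scans.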
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Atrium Workforce Solutions Ltd
ELK SME Extension. Professional experience in the design, maintenance, and management of Elastic stacks (Elasticsearch, Logstash, Kibana). Experience of configuring and maintaining large Elastic clusters. Experience working with large data sets and Elastic indexing best practices. Good understanding of visualisation components and techniques in Elasticsearch. Proven experience in performance management and tuning of Elasticsearch environments. Strong experience in writing … data ingestion pipelines using Logstash and other big data technologies. Please feel free to contact me – Daisy Nguyen at Gibbs Consulting/Atrium UK – for a confidential chat about the role. Please also note: due to the volume of applications received for positions, it will not be possible to respond to all applications …
site. Key Requirements: Professional experience in the design, maintenance, and management of Elastic stacks (Elasticsearch, Logstash, Kibana). Experience of configuring and maintaining large Elastic clusters. Experience working with large data sets and Elastic indexing best practices. Good understanding of visualisation components and techniques in Elasticsearch. Proven experience in performance management and tuning of Elasticsearch environments. Strong experience in writing … data ingestion pipelines using Logstash and other big data technologies. Are you interested in this position? If so, then please respond with your CV and I will be in touch ASAP.
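Logstash-style ingestion pipelines are sometimes complemented by indexing documents directly from Python. As an illustration, the sketch below builds action dicts in the format consumed by the elasticsearch-py `helpers.bulk` helper, without opening any cluster connection; the index name and document fields are invented:

```python
# Build bulk-index actions in the {"_index", "_id", "_source"} shape used by
# elasticsearch-py's helpers.bulk. No Elasticsearch cluster is contacted here.

def to_bulk_actions(docs, index_name):
    """Convert raw documents into bulk-index action dicts."""
    return [
        {
            "_index": index_name,
            "_id": doc["id"],
            "_source": {k: v for k, v in doc.items() if k != "id"},
        }
        for doc in docs
    ]

docs = [{"id": "evt-1", "level": "ERROR", "msg": "timeout"}]
actions = to_bulk_actions(docs, "app-logs-2024")
print(actions[0]["_index"], actions[0]["_id"])  # app-logs-2024 evt-1
# In a real pipeline these actions would feed helpers.bulk(Elasticsearch(...), actions).
```

Separating action construction from the network call keeps the mapping logic unit-testable, much like keeping Logstash filter logic distinct from its outputs.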