Role: Technical Data Architect
Location: UK (remote is fine; the candidate must be UK-based and available for any IMP meeting)
Contract: 3 months (will likely extend)
Data Feed Factory - for Telco Network Function

Expectations of the role: Hands-on experience in the design and architecture of the data platform, its technical components, and related data engineering activities supporting the … key capabilities:
- Technical data architecture
- Data ingestion framework (batch/micro-batch processing)
- Data contracts and data management (including data quality, metadata, and lineage) - illustrated after this listing
- Delivery, scaling, and operating model

Discovery/assessment alongside the data platform requirements: identify and document the relevant data sources to be used for collection and ingestion, cataloguing, and defining data management. … This includes assessing data quality, accessibility, data contracts, and lineage, as well as evaluating any other potential capabilities. Analyse existing data pipelines and data workflows to determine integration points and any design considerations, frameworks, and patterns for the data factory, as required by the TSA and non-TSA batch business/technical requirements for …
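The advert names data contracts and data quality among the key capabilities without detail. Purely as an illustration of what a contract check inside a batch ingestion framework can look like, here is a minimal sketch in plain Python; the field names and rules are invented, not taken from the role.

```python
# Hypothetical data-contract check for a batch ingestion framework.
# The schema and rules are invented for illustration only.
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class FieldRule:
    name: str
    dtype: type
    required: bool = True


# A tiny "contract" for one feed: expected fields, types, and requiredness.
CDR_CONTRACT = [
    FieldRule("record_id", str),
    FieldRule("cell_id", str),
    FieldRule("event_time", datetime),
    FieldRule("bytes_up", int, required=False),
]


def validate_batch(records: list[dict]) -> list[str]:
    """Return a list of data-quality violations for a batch of records."""
    violations = []
    for i, rec in enumerate(records):
        for rule in CDR_CONTRACT:
            value = rec.get(rule.name)
            if value is None:
                if rule.required:
                    violations.append(f"row {i}: missing required field '{rule.name}'")
            elif not isinstance(value, rule.dtype):
                violations.append(
                    f"row {i}: field '{rule.name}' expected {rule.dtype.__name__}, "
                    f"got {type(value).__name__}"
                )
    return violations


# Example usage: one conforming row and one violating row.
batch = [
    {"record_id": "r1", "cell_id": "c9", "event_time": datetime.now(), "bytes_up": 42},
    {"record_id": "r2", "event_time": "2024-01-01"},
]
print(validate_batch(batch))
```

In a real framework the violations would typically be routed to a quarantine area and surfaced through the metadata/lineage tooling rather than printed.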
Job Title: Data Quality Analyst
Location: London (2 days per week on-site at Liverpool Street)
Contract: 6 months (via umbrella)
Rate: Competitive

Are you passionate about driving data accuracy and integrity in a fast-paced financial services environment? This is a fantastic opportunity to join a leading international bank at the forefront of data-led transformation. … You'll be part of a growing Data Office that is shaping strategy, governance, and innovation across EMEA - making a real impact from day one.

The Role: As a Data Quality Analyst, you will play a pivotal role in ensuring the accuracy, completeness, and integrity of data across AML and Sanctions screening platforms. You will work closely … with Financial Crime, Data Governance, Technology, and Risk teams to design and embed effective data quality controls, strengthen governance, and support regulatory compliance. This role offers the chance to directly contribute to critical transformation programmes within financial crime compliance.

Key Responsibilities: Design, build, and monitor Data Quality Rules within Collibra Data Quality (CDQ) for AML and …
Data Engineer
£500 - £560 per day
London - 1 day per week in office

We're working with a leading global healthcare technology company who are building out their next-generation data platform, with a strong emphasis on automation, testing, and cloud-native engineering, and are looking for an experienced Data Engineer to join their team.

The Role … You'll be part of a modern data engineering function that's implementing best-in-class data practices across ingestion, transformation, and orchestration layers. The environment is highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools.

Day-to-day responsibilities include:
- Designing and developing … DBT models and Airflow pipelines within a modern data stack (see the sketch after this listing).
- Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs.
- Implementing automated testing and CI/CD pipelines for data workflows.
- Performing data extraction and enrichment, including web scraping and parsing of unstructured text (e.g., scanned forms and documents).
…
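The advert only names the dbt-plus-Airflow stack; as orientation only, an orchestration layer of that kind often takes a shape like the sketch below (Airflow 2.x TaskFlow API; the partner endpoint, file paths, dbt project directory, and schedule are all assumptions, not details from the role).

```python
# Minimal sketch, not from the advert: an Airflow DAG that ingests a
# (hypothetical) partner API feed and then runs dbt models.
from datetime import datetime

import requests
from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def ingest_and_transform():
    @task
    def extract_partner_feed() -> str:
        # Pull records from a placeholder partner API and land the raw JSON.
        resp = requests.get("https://api.example-partner.com/v1/records", timeout=30)
        resp.raise_for_status()
        path = "/tmp/partner_feed.json"
        with open(path, "w") as fh:
            fh.write(resp.text)
        return path

    raw_path = extract_partner_feed()

    # Run the dbt project once the raw data has landed; project and
    # profiles directories are placeholders.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    raw_path >> run_dbt


ingest_and_transform()
```

In practice the automated testing mentioned above would sit alongside this as dbt tests plus unit tests on the extraction code, wired into CI/CD.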
London, South East, England, United Kingdom - Hybrid / WFH Options
Robert Half
Robert Half Technology are assisting a market-leading financial services organisation to recruit a Data Engineer on a contract basis - hybrid working - London based.

We are looking for a highly skilled Data Engineer to join our growing data platform team. You will be responsible for designing, building, and optimising large-scale data pipelines and services that power analytics … reporting, and data-driven decision-making across the business. This role is heavily cloud-focused (AWS) and requires strong expertise in Python, Spark, and relational databases.

Role:
- Design, build, and maintain scalable, reliable, and high-performance data pipelines and workflows.
- Develop clean, maintainable, and testable code in Python for data ingestion, transformation, and processing.
- Optimise and … fine-tune query performance on Aurora Postgres and other relational databases.
- Architect and manage data solutions on AWS using serverless technologies such as Lambda, Glue, Glue Data Catalog, EMR Serverless, and API Gateway.
- Implement and manage large-scale data processing with Spark (Iceberg tables in S3, Gold layer in Aurora Postgres) - see the sketch after this listing.
- Collaborate with data scientists …
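The Spark-with-Iceberg pattern is only named in the advert, not described. As a rough sketch under assumed names (catalog, bucket, table, and columns are invented, and the Iceberg Spark runtime and AWS bundle jars are presumed to be on the classpath), a job writing to a Glue-backed Iceberg table often looks like this:

```python
# Illustrative sketch, not the client's implementation: read raw JSON from S3
# and append it to an Iceberg table registered in the AWS Glue catalog.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("silver_to_iceberg")
    # Register an Iceberg catalog backed by AWS Glue (names are assumptions).
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-data-lake/warehouse/")
    .config("spark.sql.catalog.lake.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    .getOrCreate()
)

# Read the raw landing zone and apply a light transformation.
raw = spark.read.json("s3://example-data-lake/raw/trades/")
cleaned = raw.dropDuplicates(["trade_id"]).withColumnRenamed("ts", "trade_ts")

# Append into an Iceberg table (assumed to exist; use createOrReplace() to
# build it first). A downstream job would publish the Gold layer, e.g.
# aggregates pushed into Aurora Postgres over JDBC.
cleaned.writeTo("lake.silver.trades").append()

spark.stop()
```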
… seeking to recruit a DataOps Engineer on an initial 6-month contract based in London. The role is hybrid and will require 2-3 days onsite per week.

A DataOps Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizing biomedical and scientific data engineering, with demonstrable experience across … the following areas: They are a full-stack shop consisting of product and portfolio leadership, data engineering, infrastructure and DevOps, data/metadata/knowledge platforms, and AI/ML and analysis platforms, all geared toward:
- Building a next-generation, metadata- and automation-driven data experience for scientists, engineers, and decision-makers, increasing productivity and reducing time … spent on "data mechanics"
- Providing best-in-class AI/ML and data analysis environments to accelerate our predictive capabilities and attract top-tier talent
- Aggressively engineering our data at scale, as one unified asset, to unlock the value of our unique collection of data and predictions in real-time
- Automation of end-to-end data …
London, South East, England, United Kingdom - Hybrid / WFH Options
Searchability NS&D
Palantir Foundry Data Engineer - DV Cleared

NEW CONTRACT OPPORTUNITY FOR A PALANTIR FOUNDRY DATA ENGINEER TO WORK ON A NATIONAL SECURITY PROJECT IN LONDON WITH DV CLEARANCE
- Contract role in London for a Palantir Foundry Data Engineer
- Must hold DV Security Clearance
- Central London based
- Daily rate up to £800
- Hybrid position
- To apply, email: or call …

Who we are... We are seeking an experienced Palantir Foundry Data Engineer with current DV clearance to join a high-profile programme. This is a contract position offering hybrid working and a daily rate of up to £800. In this role, you will be responsible for designing, developing, and optimising data pipelines and integrations within Palantir Foundry … ensuring data is efficiently processed, transformed, and made available for analysis and operational use. You will collaborate closely with analysts, data scientists, and business stakeholders to deliver robust, secure, and scalable data solutions.

What we're looking for... Key Responsibilities:
- Develop and maintain data pipelines and workflows in Palantir Foundry (see the sketch after this listing).
- Integrate diverse data sources, ensuring …
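The advert does not describe the pipelines themselves; as a rough illustration only, a Foundry Python transform typically follows the pattern below. The dataset paths and column names are invented for the example and are not from the programme.

```python
# Illustrative sketch of a Palantir Foundry Python transform; dataset paths
# and columns are hypothetical, not taken from the advert.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output


@transform_df(
    Output("/Project/datasets/clean/events"),
    raw_events=Input("/Project/datasets/raw/events"),
)
def clean_events(raw_events):
    # Deduplicate, parse timestamps, and normalise a source column so that
    # downstream analysts get a consistent, query-ready dataset.
    return (
        raw_events
        .dropDuplicates(["event_id"])
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("source_system", F.lower(F.col("source_system")))
    )
```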
Project Manager (Data Lakes projects)

Our client is building a Data Lake in Azure and needs an experienced Project Manager with a solid, proven background in managing and running a Databricks and data ingestion frameworks project. You must be experienced in dealing with difficult customers and able to hit the ground running. If you are an experienced PM in running … large data projects with Azure, please get in touch for full details.

The role is fully remote - YOU MUST LIVE IN THE UK, as there may be occasional site visits. The contract runs until May 2026 and could go on longer. I will only reply to candidates who are Project Managers with Databricks and data ingestion frameworks project experience.
MI & Data Integration Analyst (Azure/Databricks)
Hybrid - Central London | £43,000 pro rata | 3-month FTC (potential to extend)

We're looking for a technically strong and commercially aware MI & Data Integration Analyst to join a leading UK retail and technology business on an initial 3-month fixed-term contract. This is a great opportunity for … someone who enjoys combining hands-on data analysis, reporting, and Azure integration work - helping build robust data foundations and self-serve reporting capability for an evolving business area.

The Opportunity: You'll be supporting a small, forward-thinking team to improve how they use data - helping them access accurate and timely MI, streamline processes, and migrate to … a new Azure-based data environment. You'll take ownership of reporting improvements, work closely with third-party partners on offshoring data processes, and support the move of key datasets into the Unity Catalog environment (see the sketch after this listing).

Key Responsibilities:
- Produce and enhance accurate, timely, and insightful MI reports to support business decision-making.
- Develop and maintain data tables and …
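The move into Unity Catalog is only mentioned, not specified. As a rough sketch under assumed names (catalog, schema, table, storage path, and columns are placeholders), registering a dataset as a governed Unity Catalog table from a Databricks job might look like this:

```python
# Rough sketch, not the client's process: load a landed CSV dataset, clean it
# lightly, and register it as a Unity Catalog table (catalog.schema.table).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # existing session on Databricks

df = (
    spark.read.format("csv")
    .option("header", True)
    .load("abfss://landing@examplestorage.dfs.core.windows.net/sales/monthly/")
)

# Light cleanup so downstream MI reports get typed, de-duplicated data.
cleaned = (
    df.dropDuplicates(["order_id"])
      .withColumn("order_date", F.to_date("order_date"))
)

# Write to a three-level Unity Catalog name; catalog and schema are assumed.
cleaned.write.mode("overwrite").saveAsTable("reporting.sales.monthly_orders")
```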
… join our MarTech team at Aviva. In this role, you'll take full ownership of building, managing, and optimising audiences within Adobe Experience Platform (AEP) using the Real-Time Customer Data Platform (rtCDP). Your work will enable personalised, data-driven experiences across paid media, website, app, and email channels, ensuring the right message reaches the right customer at the … right time. This is a high-impact role suited to someone passionate about data-driven marketing and audience strategy. If you thrive in a fast-paced environment and enjoy blending technical capability with strategic thinking, we'd love to hear from you.

A bit about the job: This is a hands-on role where technical expertise meets marketing activation. You'll … digital channels. Collaboration will be key, as you'll work closely with product and platform owners, go-to-market managers, and channel specialists to maximise the value of Aviva's customer data and Adobe MarTech ecosystem. You'll be instrumental in managing customer profiles, identities, and the identity graph to maintain unified customer views. This includes overseeing data ingestion pipelines, Edge …
…month contract (3 days per week/24 hours per week)

Company Introduction: We are currently recruiting an experienced Microsoft Power Platform Application Designer/Developer for a global Data Analytics client in London. This is a 3-day-per-week role - 24 working hours per week.

Required Skills/Experience:
- Proven expertise with Power Automate, Power Apps, Power … BI, and Copilot Studio.
- Experience creating and maintaining Fabric data agents and integrating them into Power Platform and Copilot solutions.
- Power Apps: Experience developing complex Canvas Apps with multiple data connections. Proficiency in advanced expressions and formulas. Understanding of Model-Driven Apps and Dataverse schema design.
- Power Automate: Experience building large, multi-branch flows with complex logic and … varied data connectors. Skilled in error handling, performance optimisation, and flow governance.
- Strong SQL skills for data modelling, querying, and optimisation within Fabric and Power BI.
- Knowledge of Python for data manipulation, API automation, and AI model integration.

Job Responsibilities/Objectives: Develop solutions using Power Automate, Power Apps, Power BI, and Copilot Studio. Create and manage …
London, South East, England, United Kingdom - Hybrid / WFH Options
Circle Recruitment
… as a group.
- Assisting with the maintenance and iteration of a FastAPI-based web API hosted on AWS (see the sketch after this listing).
- Enhancing the existing product in line with the roadmap, focusing on data ingestion, UI updates (with a Fullstack Developer), and potential agentic features.
- Supporting the design and delivery of a new capability converting audience data and insights into conversational … design, testing strategies, and technical direction, including Generative AI solutions involving RAG and prompt engineering.
- Maintaining documentation and Agile project processes.
- Assisting with new feature development in sentiment analysis, data ingestion, and synthetic audience creation.

As a member of the Disability Confident Scheme, Circle and our Client guarantee to interview all candidates who have a disability and who …
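The advert only names the stack; for orientation, a FastAPI service of the kind described usually takes a shape like the sketch below. The routes, model, and fields are invented, and the in-memory store stands in for whatever persistence the real product uses.

```python
# Minimal, hypothetical sketch of a FastAPI endpoint for an audience-insights
# style API; routes and fields are invented, not taken from the advert.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="audience-insights-api")

# In-memory stand-in for the real persistence layer.
AUDIENCES: dict[str, dict] = {}


class Audience(BaseModel):
    audience_id: str
    name: str
    sentiment_score: float | None = None


@app.post("/audiences", response_model=Audience)
def create_audience(audience: Audience) -> Audience:
    # Store the submitted audience record.
    AUDIENCES[audience.audience_id] = audience.model_dump()
    return audience


@app.get("/audiences/{audience_id}", response_model=Audience)
def get_audience(audience_id: str) -> Audience:
    # Return a previously stored audience, or 404 if it does not exist.
    record = AUDIENCES.get(audience_id)
    if record is None:
        raise HTTPException(status_code=404, detail="audience not found")
    return Audience(**record)
```

Run locally with `uvicorn main:app --reload`, assuming the file is saved as `main.py`.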
AI/Data Developer - Contract - SC Cleared - £500 - £600pd (Outside of IR35) - Hybrid working

Our client, a leading deep-tech organisation, is seeking an experienced AI/Data Developer for an urgent contract assignment.

Key Requirements:
- Proven background in AI and data development
- Strong proficiency in Python, including data-focused libraries such as Pandas, NumPy, and … PySpark
- Hands-on experience with Apache Spark (PySpark preferred)
- Solid understanding of data management and processing pipelines
- Experience in algorithm development and graph data structures is advantageous
- Active SC Clearance is mandatory

Role Overview: You will play a key role in developing and delivering advanced AI solutions for a Government client. Responsibilities include:
- Designing, building, and maintaining … data processing pipelines using Apache Spark (see the sketch after this listing)
- Implementing ETL/ELT workflows for large-scale data sets
- Developing and optimising Python-based data ingestion tools
- Collaborating on the design and deployment of machine learning models
- Ensuring data quality, integrity, and performance across distributed systems
- Contributing to data architecture and storage strategy design
- Working with cloud …
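The advert does not describe the pipelines; as a generic illustration of the ETL shape it names, here is a small PySpark sketch. All paths and column names are assumptions for the example, not details of the engagement.

```python
# Illustrative PySpark ETL sketch, not the client's pipeline: read raw CSV
# events, clean them, aggregate per entity, and write partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_example").getOrCreate()

# Extract: raw event data from a (placeholder) landing area.
events = spark.read.option("header", True).csv("s3://example-bucket/landing/events/")

# Transform: type the timestamp, drop malformed rows, aggregate per entity.
cleaned = (
    events
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropna(subset=["entity_id", "event_ts"])
)
daily_counts = (
    cleaned
    .groupBy("entity_id", F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet for downstream analytics or ML features.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)

spark.stop()
```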
I am working with a client in the education sector who are looking for a Data Engineer with experience across architecture & strategy to join on a part-time 12-month contract.
- 1-2 days per week
- Fully remote
- Outside IR35
- Immediate start
- 12-month contract

Essential:
- Been to school in the UK
- Data ingestion of APIs (see the sketch after this listing)
- GCP based (Google Cloud Platform)
- Snowflake
…
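The advert names API ingestion and Snowflake without any detail; purely as a sketch, a minimal Python ingestion step of that kind might look like the following. The endpoint, credentials, warehouse, database, and table names are placeholders, and incremental loading and error handling are omitted.

```python
# Hypothetical sketch: pull records from a placeholder API and load them into
# Snowflake with the Python connector. Names and credentials are assumptions.
import os

import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: a hypothetical education-data API returning JSON records.
resp = requests.get("https://api.example-education.org/v1/schools", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# Load: append the batch into a Snowflake table.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="INGEST_WH",
    database="RAW",
    schema="EDUCATION",
)
try:
    # auto_create_table is available in recent connector versions.
    write_pandas(conn, df, table_name="SCHOOLS", auto_create_table=True)
finally:
    conn.close()
```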
… Crime Enhancement Project focused on Sanctions and PEP screening.

What you'll do:
- Administer and configure LexisNexis Bridger Insight for sanctions and PEP screening workflows.
- Run screening jobs, manage data ingestion, and generate reports within Bridger.
- Set up users, permissions, and workflows tailored to project requirements.
- Collaborate with internal teams and external consultants to backfill and transition responsibilities.
…
- Strong understanding of Sanctions and PEP screening processes.
- Background in Financial Crime, AML, or Compliance projects.
- Ability to manage screening engines, workflows, and user configurations.
- Comfortable running jobs, handling data files, and producing reports specific to Bridger functionality.

Next steps: We have a diverse workforce and an inclusive culture at M&G plc, underpinned by our policies and our …