The Role
The Company is seeking a skilled and detail-oriented Business Analyst to join our Data Management team. This role plays a key part in delivering high-impact data solutions to our clients, supporting business intelligence, data migration and reporting initiatives across core insurance domains. The ideal candidate will have a strong analytical mindset, a solid understanding of insurance operations, and hands-on experience working with data throughout its lifecycle, from analysis and mapping through to validation and user acceptance testing. You will work closely with cross-functional teams including project managers, actuaries, data engineers, and business stakeholders to ensure that data solutions meet business needs and quality standards. This is a great opportunity to contribute to transformational projects within the insurance industry, leveraging your skills in data analysis, stakeholder engagement, and documentation, while growing within a dynamic and collaborative environment.
Key Responsibilities
Analyze and understand business requirements related to core insurance data (e.g., policies, claims, customers, billing) to support modeling, data migration, reporting, and analytics initiatives. Collaborate with …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Position: Analytics Engineer
Location: London (hybrid, 3 times a week)
Department: IT
Type: 3-month rolling contract, Outside IR35
The client is seeking an experienced Analytics Engineer to join its Data & Analytics team. The role focuses on building scalable data pipelines, transforming raw data into clean, reusable datasets, and enabling data-driven decision-making.
Key Responsibilities
Design and build data products, with proficiency throughout the data lifecycle. Develop robust data models through close collaboration with business users and the engineering team. Partner with senior management, operational leads, and other stakeholders, coaching and supporting a data-driven culture, including KPI definition and reporting frameworks. Take accountability for data extraction and for transforming JSON and XML, drawing on extensive experience in metadata management. Collaborate with data engineers to develop data, enrich product design, and integrate data for predictive models or machine learning. Deliver well-defined, transformed, tested, documented, and code-reviewed datasets for analysis. Evaluate and recommend improvements in data flow, influencing and supporting architects and engineers. Work independently and manage multiple data …
City of London, London, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Pay: £450 - £550 per day
Location: London (Hybrid) | Python | ETL | Impact-Driven Team
We are partnering with a leading client seeking an Azure Data Engineer to join their team on an initial 6-month contract. This hybrid role requires 2 days per week onsite, offering the opportunity to design, build, and maintain cutting-edge data platforms.
Data Engineering & Architecture: Design, build, and maintain scalable data pipelines and ETL/ELT processes using Azure services. Develop and optimize data lake and data warehouse solutions (e.g., Azure Data Lake Storage, Azure Synapse Analytics). Implement best practices in data modelling, partitioning, and performance optimization. Support real-time and batch data processing workloads.
Data Quality, Governance & Security: Implement data validation, monitoring, and quality frameworks to ensure clean and trustworthy data. Work with the data governance team to ensure compliance with standards, policies, and privacy regulations. Maintain metadata, lineage, and documentation for all data solutions.
Collaboration & Stakeholder Support: Partner with analytics, product, and engineering teams to understand data …
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
Automation Engineer to join a leading financial services organisation on a 6-month contract. This hands-on role focuses on delivering high-quality automated testing across web, API, and data validation for investment and wealth management platforms. You will execute automation scripts, validate complex data flows, test RESTful APIs, and ensure releases meet reliability and performance standards.
Role & Responsibilities
Develop, maintain, and execute automation frameworks for web, API, and data validation testing within wealth management and trading systems. Perform functional, regression, performance, and integration testing for complex financial applications. Test RESTful APIs using tools such as Postman or RestAssured. Validate data and perform back-end testing using SQL. Integrate automated tests within CI/… in QA engineering within financial services or other complex enterprise environments. Strong Python scripting skills for automation. Experience with Avaloq (desirable). Hands-on experience testing RESTful APIs and performing data validation. Familiarity with CI/CD pipelines, version control (Git), and Agile methodologies. Strong SQL skills and experience with back-end testing. Excellent analytical and problem-solving abilities …
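The data-validation testing described in this listing typically means reconciling what an API returns against what the back-end database holds. Here is a minimal, hedged sketch of such a check in Python; the field names, payload shapes, and tolerance are illustrative assumptions, not details of any real system in the role:

```python
# Minimal sketch of a data-validation check: compare a (stubbed) REST API
# payload against a (stubbed) back-end record. Field names are hypothetical.

def validate_position(api_payload: dict, db_row: dict, tolerance: float = 0.01) -> list:
    """Return a list of mismatch descriptions; an empty list means the check passed."""
    errors = []
    # Exact-match fields: identifiers must agree character for character.
    for field in ("account_id", "instrument"):
        if api_payload.get(field) != db_row.get(field):
            errors.append(f"{field}: api={api_payload.get(field)!r} db={db_row.get(field)!r}")
    # Numeric fields are compared within a tolerance to absorb rounding.
    if abs(api_payload.get("quantity", 0) - db_row.get("quantity", 0)) > tolerance:
        errors.append("quantity mismatch")
    return errors

api = {"account_id": "A1", "instrument": "VOD.L", "quantity": 100.0}
db = {"account_id": "A1", "instrument": "VOD.L", "quantity": 100.0}
print(validate_position(api, db))  # → []
```

In practice checks like this would be wrapped in pytest tests, with the API payload fetched over HTTP and the database row read via SQL, then run automatically inside the CI/CD pipeline.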
City of London, London, United Kingdom Hybrid/Remote Options
Intec Select
Senior Data Engineer
Job Type: Permanent
Location: Hub Location (London/Reading/Birmingham/Glasgow/Leeds)
Salary: £60,000 - £70,000
Consultant: Mags Rendle
Excellent career opportunity!
The Organisation
TDRC are excited to be once again working with one of the UK’s leading Wealth Management firms. Through their robust infrastructure, cutting-edge technology, and exceptional support … reliable service to their continuously expanding client base. They provide the utmost quality and security in everything they do.
Key Responsibilities
Design, build, and maintain Azure Data Factory (ADF) pipelines to support complex data integration and transformation needs. Develop scalable and optimised data models in Azure Synapse Analytics for analytical and operational reporting. Govern and enhance Power BI datasets and semantic models, ensuring best practice in DAX, performance tuning, and report design. Lead the development of data architecture and standards across Azure Data Lake and Synapse environments. Contribute to the migration to Microsoft Fabric, shaping platform architecture and integration patterns. Implement and maintain data validation, monitoring, and quality frameworks. Work …
Bioinformatics | Data Engineer
📍 London OR Oxford
About the Company
We are a growing TechBio organization using data science and machine learning to accelerate drug discovery. Our teams integrate multi-omics and functional assay data to uncover insights into disease biology and therapeutic development.
The Role
We’re seeking a Bioinformatics Data Engineer to design, build, and optimize data pipelines that integrate large-scale biological, multi-omics, and experimental datasets. You’ll collaborate closely with scientists, bioinformaticians, and ML engineers to deliver robust, compliant, and reusable data solutions that drive research and discovery.
Key Responsibilities
Develop and maintain ETL pipelines for bioinformatics and omics datasets across cloud and on-prem environments. Standardize and harmonize diverse data sources, ensuring metadata quality and FAIR compliance. Integrate multi-modal datasets (genomic, transcriptomic, proteomic, imaging, etc.) into unified data models. Automate data validation, quality control, and lineage tracking. Support analytics, visualization, and machine learning workflows. Contribute to data governance practices covering access, privacy, and lifecycle management.
Qualifications
Bachelor’s or Master’s in Bioinformatics …
Role – Technology Lead
Technology – Kafka, Confluent Cloud and Data Integration
Location – London, UK
Job Description
We are looking for an experienced Kafka and Confluent Cloud Specialist to join our team onsite as a Subject Matter Expert (SME) and primary technical lead for all data onboarding activities across on-premises and cloud environments. This role demands strong expertise in event streaming platforms, excellent SQL skills, and the ability to collaborate with multiple stakeholders to ensure smooth data integration and delivery. As a secondary skill, experience with Fivetran, SSIS, ADF or Snowflake will be considered a strong plus.
Your role
In the role of a Data Integration Lead, you will drive the design, implementation, and optimization of enterprise-grade data integration solutions, with a strong focus on real-time streaming and cloud-native platforms. You will lead initiatives involving Apache Kafka and Confluent Cloud, enabling scalable, event-driven architectures that support high-throughput, low-latency data movement across systems. You will anchor engagements from requirement gathering and solution architecture to development and deployment, ensuring seamless integration …
Data Engineer — Hedge Fund (L/S Equity) | London
A fundamentally driven long/short equity fund focused on the European Industrials sector is seeking a Data Engineer for a variety of greenfield development. Founded by a former Citadel portfolio manager, the team brings together expertise from leading global institutions such as Citadel, DE Shaw and Millennium. They combine rigorous data analysis with industry insight to anticipate industrial cycles ahead of the market, acting decisively when conviction and evidence align.
Role Overview
We are seeking a highly skilled Data Engineer to design, build and maintain the core data infrastructure powering the investment and risk processes. This is a fully hands-on role within a small, high-calibre team, offering significant impact and ownership. You will develop robust, production-grade ETL pipelines, manage cloud-based and on-prem data environments, and integrate diverse datasets from external vendors, trading systems and internal sources. The role also involves implementing automated data validation processes, optimising data quality and performance, and ensuring high availability across all …
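Automated data validation in an ETL pipeline of the kind this role describes usually boils down to running named quality rules over each vendor batch before it is loaded. A minimal sketch in Python, with rule names, fields, and thresholds invented purely for illustration:

```python
# Illustrative quality rules an ETL pipeline might run before loading vendor data.
# Each rule returns the indices of failing rows; field names are hypothetical.

def check_not_null(rows, field):
    """Flag rows where a required field is missing."""
    return [i for i, r in enumerate(rows) if r.get(field) is None]

def check_range(rows, field, lo, hi):
    """Flag rows where a numeric field falls outside an expected range."""
    return [i for i, r in enumerate(rows)
            if r.get(field) is not None and not (lo <= r[field] <= hi)]

rows = [
    {"ticker": "ABC", "price": 101.5},
    {"ticker": None,  "price": 99.0},   # missing identifier
    {"ticker": "XYZ", "price": -4.0},   # impossible negative price
]
failures = {
    "ticker_not_null": check_not_null(rows, "ticker"),
    "price_in_range": check_range(rows, "price", 0.0, 10_000.0),
}
print(failures)  # maps each rule to the row indices that failed it
```

In production the rule results would feed monitoring and alerting rather than a print, and batches with failures would be quarantined instead of loaded.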
Location: London, England, United Kingdom
Join Axon and be a Force for Good.
Design, build, and maintain scalable, secure, and efficient data pipelines using modern data tools and cloud-native technologies. Work with a variety of data sources (structured, semi-structured, unstructured) to support diverse analytics and operational workloads. Ensure data quality and consistency by incorporating automated testing, data validation, and monitoring mechanisms. Drive best practices around data engineering, including testing, observability, security, and documentation. Troubleshoot and resolve issues in production environments to ensure data integrity and platform reliability.
What you bring:
5+ years of professional experience in data engineering roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go or Scala. Demonstrable use and understanding of effective AI tooling in your development process. A growth mindset and eagerness to work in a fast …
Data Analyst – SC Cleared - AWS
We’re looking for a Data Analyst (SFIA Level 4) to join a high-profile government data transformation programme. The role will support an existing team, working across data modelling, analytics, and performance optimisation within the AWS Athena environment. This is not a data engineering or build role; it’s focused on understanding, analysing, and optimising existing data models and SQL scripts to improve efficiency, structure, and insight quality. Due to the nature of the role, active SC clearance is required.
Key Responsibilities
Analyse and interpret existing SQL/NoSQL scripts, optimising performance and accuracy. Review, document, and enhance data models in AWS Athena and related warehousing environments. Provide insight and recommendations on data modelling standards, table structures, and query logic. Support the wider MIDAS analytical and engineering team on data validation and reporting improvements. Contribute to the delivery of a robust data warehousing and analytics environment in AWS. Produce technical documentation and maintain clear audit trails of analysis work.
Essential Skills & Experience
Proven experience as a Data …
Excel Modeller – Financial Data Modelling & Process Design
Location: London (hybrid, 3 days per week in the office)
Are you an Excel expert with a passion for financial data and process automation? This is your opportunity to take a high-impact role at the intersection of finance, data, and technology, helping to shape next-generation data collection … environment. This role will be paying £500 - £600 p/d DOE.
What You’ll Do
Design advanced Excel templates for P&L, balance sheet, cash flow, and KPI data across portfolio companies. Implement data validation, consistency checks, and version control for clean, auditable data. Automate workflows using Excel formulas, VBA/macros, and Power Query, and integrate with external systems like Snowflake. Collaborate with finance and tech teams to streamline data flows and improve reporting accuracy. Drive process improvement and governance, standardizing KPIs and reporting structures.
What We’re Looking For
6–10 years’ experience in financial analysis, data process design, or financial systems integration (Big 4 FDnA/TS or PE/Asset …
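A typical consistency check in templates like these is the balance-sheet identity (assets equal liabilities plus equity, within a rounding tolerance). A hedged sketch of that check, written in Python rather than an Excel formula; the 0.01 tolerance and figures are illustrative assumptions:

```python
# Basic balance-sheet consistency check of the kind a reporting template enforces.
# The tolerance absorbs rounding differences between source systems.

def balance_check(assets: float, liabilities: float, equity: float,
                  tol: float = 0.01) -> bool:
    """True when assets equal liabilities + equity within the tolerance."""
    return abs(assets - (liabilities + equity)) <= tol

print(balance_check(1_000.0, 600.0, 400.0))  # → True
```

In Excel the same rule would be a validation cell such as a conditional comparing the totals, flagged via conditional formatting or Data Validation so submitters see failures before the file is collected.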
City of London, London, United Kingdom Hybrid/Remote Options
Marks Sattin
The Role
We are looking for a skilled and experienced Senior Data Engineer to join our Data Science team. The team ingests large amounts of complex sensor data (billions of data points a day), combines it with data from other teams, and produces advanced modelling products that help people park their car or charge their … learning models are high-quality production services and are updated regularly using fresh data. You will lead the design, development, and enhancement of pipelines to ingest and process streaming data for use in our machine learning models. You will be an important member of our team, lead engineering initiatives, and work with smart colleagues in a supportive environment.
Responsibilities
You will develop pipelines for scalable big data processing with Spark, and real-time data streaming with Kafka. These pipelines will need to be written using efficient, testable, and reusable Python code using (for example) NumPy, Pandas and PySpark. We manage our numerous pipelines using Airflow to meet our data serving and modelling requirements. Our services are …
A leading global investment firm is seeking a Market Data Engineer to join its London-based team and help build robust, scalable tick data infrastructure that powers systematic trading strategies across asset classes. This is a high-impact engineering role focused on designing and optimizing real-time and historical market data pipelines in a cloud-native environment. You'll work closely with trading and quant teams to ensure data is clean, fast, and accessible, enabling research, signal generation, and execution at scale.
What You'll Do
Build and maintain high-performance tick data pipelines for ingesting, processing, and storing large volumes of market data. Work with time-series databases (e.g., KDB, OneTick) and Parquet-based file storage to optimize data access and retrieval. Design scalable cloud-native solutions (AWS preferred) for market data ingestion and distribution. (Bonus) Integrate Apache Iceberg for large-scale data lake management and versioned data workflows. Collaborate with trading and engineering teams to define data requirements and deliver production-grade solutions. Implement robust data validation …
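One concrete validation a tick-data pipeline like this commonly runs is detecting out-of-order or duplicate timestamps in an ingested sequence, since downstream time-series stores assume monotonic time. A minimal sketch; the integer timestamps are stand-ins for whatever epoch resolution the feed actually uses:

```python
# Flag ticks whose timestamp is not strictly increasing relative to the
# previous tick: these indicate out-of-order delivery or duplicates.

def out_of_order(timestamps):
    """Return the indices of timestamps that break strict monotonic order."""
    return [i for i in range(1, len(timestamps))
            if timestamps[i] <= timestamps[i - 1]]

ticks = [1000, 1001, 1001, 1003, 1002]
print(out_of_order(ticks))  # → [2, 4]
```

In a real pipeline flagged ticks would be routed to a quarantine stream or re-sequenced, rather than silently written to the time-series database.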
methodologies.
Python Development - Strong proficiency in Python programming, including object-oriented design, asynchronous programming, error handling, and writing clean, maintainable code. Experience with key libraries including Pandas and NumPy for data manipulation, requests and APIs for integrations, asyncio for concurrent processing, and building robust automation scripts with proper logging, testing (pytest), and documentation.
AI & Machine Learning Frameworks - Deep expertise in … AI-powered automation solutions that leverage natural language understanding.
Appian BPA Platform - Strong experience with the Appian low-code platform, including process modelling, interface design, expression rules, integration objects, and data modelling. Skilled in building end-to-end business process applications, configuring workflows, implementing business rules, managing records, and integrating Appian with external systems via REST APIs, web services, and … solutions using OCR technologies (Tesseract, Azure AI Document Intelligence), natural language processing for information extraction, document classification, and building end-to-end pipelines for automated document ingestion, processing, and data extraction with validation rules.
Robotic Process Automation (RPA) - Knowledge of RPA concepts and tools (UiPath, Automation Anywhere, Power Automate) for automating repetitive tasks, screen scraping, and legacy system …
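The "data extraction with validation rules" step in a document pipeline like the one above often starts as a rule-based pass before any ML model is involved. A hedged sketch using only the standard library; the invoice layout, regex, and non-negativity rule are hypothetical, not taken from any system named in the listing:

```python
# Rule-based field extraction with a validation step, as might follow OCR in a
# document-processing pipeline. The "Total: £…" format is an assumed layout.
import re

def extract_invoice_total(text: str):
    """Pull a 'Total: £1,234.56'-style amount; return None if absent or invalid."""
    m = re.search(r"Total:\s*£([\d,]+\.\d{2})", text)
    if not m:
        return None
    amount = float(m.group(1).replace(",", ""))
    # Validation rule: totals must be non-negative to be accepted downstream.
    return amount if amount >= 0 else None

print(extract_invoice_total("Invoice 42\nTotal: £1,234.56"))  # → 1234.56
```

Documents that fail extraction or validation would typically be routed to a human-review queue rather than dropped, which is where the RPA and workflow tooling mentioned above comes in.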