What you'll be doing: Support end-to-end deployment of ML models (batch and real-time) from code validation through to production rollout under guidance from senior team members. Work with Data Science teams to facilitate smooth model handover and ensure deployment readiness aligned with implementation standards. Build and maintain CI/CD pipelines for model deployment, scoring, and operational monitoring. … Debug and fix pipeline issues including data ingestion problems, model scoring failures, and deployment errors. Write comprehensive tests for ML pipelines (unit, integration, validation) and implement data quality checks and operational monitoring. Ensure deployed models meet audit, reconciliation, and governance requirements. Monitor production models for operational health, troubleshoot failures, and track data/variable drift over … time. Work with Platform Engineers within the team to create reusable MLOps templates and support Data Scientists in using them effectively. Support model migrations across data sources, tools, systems, and platforms. Participate in code reviews, knowledge sharing, and pod activities (standups, grooming, delivery check-ins). Learn from senior team members and contribute to continuous improvement of model …
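Several responsibilities above centre on data quality checks and tracking data/variable drift in production. As a minimal, tooling-agnostic sketch of the kind of check involved (not any specific employer's stack; the inputs and thresholds are illustrative), a population stability index comparison between a training baseline and live data could look like:

```python
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between a baseline and a live sample of one feature."""
    edges = np.quantile(expected, np.linspace(0.0, 1.0, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf             # cover the full real line
    e_pct = np.histogram(expected, edges)[0] / len(expected)
    a_pct = np.histogram(actual, edges)[0] / len(actual)
    e_pct = np.clip(e_pct, 1e-6, None)                # guard against log(0)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

# Rule of thumb: PSI above ~0.2 is often treated as meaningful drift worth investigating.
```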
You'll build analytics pipelines that help deliver actionable, automated user insights from product adoption data. You'll collaborate directly with product analysts, engineers, and business stakeholders to build robust data integrations, including Salesforce, Gainsight, JIRA and GitHub. Your work will be pivotal in empowering data-driven decisions across our organization, significantly impacting product strategy and customer engagement. What … to expect? Analytics Pipelines & Data Modeling: Convert existing product analyst queries and reports into efficient and maintainable analytics pipelines primarily using dbt, and optimize data for self-service analytics in Tableau. Owning Analytics Projects: Take ownership of medium-to-large analytics projects, from scoping through implementation. Product Event Data: Transform and maintain scalable product adoption datasets, ensuring … ease of downstream integration and rapid onboarding of new events or features. Data Egestion: Develop and manage data pipelines for exporting curated datasets from Redshift to platforms like Salesforce and Gainsight using reverse ETL tools (e.g., Hightouch). Data Ingestion: Own end-to-end responsibility for ingesting key productivity data from platforms such as GitHub …
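The "Data Egestion" duty above — exporting curated Redshift datasets to Salesforce or Gainsight — is typically orchestrated by a reverse ETL tool like Hightouch, but the underlying extract is just a query against the warehouse. A minimal sketch of that step (the cluster endpoint, credentials, and table are hypothetical; Redshift is PostgreSQL-compatible at the wire level, so psycopg2 works):

```python
import csv
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

conn = psycopg2.connect(
    host="example-cluster.redshift.amazonaws.com",  # hypothetical endpoint
    port=5439, dbname="analytics", user="exporter", password="...",
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT account_id, health_score FROM curated.account_health")
    with open("account_health.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col.name for col in cur.description])  # header row
        writer.writerows(cur.fetchall())
```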
Search 5.0 is delighted to partner with an innovative AI tech start-up in the financial data sector, seeking a Rust Developer to join their exciting journey! Our client equips traders with tools that are faster, smarter, and easier to use. This is a fantastic opportunity to join a leading AI company whose Machine Learning product suite creates new … trading opportunities through real-time analytics, supported by a world-class, scalable, ultra-low latency architecture. The Data Platform is crucial to all capabilities and services, enabling data ingestion, analysis, querying, and storage at near-real-time speeds, setting our client apart from competitors. While most companies process data in days, hours, or minutes, our client … driving the technical direction ensuring requirements are met and building a scalable product. Develop and maintain the platform services that underpin our products, in tandem with SMEs in data science and quantitative finance. Contribute new products and technical ideas through feedback and collaboration sessions. Work closely with QA to ensure that systems are engineered accordingly. What we're …
development of scalable trading platforms and internal developer tools. Collaborate with quant researchers, traders, and infrastructure teams to deliver performant solutions. Maintain and optimize trading infrastructure (low-latency systems, data pipelines, monitoring, etc.). Manage a team of software engineers, mentor junior developers, and drive best practices across the stack. Ensure high code quality through code reviews, CI/… workflows. Own deployment pipelines across cloud providers (AWS, GCP, Azure) using containerization (Docker, Kubernetes). Design and implement microservices and APIs for internal tools and external integrations. Work with data at scale (market data ingestion, real-time analytics, historical storage). Required Skills & Experience: Core Engineering: 5+ years of experience in software development, with exposure to both … and storage configurations. Familiarity with monitoring/logging tools (e.g., Prometheus, Grafana, ELK stack). Trading Systems & Finance: Solid understanding of trading infrastructure, latency optimization, execution systems, and market data feeds. Experience working in or with quantitative research, HFT, or hedge fund teams is highly desirable. Leadership: Proven experience managing and mentoring a team of 3–10 engineers. Ability …
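For the "data at scale" bullet above (market data ingestion and real-time analytics), a representative building block is a rolling aggregation over bar data. A minimal sketch of a 30-minute rolling VWAP (the input file and column names are hypothetical):

```python
import pandas as pd

bars = pd.read_parquet("bars_1min.parquet")        # hypothetical: columns ts, price, volume
bars["ts"] = pd.to_datetime(bars["ts"])
bars = bars.sort_values("ts").set_index("ts")      # time-based rolling needs a datetime index

pv = (bars["price"] * bars["volume"]).rolling("30min").sum()
vol = bars["volume"].rolling("30min").sum()
bars["vwap_30min"] = pv / vol                      # volume-weighted average price per window
```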
Data Engineer | Location: Leeds (4 days per week in office with some flexibility) | Salary: £60,000-£80,000 The Opportunity Our client is seeking a skilled and motivated Data Engineer to play a key role in the creation of a brand-new data platform within the Azure ecosystem including Azure Data Factory (ADF), Synapse and PySpark …/Databricks and Snowflake. You will be a data ingestion and ETL pipeline guru, tackling complex problems at source in order to retrieve the data and ensure it can flow upstream to the Snowflake DWH. You will not be an analytics engineer or someone who operates at the latter stages of the development lifecycle unless you have … skills and desire to work on data ingestion, ETL/ELT. Key Responsibilities Build & Develop robust ETL/data ingestion pipelines leveraging Azure Data Factory, Synapse, PySpark and Python. Connect APIs, databases, and data streams to the platform, implementing ETL/ELT processes. Data Integrity – Embed quality measures, monitoring, and alerting mechanisms. CI …
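For roles like this, the core ingestion pattern — pull from an API, stamp, and land in a raw zone — can be sketched in a few lines. This is an illustrative outline only (the endpoint and output path are hypothetical, and `to_parquet` needs pyarrow or fastparquet installed):

```python
import requests
import pandas as pd

def ingest(url: str, out_path: str) -> int:
    """Pull JSON records from an API and land them as parquet in a raw zone."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()                      # fail loudly on HTTP errors
    df = pd.DataFrame.from_records(resp.json())
    df["_ingested_at"] = pd.Timestamp.now(tz="UTC")  # lineage stamp for downstream checks
    df.to_parquet(out_path, index=False)
    return len(df)

ingest("https://api.example.com/v1/orders", "raw/orders.parquet")  # hypothetical endpoint
```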
transforms, and handling seasonality. Proven ability to tune the performance of existing deployed forecasting models. Must have experience with Azure Machine Learning Python SDK v1/v2 to: Manage data, models, and environments Build/debug AML pipelines to stitch together multiple tasks (feature engineering, training, registering models, etc.) and production workflows using Azure ML pipelines Schedule Azure ML … jobs Deploy registered models to create endpoints. Good to have experience with K-Means clustering. Must have utilized Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure Key Vault to architect and maintain scalable data solutions. Design, develop, and deploy new Azure Data Factory (ADF) pipelines for data ingestion, transformation … and logging, ensuring robustness and reliability. Proficiently transform and manipulate data using PySpark and Python, leveraging their capabilities to derive actionable insights from complex datasets. Collaborate with cross-functional teams to understand data requirements and translate them into effective technical solutions. Lead the implementation and optimization of CI/CD pipelines using Azure DevOps, ensuring a seamless build …
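The forecasting duties above (transforms and seasonality handling) usually rest on time-series feature engineering before any Azure ML pipeline is involved. A minimal pandas sketch, with hypothetical column names:

```python
import numpy as np
import pandas as pd

def make_features(df: pd.DataFrame) -> pd.DataFrame:
    """df has one row per day with hypothetical columns 'date' and 'sales'."""
    out = df.sort_values("date").copy()
    out["log_sales"] = np.log1p(out["sales"])              # variance-stabilising transform
    for lag in (1, 7, 28):                                 # daily, weekly, ~monthly structure
        out[f"lag_{lag}"] = out["log_sales"].shift(lag)
    out["roll_mean_7"] = out["log_sales"].shift(1).rolling(7).mean()
    out["dow"] = pd.to_datetime(out["date"]).dt.dayofweek  # weekly seasonality indicator
    return out.dropna()
```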
London, England, United Kingdom Hybrid / WFH Options
Harnham
Senior Data Engineer - £82,000 - Hybrid & Remote Flexibility No Sponsorship Available Company We are a forward-thinking technology company dedicated to simplifying complex processes through innovative solutions. With a focus on enhancing efficiency and transparency, we strive to deliver outstanding experiences for our customers. Our team operates with a flexible hybrid working model, fostering collaboration and innovation in a … supportive environment. Responsibilities As a Senior Data Engineer, you'll play a pivotal role in advancing our data infrastructure to support our expanding scope and responsibilities. Key responsibilities include: Enhancing the reliability and stability of our data infrastructure. Managing data ingestion pipelines using tools such as Azure Data Factory (ADF) and Python. Ensuring the … quality of raw datasets to empower Data Analysts in creating robust data models. Deploying and managing data tools on Kubernetes (Airflow, Superset, RStudio Connect). Supporting Data Analytics through the management of DBT, DevOps, and deployment rules. You will have the opportunity to work end-to-end, making meaningful contributions within a small, agile team. Experience …
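Since this role manages ingestion pipelines and runs Airflow on Kubernetes, a minimal DAG illustrates the orchestration layer involved. This is a generic sketch (the task logic and IDs are invented; `schedule=` is the Airflow 2.4+ argument, older versions use `schedule_interval=`):

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pulling raw data...")  # placeholder for the real extract logic

with DAG(
    dag_id="raw_ingest",             # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract)
```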
DV (MOD) Cleared Data Engineer - Elastic Stack & Apache NiFi Location: Bristol | Contract Type: £430.00 pd (Outside IR35) | Working Pattern: Hybrid (3 - 4 days on-site) Are you a contract Data Engineer with a knack for designing secure, high-performance data solutions? We're on the lookout for a technical expert in the Elastic Stack and Apache NiFi … to take the lead in building robust, real-time data pipelines in a security-focused environment. This is a hands-on contract opportunity to make a real impact-ideal for professionals with a strong track record in regulated sectors. What You'll Be Doing Designing and deploying scalable, secure data pipelines using Elasticsearch, Logstash, Kibana, and Apache NiFi … Handling real-time data ingestion and transformation with an emphasis on integrity and availability Collaborating with architects and cybersecurity stakeholders to align with governance and compliance needs Monitoring and optimising high-throughput data flows across on-prem and cloud environments Building insightful Kibana dashboards to support business intelligence and operational decision-making Maintaining documentation of data …
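On the Elastic side of this stack, data landed by NiFi ultimately ends up indexed for Kibana dashboards. A minimal sketch of indexing a document with the v8 Python client (the endpoint, credentials, index name, and field values are hypothetical):

```python
from datetime import datetime, timezone
from elasticsearch import Elasticsearch  # elasticsearch-py v8 client

es = Elasticsearch("https://localhost:9200", api_key="...")  # hypothetical endpoint/credentials

doc = {
    "@timestamp": datetime.now(timezone.utc).isoformat(),
    "source": "nifi-flow-1",   # invented field values
    "status": "ok",
}
resp = es.index(index="pipeline-events", document=doc)
print(resp["result"])          # "created" on first write
```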
London, South East, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
Job Title: Data Engineer (NPPV3 + SC Clearance Required) | Location: Hybrid (UK-based) | Contract: 3 Months (with potential extension/move to other projects) | Rate: £550 per day (Outside IR35) About the Role: We are seeking an experienced Data Engineer with NPPV3 + SC clearance to join our dynamic team on a 3-month contract basis. This … role offering the flexibility of working both remotely and onsite. The contract has potential for extension or transition onto other exciting projects. Key Responsibilities: * Design, build, and maintain scalable data pipelines using Azure Data Factory, Databricks, and Azure Synapse. * Collaborate with cross-functional teams to ensure data quality and integration. * Support data ingestion, transformation, and … storage in cloud environments. * Troubleshoot and optimise data workflows for performance and reliability. * Maintain compliance with security protocols and clearance requirements. Essential Skills & Experience: * Must hold NPPV3 + SC clearance (this is a mandatory requirement). * Proven expertise in Azure Data Factory, Databricks, and Azure Synapse Analytics. * Strong experience in building and managing cloud-based data solutions. …
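For the ADF/Databricks/Synapse pipeline work described, the transformation step commonly looks like a small PySpark job. An illustrative sketch only (the storage paths and column names are hypothetical):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse_orders").getOrCreate()

raw = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders/")  # hypothetical path
clean = (
    raw.dropDuplicates(["order_id"])                       # hypothetical key column
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)                        # drop obviously bad rows
)
clean.write.mode("overwrite").parquet("abfss://curated@account.dfs.core.windows.net/orders/")
```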
My client, a prestigious multi-strat hedge fund, is seeking an exceptional Data Engineer/Developer to join their Reporting & Analytics team within their London office. You will be central to designing and developing next-generation reporting and data analytics solutions, with a strong focus on P&L reporting and operational decision-making. … As part of a rapidly growing firm, you will work on revamping existing implementations, optimising data pipelines, and implementing rigorous data standards to support key business functions across the firm. This role will offer continuous exposure to Portfolio Managers, Risk & Portfolio Research teams, Product Control, and Compliance, providing an opportunity to influence critical workflows and enhance data-driven decision-making across the organization. Responsibilities Design & implement robust data pipelines and ETL processes, ensuring high standards in data modeling, documentation, and testing. Develop and optimize SQL queries for data extraction, aggregation, and reporting. Collaborate with key business teams (Risk, Portfolio Research, Compliance) to build efficient and scalable reporting solutions. Support critical production processes, including …
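A typical aggregation behind the P&L reporting described above can be sketched in pandas once trades are extracted from the warehouse (the input file and column names are hypothetical):

```python
import pandas as pd

trades = pd.read_parquet("trades.parquet")  # hypothetical extract, one row per closed trade
trades["pnl"] = (trades["exit_px"] - trades["entry_px"]) * trades["qty"]

report = (
    trades.groupby(["business_date", "book"], as_index=False)
          .agg(gross_pnl=("pnl", "sum"), n_trades=("pnl", "size"))
          .sort_values(["business_date", "book"])
)
```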
Christchurch, Dorset, United Kingdom Hybrid / WFH Options
Wearebasis
our purpose is to radically accelerate the clean-energy revolution - starting with smarter, safer, and more efficient homes. We're focused on improving experiences for people through the application of technology, data, and a deep understanding of human behaviour. Why? As a society, we need to decarbonise and accelerate the transition to alternative energy sources. Existing solutions are expensive and disparate … led and puts people at its core, then we'd love to hear from you! The Role We're looking to grow our quality and test coverage in our data space. In this Intermediate-Senior role, you'll be helping us to implement processes, architectures and test coverage for validating our data backend with your excellent software skills … and guidance. We have an existing small team working and building the data backend and we need a QA engineer to work with the team to verify and validate this part of our product. Some of the things that you might be involved in include: Lead Data Quality Assurance: Develop and implement comprehensive testing for our data …
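The data quality assurance work described often begins with executable checks over the datasets themselves. A minimal pytest-style sketch (the dataset, columns, and rules are hypothetical):

```python
import pandas as pd

def test_readings_are_clean():
    df = pd.read_parquet("energy_readings.parquet")  # hypothetical dataset
    assert {"meter_id", "ts", "kwh"} <= set(df.columns), "missing expected columns"
    assert df["kwh"].ge(0).all(), "negative consumption values found"
    assert not df.duplicated(["meter_id", "ts"]).any(), "duplicate readings found"
```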
Data Migration Lead - Dynamics D365 - CRM, BC, F&O/F&SCM Full Time Role - Fully Hybrid - Work from Home - with Occasional Travel to Client Sites - Fully Expensed Enterprise Solution Data Migrations encompassing D365 CRM, BC, F&O/F&SCM for Small to Large-Scale Migrations within an Agile Delivery Model, working both hands-on and in … a "Solution Lead Capacity". In this role, you will Lead and Shape the Key, Focussed Data Migration Service Portfolio that clients expect! Location is UK - Office HQ is West Midlands with a UK Wide Clientele - Work from Home is Guaranteed - Remote First! Salary: to start, at the £80,000pa mark + An Excellent Benefits Package + Remote … Security and DBS Vetting will be necessary on presentation of a Job Offer to the Successful Candidate. The person required for this role: You will be a proven Data Migration Leader with expertise and deep understanding of data migrations to Dynamics 365 and other similar SaaS applications. You will have end-to-end experience of delivering Enterprise …
City Of London, England, United Kingdom Hybrid / WFH Options
Glocomms
risk management for years to come. As a Quant Developer, you will work at the intersection of quantitative research, machine learning, and software engineering. You'll collaborate with quants, data scientists, and portfolio managers to design and implement scalable systems for data ingestion, model training, and real-time signal deployment. Key Responsibilities Design and develop robust, high … performance systems for AI/ML model development and deployment. Collaborate with quantitative researchers to translate trading strategies into production-ready code. Build and maintain data pipelines for structured and unstructured financial data. Implement backtesting frameworks and simulation environments. Optimise model inference and execution latency for real-time trading. Contribute to architectural decisions and technology stack selection for the … and deployment. Experience with quantitative finance, including time series analysis, alpha modelling, or risk analytics. Familiarity with cloud infrastructure (e.g., AWS, GCP) and containerisation (Docker, Kubernetes). Proficiency with data engineering tools. Experience working in fast-paced, collaborative environments with agile methodologies. Nice to Have Prior experience in a hedge fund, prop trading firm, or investment bank. Exposure to …
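The backtesting-framework responsibility above can be illustrated with a minimal vectorised backtest — here a moving-average crossover, purely as a sketch (the input data is hypothetical, and real frameworks add costs, slippage, and position sizing):

```python
import numpy as np
import pandas as pd

prices = pd.read_parquet("prices.parquet")["close"]        # hypothetical daily closes
rets = prices.pct_change().fillna(0.0)

# Long when the fast moving average is above the slow one, short otherwise.
signal = np.sign(prices.rolling(20).mean() - prices.rolling(60).mean()).fillna(0.0)

strat_rets = signal.shift(1) * rets                        # trade next bar: no look-ahead
sharpe = np.sqrt(252) * strat_rets.mean() / strat_rets.std()
```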
We are actively recruiting multiple KDB Developers at various levels for a fast-paced, data-driven organisation at the forefront of real-time analytics and high-performance computing. This organisation is a consultancy working with some of the leading names in the financial services … sector. The Role: As a KDB Developer, you will be responsible for designing, developing, and maintaining high-performance applications and data analytics solutions using kdb+/q. You'll work closely with quants, traders, and data scientists to deliver scalable systems and actionable insights from large volumes of time-series data. Key Responsibilities: Design, implement, and optimise kdb+ …/q-based applications and data pipelines Work on real-time data ingestion, transformation, and analysis Collaborate with stakeholders to gather requirements and translate them into technical solutions Maintain and enhance existing codebases, ensuring high availability and performance Contribute to architectural decisions and best practices for kdb+ systems Troubleshoot and resolve production issues quickly and effectively Required …
Millennium is a top tier global hedge fund with a strong commitment to leveraging innovations in technology and data science to solve complex problems for the business. The Risk Technology team is looking for a Quantitative Developer who would leverage Python, Cloud infrastructure (AWS), and scientific frameworks to provide … data-driven risk management solutions to Risk Managers, Portfolio Managers, and Business Management. Responsibilities Work at the intersection of Portfolio Management, Risk Management and Quantitative Research to develop risk analytics solutions for Equity Derivatives businesses. Collaborate with risk management for rapid prototyping and delivery of solutions to enhance risk metrics. Develop data ingestion pipelines and analytics over the generated … information. Design and implement cloud-native, data-intensive applications that effectively leverage AWS solutions. Mentor junior team members, fostering growth and collaboration within the team. Fit into the active culture of Millennium, judged by the ability to deliver timely solutions to Portfolio and Risk Managers. Required Skills/Experience Minimum 5 years of experience using Python and scientific Python …
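As a flavour of the risk analytics involved, a one-day historical VaR is a few lines of NumPy. This is a generic sketch, not Millennium's methodology (the P&L series is simulated for illustration):

```python
import numpy as np

def historical_var(pnl: np.ndarray, level: float = 0.99) -> float:
    """One-day historical VaR: the loss exceeded on only (1 - level) of past days."""
    return -np.quantile(pnl, 1.0 - level)

daily_pnl = np.random.default_rng(0).normal(0.0, 1e5, 500)  # simulated stand-in history
print(f"99% one-day VaR: {historical_var(daily_pnl):,.0f}")
```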
a unified, trusted and sustainable value chain that enables the most efficient production and logistics outcomes, while lowering the impact on the environment. We deliver actionable digital solutions and data insights that connect global supply chains, improving the safety, quality and sustainability of food and consumer goods, all in a way that's traceable and clear to the end … right price, creating more sustainable production and consumption outcomes. As part of our Food, Beverage and Consumer Goods team, you'll help our customers connect business processes and leverage data-driven technology for better visibility, agility and responsiveness. The purpose of this role is to complete technical tasks as required in the delivery of client implementations of Trade Promotion … using SQL Server stack 2012 or above; knowledge of SQL Server Suite, V12. • Experience working with Developer tools such as Azure DevOps, Visual Studio Team Server • Kimball methodologies for Data Warehousing • Strong experience in developing data ingestion, data processing and analytical pipelines for big data and relational databases • An analytical mindset, excellent communication and interpersonal skills …
+ benefits + bonus potential About Us Come and be a part of The Investigo Group (TIG), a dynamic coalition of cutting-edge tech firms specialising in Platform, Software, Data, AI and other bleeding-edge technology solutions. Our innovative prowess spans the globe while proudly hailing from the United Kingdom. The group is multi-functional with a large portfolio … for our community. The Consultancy side concentrates on expert support of our customers as well as specifically assigned individual deployments. Collaboraite is a bleeding-edge company that provides our Data and AI capability. A collaborative partner for designing user-centred secure data solutions to overcome operational hurdles, delivered through design thinking and agile coaching. Diversity, Equity, and Inclusion … inclusive environment where every voice matters, driving innovation and progress in our dynamic tech community. The group provides bespoke, secure, user-centric products fuelled by deep technical knowledge and advanced data and analytical skills. We proudly stand as a global leader in this space, partnering with esteemed entities that require these advanced forward-thinking capabilities. These partnerships have been forged …
Job Description Do you want to help us build a data-driven culture for digital product development? We have an amazing job opportunity for you! We are looking for an Analytics Engineer in our Data, Analytics & Business Improvement team. Come and help us build a robust behavioral analytics platform for our portfolio of consumer-facing LEGO websites, apps … and other owned experiences. Core Responsibilities Partner with multiple digital product teams to define & implement our data model, ensuring we have a scalable way of turning raw data from a range of sources into structured tables and data feeds that are aligned with the most common business questions requiring analysis and reporting: Work with Analytics Business Partners … and create a semantic & reporting layer for our suite of digital products, including the build and maintenance of DBT models for our behavioral analytics platform. Partner with the LEGO Data Office to develop standards and ways of working for data producers and data consumers across Digital Consumer Engagement and embed these across digital product teams (data …