Senior Data Scientist – Biomedical AI 📍 London (Hybrid, 4 days onsite) I’m working with a fast-growing company at the cutting edge of AI-driven biomedical research. They’re hiring a Senior Data Scientist with strong biological or biomedical expertise to design and deliver next-generation machine learning solutions for life sciences. You’ll apply advanced … ML to complex datasets (genomics, proteomics, imaging, clinical data), accelerating work in drug discovery, biomarker development, and health data analysis. What You’ll Do Build and deploy ML models across diverse biomedical datasets Collaborate with biologists, clinicians, and engineers to turn scientific questions into AI solutions Develop robust data pipelines and reproducible research workflows … Support strategic data science initiatives and mentor junior colleagues Communicate findings across teams and represent the company externally What They’re Looking For Strong academic background in Bioinformatics, Computational Biology, Data Science, or a related field (advanced degree preferred) Hands-on experience applying ML to biological or biomedical data Proficient in Python and modern ML
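By way of illustration only, here is a minimal Python sketch of the kind of reproducible ML workflow such a role typically involves, using scikit-learn on a synthetic omics-style matrix. The data, features, and labels are invented for the example and are not taken from the posting.

```python
# Illustrative sketch only: a reproducible classification workflow on a
# synthetic "omics-style" feature matrix (samples x features). Names and
# data are hypothetical, not from the advertised company.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 500))      # e.g. 200 samples, 500 expression features
y = rng.integers(0, 2, size=200)     # e.g. responder / non-responder labels

model = Pipeline([
    ("scale", StandardScaler()),     # normalise features before fitting
    ("clf", LogisticRegression(max_iter=1000, penalty="l2")),
])

# Cross-validation keeps the evaluation reproducible and leakage-free.
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"ROC AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```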
sector experience Understanding of cloud security, governance and best practice Nice to Have Exposure to ECS, EKS or container workloads Experience working with event-driven or data pipeline architectures Knowledge of CloudWatch, Grafana or other observability tools Apply now or email dom@briodigital.io for more information.
CD pipelines and container orchestration at scale Modernising Linux-based deployment and runtime systems used by trading and quant teams Automating everything, from developer tooling to monitoring and data pipelines Collaborating directly with traders, quants, and developers in a fast-paced, low-latency environment What we’re looking for: 5+ years’ experience in DevOps, Systems, or Platform Engineering
and maintain backend control software for advanced scientific systems. Collaborate with scientists, hardware, and systems engineers to architect and optimise performance. Develop tools and interfaces to manage experiments, data pipelines, and control sequences. Apply strong software hygiene, testing and CI/CD practices. Contribute to design reviews and mentor junior engineers. Tech Environment Languages: Python (core), Rust (or
and development approach, with a strong emphasis on team collaboration. This is a stimulating environment where systems must operate in real time, requiring robust event-driven architectures, streaming data pipelines, and reactive programming. You’ll tackle complex scalability challenges across distributed systems, ensuring speed and reliability under heavy user loads. Security and compliance are central to the platform
City of London, London, United Kingdom Hybrid/Remote Options
Explore Group
features. Key Responsibilities: Build and maintain end-to-end product features used by enterprise customers. Develop modern, responsive interfaces using React/Next.js. Design and optimize APIs, data pipelines, and integrations using Python (FastAPI/Django). Collaborate with design, product, and AI teams to deliver new functionality quickly and reliably. Contribute to architectural decisions, system scalability
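For illustration, a minimal sketch of the FastAPI side of a stack like the one above; the route, model, and field names are hypothetical, not the company's API.

```python
# Minimal FastAPI sketch of the kind of API layer described above.
# The model and route are invented examples, not the company's endpoints.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class FeatureRequest(BaseModel):
    customer_id: str
    payload: dict

@app.post("/features")
async def create_feature(req: FeatureRequest) -> dict:
    # In a real service this would write to a database or enqueue a pipeline job.
    return {"customer_id": req.customer_id, "status": "accepted"}

# Run locally with: uvicorn main:app --reload
```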
deploying scientific software using common CI and containerization solutions. Enjoy staying up to date with current software development best practices and tools, e.g. AI coding assistants or modern data pipelines and scheduling solutions. Enjoy diving into new problems with a continuous learning attitude, absorbing relevant domain knowledge. Additional experience with scientific C++ and Fortran libraries, HPC technologies, cloud
and optimal ‘next task’ Build fully-automated pipelines for dictionary building, including span identification, word sense distribution, and sense granularity decisions Work with a vast amount of unique data - we have data from over 1M language tests, including text and voice data Create brand new dictionaries and train models to determine the difficulty of … words, idioms, phrasal verbs etc. Analyse large amounts of diverse data - including data from every movie, book, and song Work in a cross-functional team and communicate with backend engineers and product managers Create new types of tests for language learners to gather more test results, analyse them, and build prediction models based on these results … Optimise and fine-tune machine learning models for performance, scalability, and accuracy Essential skills 🙏 Strong expertise in NLP Complete end-to-end experience - from finding and cleaning data all the way to monitoring models in production Strong understanding of neural networks, CNNs, RNNs, LSTMs, and transformers Experience building automated data pipelines Hands-on experience with LLM
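As a hedged illustration of one small piece of such work, the sketch below uses corpus frequency as a crude proxy for word difficulty; it assumes the open-source wordfreq package and is not the team's actual method.

```python
# Toy sketch: corpus frequency as a first-pass proxy for word difficulty.
# This is a simplification for illustration, not the posting team's approach.
# Assumes the open-source `wordfreq` package.
from wordfreq import zipf_frequency

words = ["cat", "negotiate", "serendipity", "phrasal", "obfuscate"]

# Lower Zipf frequency -> rarer word -> (roughly) harder for learners.
ranked = sorted(words, key=lambda w: zipf_frequency(w, "en"))
for w in ranked:
    print(f"{w:>12}  zipf={zipf_frequency(w, 'en'):.2f}")
```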
BluAge modernization tools, Java Spring Boot, and Apache Kafka. The ideal candidate will play a key role in modernizing legacy applications, building scalable microservices, and implementing real-time data pipelines. Must have technical skills: AWS BluAge Kafka Java Spring Boot Nice to have skills: Some knowledge of the banking domain Responsibilities: Solution design; involvement in all
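For illustration, a minimal consumer for a real-time Kafka pipeline. The posting's stack is Java Spring Boot; Python (kafka-python) is used here only to keep the examples in this digest in one language, and the topic and broker names are placeholders.

```python
# Python sketch of the real-time pipeline consumer pattern the role describes.
# Topic, group id and broker address are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payments-events",                       # hypothetical topic name
    bootstrap_servers="localhost:9092",
    group_id="modernisation-demo",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real microservice, each event would be validated and routed
    # to downstream services or a data store.
    print(f"partition={message.partition} offset={message.offset} event={event}")
```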
A fast-growing macro hedge fund is seeking a technically sharp and curious individual to join a high-impact team at the intersection of technology, data, and investment operations. Operating in a fast-paced, high-performance environment, the fund values autonomy, initiative, and smart problem-solving. This is a unique opportunity to work across all areas of the fund’s technology infrastructure as it scales. About the Role This is a broad, hands-on position where you'll work directly with investment and operational teams to streamline data flows, automate processes, and enhance system efficiency. From day one, you'll be given real ownership of projects that span infrastructure, reporting, and tooling, making a tangible impact on … the fund's performance and scalability. Key Responsibilities Build and maintain robust data pipelines and automations using Python, SQL, and Excel/VBA Integrate external data feeds and APIs into internal systems and databases Develop dashboards and reporting tools to support trading, risk, and operations Manage infrastructure hosted on Microsoft Azure, including VMs and storage Use
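A rough sketch of the ingest-and-store pattern described above, pulling an external feed over HTTP and landing it in a SQL table; the URL, database, and table names are invented for the example and are not the fund's systems.

```python
# Sketch of an ingest-and-store step: pull an external feed over HTTP and
# land it in a SQL table. All names here are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

FEED_URL = "https://example.com/api/prices"        # hypothetical feed
engine = create_engine("sqlite:///marketdata.db")  # stand-in for the real DB

resp = requests.get(FEED_URL, timeout=30)
resp.raise_for_status()

df = pd.DataFrame(resp.json())                     # assumes a JSON list of records
df["loaded_at"] = pd.Timestamp.now(tz="UTC")

# Simple append for a demo; production code would upsert or version rows.
df.to_sql("prices_raw", engine, if_exists="append", index=False)
```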
City of London, London, United Kingdom Hybrid/Remote Options
Diagonal recruitment
have award-winning solutions in the fast-moving digital advertising space with proprietary tech, media and petabyte-scale data. The Role We're seeking a well-rounded Senior Data Engineer to join a Product, Data and Engineering org of 50+ people based out of London working end-to-end across multiple products covering audience insights, analytics … and advertising. You'll be relied upon to design, develop and maintain data pipelines to enhance and support their products as well as design & deliver new innovations for growth including Agentic development. Technology/Skills requirements Agentic development frameworks Automations SQL ELT/ETL Data Modelling dbt models Version control (Git) Data processing and … orchestration using Airflow or similar Google BigQuery Cloud-based services: GCP (preferred) or AWS About You 4+ years' experience as a Data Engineer solving complex and scaled data challenges Someone who can work in and lead cross-functional scrum teams Problem-solver mindset Excellent communicator able to distil down complex matters to various stakeholders Willing and
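For illustration, a minimal Airflow DAG of the kind of ELT orchestration listed in the stack above; the DAG id, schedule, and task logic are placeholders and assume Airflow 2.4+.

```python
# Minimal Airflow sketch of an ELT orchestration. DAG id, schedule and task
# logic are placeholders, not the company's pipelines. Assumes Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw events from source")          # placeholder extract step

def load_to_bigquery():
    print("load staged files into BigQuery")      # placeholder load step

with DAG(
    dag_id="audience_insights_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load_to_bigquery)

    extract_task >> load_task                     # simple linear dependency
```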
York, England, United Kingdom Hybrid/Remote Options
Uneek Global
Senior Data Engineer Location: York (Hybrid) Salary: £45,000 - £60,000 (DOE) Uneek are working with a TECH FOR GOOD business that is looking for an experienced Data Engineer to join their growing Data Team. If you love working with modern cloud tech and turning raw data into valuable insights, this could … be your next move! Highlights: • Building and maintaining top-quality data pipelines using Snowflake, DBT, and Fivetran. • Making sure data is accurate, complete, and ready for action. • Teaming up with analysts and stakeholders to deliver MI and operational reporting. • Solving problems, improving processes, and sharing your know-how. Requirements: • 2+ years' experience • Proficiency with DBT
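As a hedged example of the data-quality side of such pipelines, a small check against a warehouse table using snowflake-connector-python; the credentials, database, and table names are placeholders.

```python
# Sketch of a simple data-quality check against a warehouse table.
# Connection details and table names are placeholders; assumes the
# snowflake-connector-python package.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder credentials
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="REPORTING",
)

try:
    cur = conn.cursor()
    # Fail loudly if the nightly load produced duplicate primary keys.
    cur.execute("""
        SELECT order_id, COUNT(*) AS n
        FROM fct_orders
        GROUP BY order_id
        HAVING COUNT(*) > 1
    """)
    dupes = cur.fetchall()
    if dupes:
        raise ValueError(f"{len(dupes)} duplicate order_id values found")
    print("Duplicate check passed")
finally:
    conn.close()
```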
Company A global digital entertainment business with millions of active users worldwide. Known for creating immersive experiences and building strong communities, the company continues to invest heavily in data-driven marketing. You’ll be joining a dedicated Analytics, Data Science & Engineering hub that shapes strategy across marketing, customer lifecycle, and product. The Role This is a … guiding spend and targeting decisions across multiple channels. Alongside delivering hands-on analysis, you’ll mentor a small team, partner with senior stakeholders, and ensure best practice in data-driven marketing decisions. Key Responsibilities Lead and mentor a small marketing analytics team, setting best practices and ways of working. Own full-funnel, cross-channel analysis covering acquisition, engagement … and retention. Improve campaign performance measurement through attribution modelling, incrementality testing, and media mix modelling. Partner with Data Science to develop predictive models (e.g. LTV, churn) to inform budget allocation. Build and maintain marketing forecast models to optimise spend and ROI. Conduct robust statistical analysis to improve campaign testing methodologies. Deliver clear reporting and actionable insights to marketing
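To make the incrementality-testing point concrete, a toy two-proportion test comparing an exposed group against a holdout; all numbers are made up and the sketch assumes statsmodels.

```python
# Toy incrementality (holdout) test: compare conversion in an exposed group
# vs. a holdout with a two-proportion z-test. Numbers are fabricated for
# illustration only.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 310]     # exposed, holdout conversions (hypothetical)
samples = [10000, 10000]     # group sizes (hypothetical)

stat, p_value = proportions_ztest(conversions, samples, alternative="larger")
lift = conversions[0] / samples[0] - conversions[1] / samples[1]

print(f"absolute lift = {lift:.4f}, z = {stat:.2f}, p = {p_value:.4f}")
# A small p-value would suggest the campaign drove incremental conversions.
```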
We’re partnering with a leading entertainment company seeking to hire a Lead Data BI Analyst to play a key role in their data operations and business intelligence initiatives. Key Responsibilities: Oversee the daily operations of BI data pipelines, reporting platforms, and cloud-based ETL/ELT workflows (AWS, Redshift, Glue, Airflow). Develop … track system health, availability, and operational performance. Manage and resolve Level 2 incidents. Conduct root cause analyses and implement corrective actions to enhance platform stability and prevent recurrence. Supervise data validation and ensure adherence to governance frameworks. Automate key operational and support activities such as monitoring, alerting, and reporting to increase reliability and efficiency. Collaborate with cross-functional teams … (DevOps, Data Engineering, AI) to coordinate system fixes, enhancements, and improvements. Essential Skills & Experience: A minimum of 3 years in AWS-based BI/Data Engineering production support. AWS BI Stack: Redshift, Glue, Airflow, S3, Step Functions. Experience in data modeling, ETL pipelines, and reverse engineering. Proficiency in Power BI (preferred), Business Objects, or
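As one example of automating the monitoring and alerting mentioned above, a boto3 sketch that creates a CloudWatch alarm on failed Glue tasks; the job name, metric dimensions, and SNS topic ARN are placeholders, not the company's setup.

```python
# Sketch of automated alerting: a CloudWatch alarm on Glue job failures,
# created with boto3. All names, thresholds and ARNs are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

cloudwatch.put_metric_alarm(
    AlarmName="bi-glue-job-failures",
    Namespace="Glue",
    MetricName="glue.driver.aggregate.numFailedTasks",
    Dimensions=[{"Name": "JobName", "Value": "nightly_bi_load"}],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:eu-west-2:123456789012:bi-oncall"],  # placeholder ARN
)
print("Alarm created (or updated) for failed Glue tasks")
```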
Data Analyst - DV Cleared | On-site | Hereford | £50,000–£70,000 | DV Clearance Required We are looking for multiple Data Analysts to join a growing technical team, working on complex data-driven projects that make a real difference. You’ll be part of a close-knit engineering function helping to design, build, and maintain robust data pipelines and applications that deliver value from day one. What you’ll be doing: Designing and developing bespoke end-to-end data projects Building and maintaining efficient data pipelines Working closely with data scientists, analysts, and engineers to ensure seamless data integration Supporting front-end application development and … mitigating potential issues to ensure system reliability What we’re looking for: Strong Python skills (or similar programming experience) Solid understanding of relational databases Awareness of APIs and data integration techniques Experience working with diverse stakeholders to capture and refine requirements Exposure to front-end application design and implementation You’ll need: DV clearance (Developed Vetting) or the
I’m looking for a Senior Back End Developer for an electric vehicle charger company (founded 1990) based in Nottingham. You will create an ambitious Cloud platform that sends data pipelines to customers. Location: Nottingham, ~5 days on-site to start with, then 1-2 days on-site per week. 2:30pm finish on Fridays. The requirements include: Golang
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Uniting Ambition
the DNA of the platform What We’re Looking For Proven track record designing or architecting distributed, high-volume systems Experience with event-driven architectures and high-throughput data pipelines (Kafka) Strong knowledge of at least one of: C#, .NET, Golang, or TypeScript, plus SQL/databases Ability to balance hands-on technical leadership with high-level solution