ML Infra Engineer London (hybrid) £100-150k base + equity Built a GPU training stack from scratch before? Designed large-scale data pipelines end-to-end? Happy owning infra + ML systems in a fast, early startup? We're an early-stage generative AI company developing state-of-the-art diffusion transformer models for 3D generation. We're training from scratch, at scale. Backed by top-tier investors and fresh off a $7M seed, we're now hiring an ML Infrastructure Engineer to build our data and training stack from the ground up. Our customers include global companies in autonomous driving and robotics. The role Design and own the end-to-end data pipeline powering large-scale 3D pretraining: ingestion, storage, preprocessing, and streaming. Implement infrastructure-as-code, observability, CI/CD, and scalable deployment practices. Enable fast experimentation for diffusion transformer training.
extension) Start: ASAP Rate: Market rate - Inside IR35 We're looking for experienced PySpark + Fabric Developers to join a major transformation programme with a leading global financial data and infrastructure organisation. This is an exciting opportunity to work on cutting-edge data engineering solutions, driving innovation and performance at scale. Key Responsibilities Design, build, and optimise data pipelines for both batch and streaming workloads. Develop and manage dataflows and semantic models to support analytics and reporting. Implement complex data transformations, aggregations, and joins with a focus on performance and reliability. Apply robust data validation, cleansing, and profiling techniques to maintain accuracy. Enforce role-based access, data masking … hands-on experience with PySpark (RDDs, DataFrames, Spark SQL). Proven ability to build and optimise ETL pipelines and dataflows. Familiar with Microsoft Fabric or similar lakehouse/data platform environments. Experience with Git, CI/CD pipelines, and automated deployment. Knowledge of market data, transactional systems, or financial datasets. Excellent communication skills and collaborative mindset.
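The validation, cleansing, and profiling responsibilities above can be sketched in plain Python. Field names and rules here are illustrative assumptions, not taken from the listing; a production Fabric pipeline would express the same steps in PySpark.

```python
# Minimal sketch of batch-record validation, cleansing, and profiling.
# The "symbol"/"price" schema is an invented example, not from the listing.

def cleanse(records):
    """Drop records missing a price, normalise symbols, and coerce types."""
    cleaned = []
    for rec in records:
        if rec.get("price") is None:
            continue  # validation: reject incomplete records
        cleaned.append({
            "symbol": rec["symbol"].strip().upper(),  # cleansing
            "price": float(rec["price"]),             # type coercion
        })
    return cleaned

def profile(records):
    """Basic profiling: row count plus min/max of the price field."""
    prices = [r["price"] for r in records]
    return {"rows": len(records), "min": min(prices), "max": max(prices)}

raw = [{"symbol": " aapl ", "price": "189.5"},
       {"symbol": "MSFT", "price": None}]
clean = cleanse(raw)
stats = profile(clean)
```

In a real pipeline the same reject-then-normalise pattern would run per partition over a DataFrame rather than a Python list.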
Technical Delivery Manager, Delivery Manager We are working with a leading central government organisation to find a Technical Delivery Manager with strong data delivery expertise and active SC Clearance. You will be required to lead cross-functional teams to deliver a data-focused technology project that supports high-impact public services, ensuring outcomes are aligned … day inside IR35 Location - London - Once a week onsite Duration - 6 Months with the opportunity of extension Key Responsibilities and Key Skills Deliver end-to-end technical and data-driven projects within a complex government environment Manage project scopes, timelines, risks, budgets, dependencies and reporting in line with government delivery frameworks Work closely with data engineers, architects, analysts and stakeholders to ensure effective data integration, transformation and governance across programmes Coordination of multidisciplinary agile and hybrid teams ensuring delivery milestones are met Produce detailed project documentation Ensure compliance with security, privacy and assurance processes required within an SC Cleared environment Active SC Clearance (mandatory to start). Proven experience delivering data-focused
City of London, London, United Kingdom Hybrid/Remote Options
McCabe & Barton
A leading Financial Services client in London is undergoing a major data transformation and is hiring a Data Engineering Manager on a permanent basis. The role offers a base of £85,000, strong benefits, and flexible hybrid/remote working. We're looking for a hands-on leader from a forward-thinking, modern data environment with a strong foundation in Data Engineering and experience managing a squad of 4-7 Data Engineers. You'll bring solid SQL skills, cloud expertise (preferably Azure), and familiarity with tools such as Snowflake, Databricks, ADF, Python, and GenAI. You'll manage a squad of engineers and play a key role in delivering scalable, modern data solutions as part of a group-wide transformation. Requirements: Proven experience in fast-paced environments working with modern data architectures and tooling; Financial Services experience is a plus. Experience managing Data Engineering teams Strong SQL and cloud skills (Azure preferred) Familiar with modern data tools (e.g. Snowflake, Databricks, ADF, Python) Track record of building/
AI-powered tools. Build and deploy LLM-based applications using frameworks such as LangChain, semantic search, RAG, and fine-tuning (SFT/RL). Develop backend logic and data pipelines to power intelligent automation and real-time processing. Manage infrastructure: relational and graph databases, containerization, CI/CD pipelines, and cloud platforms (AWS + Pulumi preferred). Write
Build and maintain Python-based tools supporting mortgage analytics, valuation, and reporting. • Develop production-grade applications and contribute to the migration of legacy models. • Identify automation opportunities across data pipelines, analytics, and reporting. • Collaborate with business teams to gather requirements and deliver scalable solutions. • Champion best practices in testing, documentation, and maintainability. Requirements • Strong proficiency in Python • Experience
will design, build, and test systems that process large sensor datasets, coordinate multiple assets, and present complex information clearly to operators. The work includes writing maintainable code across data pipelines, distributed systems, and intuitive interfaces. You'll be trusted to take ownership, solve hard problems, and build with purpose. Required Skills Degree in Computer Science, Engineering, or a
City of London, London, United Kingdom Hybrid/Remote Options
Develop and maintain scalable full-stack features using TypeScript Develop intuitive, high-performance user interfaces with React/Next.js Implement robust APIs and backend logic with NestJS Work with data pipelines, integrations, and AI-driven services Write clean, maintainable, and well-tested code Collaborate closely with founders, designers, and product leads 💼 Candidate Requirements Strong experience with TypeScript (personal projects
Proven success scaling security operations across geographies Exceptional analytical and decision-making abilities during BAU and incidents. Strong technical ability to understand and manage security tooling, integrations, and data pipelines. Core Values Love what you do: We show up each day ready to take on the world. Our passion and intensity set us apart and make the difference
Data Scientist – B2B SaaS (Price Optimisation for Retail Banks) A fast-growing, profitable B2B SaaS company is looking for an ambitious Data Scientist to help advance its price optimisation platform used by major retail banks. Their software guides mortgage and savings pricing decisions, delivering millions in additional revenue for clients. With a strong UK & Ireland client base and a rapidly expanding pipeline, this is a rare early-stage opportunity in a bootstrapped, high-growth business. This is an exciting time to join the company, with a small but growing set of clients in the UK and Ireland. After recent successes, they have built a very promising pipeline of new clients, with revenues expected to increase significantly in the next year. The company is bootstrapped (no VC or angel investment) and is already very profitable. The Role You'll play a key part in improving and building machine learning models and
Employment Type: Permanent
Salary: £45000 - £70000/annum Profit share and equity options
assistants - Summarization tools Apply prompt engineering, Retrieval-Augmented Generation (RAG), and context-aware pipelines to enhance model accuracy and relevance. Integrate AI models with enterprise systems, APIs, and data stores using Python, Java, or Node.js. Collaborate with architects to define scalable, secure, and cost-efficient AI service architectures. Implement AI/ML pipelines for training, validation, and deployment … model compression, quantization, and API optimization. Ensure compliance with AI ethics, security, and governance standards. Prepare and curate training datasets (structured/unstructured text, images, code). Apply data preprocessing, tokenization, and embedding generation techniques. Work with vector databases (e.g., Pinecone, Weaviate, FAISS, Chroma) for semantic search and retrieval. Partner with business stakeholders to identify and shape impactful … with cloud platforms (AWS, GCP, Azure) and AI/ML services. Knowledge of MLOps tools and practices (e.g., MLflow, Kubeflow, Vertex AI, Azure ML). Strong understanding of data engineering, data pipelines, and ETL workflows. Excellent problem-solving, communication, and stakeholder engagement skills. Bachelor's or Master's degree in Computer Science, AI/ML, Data
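The semantic search and retrieval step described above can be illustrated with a toy sketch. The documents and embedding vectors below are invented for illustration; a real RAG system would embed text with a model and query a vector database such as FAISS or Pinecone rather than compute similarities by hand.

```python
import math

# Toy retrieval step of a RAG pipeline: rank documents by cosine
# similarity between a query embedding and document embeddings.
# All vectors here are hand-made illustrative values.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
}
query = [0.85, 0.15, 0.05]  # embedding of e.g. "how do I get my money back?"

# Retrieve the most relevant document to pass to the LLM as context.
best = max(docs, key=lambda name: cosine(query, docs[name]))
```

The retrieved text would then be prepended to the prompt, which is what grounds the model's answer in enterprise data.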
City of London, London, United Kingdom Hybrid/Remote Options
Luxoft
Duration: 6 Months Location: London Rate: £500-£550 per day (Inside IR35) Start Date: ASAP Are you an SC Cleared Databricks Engineer with a passion for building scalable data solutions? Do you have extensive experience in designing and optimising data pipelines, cloud-native platforms, and advanced analytics ecosystems? We are working with a leading multinational client seeking a skilled Databricks Engineer to drive innovation and enable data-driven decision-making, advanced analytics, and AI capabilities. The successful candidate will be responsible for designing, developing, and optimising data pipelines and analytics solutions using Databricks within a secure environment. Critical Skills Extensive experience with Databricks (Spark, Delta Lake, and MLflow). Proficiency in ETL development and orchestration tools (DBT, Airflow, or similar). Hands-on experience with cloud platforms (AWS, Azure, or GCP). Solid understanding of SQL, Python, and PySpark for data processing. Familiarity with CI/CD pipelines and DevOps practices for data solutions. You will be collaborating with a variety of data specialists. The role
Bournemouth, Dorset, South West, United Kingdom Hybrid/Remote Options
Sanderson Recruitment
initiative within Financial Services aimed at delivering a unified, trusted view of customer data. We're seeking a highly skilled Lead Databricks Engineer to design and implement scalable data pipelines that form the backbone of our Lakehouse platform, enabling accurate analytics, reporting, and regulatory compliance. You'll work with cutting-edge technologies including Databricks, PySpark, and Azure Data Factory, applying best practices in data engineering and governance to support this critical programme. Lead Databricks Engineer: Key Responsibilities Build and maintain Databricks pipelines (batch and incremental) using PySpark and SQL. Orchestrate end-to-end workflows with Azure Data Factory. Develop and optimise Delta Lake tables (partitioning, schema evolution, vacuuming). Implement Medallion Architecture (Bronze, Silver, Gold) for transforming raw data into business-ready datasets. Apply robust monitoring, logging, and error-handling frameworks. Integrate pipelines with downstream systems such as Power BI. Collaborate with analysts, business teams, and engineers to deliver consistent, well-documented datasets. Support deployments and automation via Azure DevOps CI/CD. Gather and refine requirements
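The Medallion Architecture named in the responsibilities above can be sketched with plain Python structures to show the Bronze/Silver/Gold layering. The records and fields are invented for illustration; a real implementation would use PySpark DataFrames and Delta Lake tables rather than lists and dicts.

```python
# Sketch of the Medallion pattern: raw Bronze records are cleansed into
# Silver, then aggregated into business-ready Gold. Schema is illustrative.

bronze = [  # Bronze: raw ingested records, kept as-is for replayability
    {"customer": "a1", "amount": "10.0", "valid": "true"},
    {"customer": "a1", "amount": "5.5", "valid": "true"},
    {"customer": "b2", "amount": "bad", "valid": "false"},
]

# Silver: typed, cleansed records with invalid rows filtered out
silver = [
    {"customer": r["customer"], "amount": float(r["amount"])}
    for r in bronze
    if r["valid"] == "true"
]

# Gold: business-ready aggregate (total spend per customer),
# the layer a Power BI report would read from
gold = {}
for r in silver:
    gold[r["customer"]] = gold.get(r["customer"], 0.0) + r["amount"]
```

Keeping each layer materialised separately is what lets downstream consumers read consistent, validated data while raw inputs remain available for reprocessing.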
is standard, and engineers have a real say in how things are built. The Role You'll join a small, experienced team working across backend development, cloud architecture, data flows and integrations. It's a great fit for someone who enjoys variety, ownership and working in a fast-moving, product-focused environment. Expect to: Build and maintain cloud … technical solutions Review code and uphold high engineering standards Improve deployment processes and contribute to cloud infrastructure Mentor teammates and contribute to architectural decisions Work closely with product, data and support teams Tech You'll Work With No one will have everything, but experience across several of the following is ideal: Python (Django ideally) Angular, TypeScript AWS (essential) REST APIs MySQL, InfluxDB Git, GitLab, CI/CD Nice-to-haves include IoT or payments integrations, data pipeline experience, or prior time in a startup environment. What You Can Expect Highly autonomous culture with a collaborative, supportive Lead Developer Fast-paced engineering with real ownership of features
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Cathcart Technology
DATA SCIENTIST - E-DV CLEARED NEW CONTRACT OPPORTUNITY FOR A DATA SCIENTIST IN LONDON WITH ENHANCED DV CLEARANCE New contract opportunity with a leading consultancy for a Data Scientist London based, 3-4 days on site Enhanced DV Clearance required to start 12 month rolling contract Daily rate up to £850 To apply, email … About We're seeking an experienced Data Scientist with enhanced Developed Vetting (DV) clearance to join a leading consultancy supporting mission-critical UK Government projects. This is a hands-on, high-impact role within a secure environment, where your analytical expertise will directly contribute to national operational outcomes. The Role As a Data Scientist, you'll work within a multidisciplinary team to extract insight from complex, high-volume datasets. You'll design and implement advanced analytical and machine learning models, develop robust data pipelines, and deliver actionable intelligence to inform decision-making at the highest levels. This role offers the opportunity to apply your technical skills to highly operational challenges in a fast
teams. Cloud Services: Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills Asynchronous Programming: Familiarity with asynchronous programming tools
CONTRACT PALANTIR DATA ENGINEER - DV CLEARED NEW OUTSIDE IR35 CONTRACT OPPORTUNITY AVAILABLE WITHIN A HIGHLY SECURE DELIVERY PROGRAMME FOR A DV CLEARED PALANTIR FOUNDRY DATA ENGINEER. Contract opportunity for a Palantir Data Engineer to support cutting-edge National Security projects Up to £650 per day (Outside IR35) DV clearance is essential Based full-time … using Foundry tools like Workshop Strong communication skills to collaborate with technical and military/non-technical stakeholders Experience integrating data from multiple sources into Foundry via Pipeline Builder and custom code Comfortable working on-site within a secure environment and at pace with end-users TO BE CONSIDERED... Please either apply by clicking online or … consent for us to process and submit your application to our client in conjunction with this vacancy only. KEY SKILLS: PALANTIR/FOUNDRY/DATA ENGINEER/PIPELINE BUILDER/ONTOLOGY/PYTHON/TYPESCRIPT/FULL STACK/DV CLEARED/LONDON/CONTRACT
City of London £90,000 As an Agentic AI Engineer, you'll design and build intelligent systems that think, reason and act. Solve hard problems at scale, integrating data, AI and automation to create self-improving systems that deliver measurable business value. You'll work as part of a Data and AI hands-on engineering … environments using containerisation, microservices and event-driven design Translate client challenges into engineered systems that reason, learn and adapt, improving efficiency, resilience and decision-making Bridge AI and data architecture to ensure agents are grounded in solid data foundations, scalable pipelines and secure governance Automate end-to-end with CI/CD pipelines, observability and responsible … from AI-enabled experiments to production-grade AI-native enterprise We need: Strong coding skills in Python and experience with agentic and LLM frameworks Hands-on experience in data engineering and architecture: APIs, streaming, ETL or data mesh principles Ability to work across multi-cloud and data platform ecosystems (AWS, Azure, GCP, Databricks, Snowflake)
We're seeking a Lead Data Engineer to design and deliver scalable data and AI solutions across internal and client projects. You'll own the full data lifecycle, from ingestion and processing to deployment and monitoring of machine learning models, while leading a small, high-performing team. What You'll Do Build and maintain data pipelines and cloud infrastructure (AWS/GCP). Develop AI-driven applications including LLM tools, RAG pipelines, and automation solutions. Lead on client communication, project delivery, and stakeholder alignment. Collaborate with Data Science and Analytics teams to improve data access, automation, and governance. Drive innovation through proofs-of-concept, model monitoring, and adoption of emerging AI tech. What We're Looking For 5+ years in data engineering, with team or project leadership experience. Advanced Python (Pandas, PyTorch, TensorFlow, Scikit-learn). Strong AWS/GCP, MySQL, and CI/CD experience; Docker/Kubernetes a plus. Excellent communication, organisation, and problem-solving skills. Passion for innovation and ethical, scalable AI. Nice