London, South East England, United Kingdom Hybrid / WFH Options
Formula Recruitment
Engineer to join the team and contribute to a cutting-edge platform for analytics and machine learning. They are looking for a skilled data engineer with experience in Databricks, Spark, and Python who can deliver high-impact data products. This role offers the opportunity to work alongside an exciting, collaborative team with a clear roadmap for growth and the … wider technology teams. Key Skills: Experience with Azure cloud data lakes and services (Data Factory, Synapse, Databricks). Skilled in ETL/ELT pipeline development and big data tools (Spark, Hadoop, Kafka). Strong Python/PySpark programming and advanced SQL with query optimisation. Experience with relational, NoSQL, and graph databases. Familiar with CI/CD, version control, and …
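To make the ETL/ELT and query-optimisation skills listed above more concrete, here is a minimal PySpark sketch of the kind of batch job such a role typically involves: land raw files from an Azure data lake, clean them, and write a partitioned Delta table. The storage paths, column names, and schema are hypothetical, and the snippet assumes a Databricks-style environment where Delta Lake and lake credentials are already configured.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical ELT step: land raw orders from a data lake, clean them,
# and write a partitioned Delta table for downstream analytics.
spark = SparkSession.builder.appName("orders_elt").getOrCreate()

raw = (
    spark.read.format("json")
    .load("abfss://landing@examplelake.dfs.core.windows.net/orders/")  # hypothetical path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")  # date partitioning lets downstream queries prune files
    .save("abfss://curated@examplelake.dfs.core.windows.net/orders/")
)
```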
London (City of London), South East England, United Kingdom
Mondrian Alpha
technologists, and analysts to enhance the quality, timeliness, and accessibility of data. Contribute to the evolution of modern cloud-based data infrastructure, working with tools such as Airflow, Kafka, Spark, and AWS. Monitor and troubleshoot data workflows, ensuring continuous delivery of high-quality, analysis-ready datasets. Play a visible role in enhancing the firm’s broader data strategy … ability in Python (including libraries such as pandas and NumPy) and proficiency with SQL. Confident working with ETL frameworks, data modelling principles, and modern data tools (Airflow, Kafka, Spark, AWS). Experience working with large, complex datasets from structured, high-quality environments, e.g. consulting, finance, or enterprise tech. STEM degree in Mathematics, Physics, Computer Science, Engineering, or a …
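As a rough illustration of the Airflow-based workflow building and monitoring mentioned above, below is a minimal sketch of a daily DAG with a pandas transformation step. The pipeline name, task logic, and sample data are hypothetical; it assumes Airflow 2.4+ with the TaskFlow API.

```python
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_prices_pipeline():
    """Hypothetical pipeline: pull raw prices, clean them, report the row count."""

    @task
    def extract() -> list[dict]:
        # Placeholder for a real extract (e.g. S3, Kafka, or a SQL query).
        return [{"ticker": "ABC", "price": 101.5}, {"ticker": "XYZ", "price": None}]

    @task
    def transform(rows: list[dict]) -> int:
        df = pd.DataFrame(rows).dropna(subset=["price"])  # drop unpriced rows
        return len(df)

    transform(extract())


daily_prices_pipeline()
```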
be on designing and maintaining the data pipelines that feed large-scale ML and research workflows. Day-to-day responsibilities include: Building and maintaining data pipelines using Python, SQL, Spark, and Google Cloud technologies (BigQuery, Cloud Storage). Ensuring pipelines are robust, reliable, and optimised for AI/ML use cases. Developing automated tests, documentation, and monitoring for production … best practices, and continuously improving performance and quality. Tech Stack & Skills. Core Skills: Strong experience with Python and SQL in production environments. Proven track record developing data pipelines using Spark, BigQuery, and cloud tools (preferably Google Cloud). Familiarity with CI/CD and version control (git, GitHub, DevOps workflows). Experience with unit testing (e.g., pytest) and automated quality checks …
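To ground the automated-testing expectation above, here is a small sketch of a pure transformation function together with a pytest unit test for it. The function, column names, and cleaning rule are hypothetical; in a real pipeline the cleaned frame would then be loaded into BigQuery via the BigQuery client library or a Spark connector.

```python
import pandas as pd


def deduplicate_events(df: pd.DataFrame) -> pd.DataFrame:
    """Keep the latest record per event_id (hypothetical cleaning rule)."""
    return (
        df.sort_values("ingested_at")
        .drop_duplicates(subset="event_id", keep="last")
        .reset_index(drop=True)
    )


def test_deduplicate_events_keeps_latest():
    df = pd.DataFrame(
        {
            "event_id": [1, 1, 2],
            "ingested_at": ["2024-01-01", "2024-01-02", "2024-01-01"],
            "value": [10, 20, 30],
        }
    )
    out = deduplicate_events(df)
    assert len(out) == 2
    # The later ingestion for event 1 (value 20) should win.
    assert out.loc[out["event_id"] == 1, "value"].item() == 20
```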
London (City of London), South East England, United Kingdom
develop
team leadership and upskilling responsibilities. Key Responsibilities: Build and maintain Databricks Delta Live Tables (DLT) pipelines across Bronze → Silver → Gold layers, ensuring quality, scalability, and reliability. Develop and optimise Spark (PySpark) jobs for large-scale distributed processing. Design and implement streaming data pipelines with Kafka/MSK, applying best practices for late event handling and throughput. Use Terraform and … role). Mentor and upskill engineers, define coding standards, and embed engineering excellence across the team. What’s Expected: Proven experience delivering end-to-end data pipelines in Databricks and Spark environments. Strong understanding of data modelling, schema evolution, and data contract management. Hands-on experience with Kafka, streaming architectures, and real-time processing principles. Proficiency with Docker, Terraform, and …
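As a sketch of the medallion-style pipeline described above, the snippet below uses the Databricks Delta Live Tables Python API to define a Bronze table fed from Kafka and a Silver table derived from it, with a watermark to bound late-event state. The broker address, topic name, and JSON fields are hypothetical, and the code only runs inside a Databricks DLT pipeline, where the dlt module and the spark session are provided.

```python
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Bronze: raw events ingested from Kafka (hypothetical topic).")
def bronze_events():
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "events")                     # hypothetical topic
        .load()
    )


@dlt.table(comment="Silver: parsed, validated, and de-duplicated events.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
def silver_events():
    parsed = dlt.read_stream("bronze_events").select(
        F.get_json_object(F.col("value").cast("string"), "$.event_id").alias("event_id"),
        F.get_json_object(F.col("value").cast("string"), "$.payload").alias("payload"),
        F.col("timestamp").alias("event_ts"),
    )
    # Watermark plus dedup on id and event time bounds how long state for
    # late or duplicate events is retained.
    return parsed.withWatermark("event_ts", "1 hour").dropDuplicates(["event_id", "event_ts"])
```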
Milton Keynes, Buckinghamshire, South East, United Kingdom
InfinityQuest Ltd
business stakeholders to understand data requirements. Optimize data workflows for performance and reliability. Ensure data quality, integrity, and security across systems. Work with large datasets using tools like Hadoop, Spark, and SAS. Integrate data from various sources including IBM Mainframe systems. Troubleshoot and resolve data-related issues efficiently. Required Skills & Experience: Proven experience as a Data Engineer with a … foundation in data analysis. Expert-level proficiency in SAS for data manipulation and reporting. Working knowledge of IBM Mainframe systems and data structures. Advanced programming skills in Hadoop, SQL, Spark, and Python. Strong problem-solving and analytical skills. Experience with data modeling, warehousing, and performance tuning. Familiarity with Santander UK systems and processes is a strong advantage. Preferred Qualifications …
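To ground the Hadoop/Spark side of the skill set above, here is a minimal PySpark sketch that joins a Hive-managed warehouse table with a flat-file extract (standing in for data already offloaded from a mainframe) and writes the result back to the warehouse. The table names, path, and columns are hypothetical, and Hive support is assumed to be configured on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("account_enrichment")
    .enableHiveSupport()  # assumes a Hive metastore is available
    .getOrCreate()
)

# Hypothetical warehouse table of accounts.
accounts = spark.table("warehouse.accounts")

# Hypothetical extract previously offloaded from the mainframe as CSV.
balances = (
    spark.read.option("header", True)
    .csv("hdfs:///landing/mainframe/balances.csv")
    .withColumn("balance", F.col("balance").cast("decimal(18,2)"))
)

enriched = (
    accounts.join(balances, on="account_id", how="left")
    .withColumn("has_balance", F.col("balance").isNotNull())
)

enriched.write.mode("overwrite").saveAsTable("warehouse.accounts_enriched")
```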
London, South East England, United Kingdom Hybrid / WFH Options
Attis
on engineering role with a strong focus on data, customer collaboration, and real-world outcomes. Essential Requirements: Proficiency in Python and/or TypeScript. Experience with distributed systems (e.g. Spark, Kafka, Hadoop). Background in AI/ML, ideally with exposure to GenAI or LLMs. SC Clearance. Strong communication skills and ability to gather requirements from non-technical stakeholders. Bachelor … SEO Keywords for Search: Forward Deployed Engineer, FDE, Software Engineer, Data Engineer, Machine Learning Engineer, AI Engineer, GenAI, LLM, Python Developer, TypeScript Developer, Full Stack Engineer, Distributed Systems Engineer, Spark, Kafka, Hadoop, React Developer, Enterprise Software Engineer, Consulting Engineer, Analytics Engineer, Technical Consultant, Palantir Foundry, AIP, Hybrid Software Engineer, London Tech Jobs, Mission-Driven Engineering Roles, FDSE, Python, Java …
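As a small illustration of the Kafka experience listed above, here is a minimal consumer sketch using the kafka-python package. The topic name, broker address, and consumer group are hypothetical placeholders.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic and broker; in practice these come from configuration.
consumer = KafkaConsumer(
    "user-events",
    bootstrap_servers="localhost:9092",
    group_id="demo-group",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Blocks and iterates over messages as they arrive.
for message in consumer:
    event = message.value
    print(f"partition={message.partition} offset={message.offset} event={event}")
```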
London (City of London), South East England, United Kingdom
Aurum Search Limited
Our client is a leading hedge fund in London, leveraging cutting-edge technology and data to drive investment decisions across diverse asset classes. They're looking for a skilled Data Engineer to play a crucial role in building and optimizing …
London, South East England, United Kingdom Hybrid / WFH Options
Hirexa Solutions UK
Title: Big Data with Java
Location: London, UK (Hybrid)
Employment Type: Contract
Job Description: Java – must have. Big Data – must have. Interview includes a coding test. Scala/Spark.
Good Big Data resource with the below skillset:
• Java and Big Data technologies.
• Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.).
• Experience in Big Data technologies; real-time data … processing platform (Spark Streaming) experience would be an advantage.
• Consistently demonstrates clear and concise written and verbal communication.
• A history of delivering against agreed objectives.
• Ability to multi-task and work under pressure.
• Demonstrated problem-solving and decision-making skills.
• Excellent analytical and process-based skills, i.e. process flow diagrams, business modelling …
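The Spark Streaming requirement above can be illustrated with the classic Structured Streaming word-count pattern; it is shown in PySpark for consistency with the other sketches here, although this particular role is Java/Scala-centred. The socket source and port are hypothetical and chosen only because they need no external infrastructure.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming_wordcount").getOrCreate()

# Hypothetical source: a text socket on localhost:9999 (e.g. fed by `nc -lk 9999`).
lines = (
    spark.readStream.format("socket")
    .option("host", "localhost")
    .option("port", 9999)
    .load()
)

counts = (
    lines.select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))
    .groupBy("word")
    .count()
)

# Complete output mode re-emits the full aggregate on every trigger.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```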