observability frameworks, including lineage tracking, SLAs, and data quality monitoring. Familiarity with modern data lake table formats such as Delta Lake, Iceberg, or Hudi. Background in stream processing (Kafka, Flink, or similar ecosystems). Exposure to containerisation and orchestration technologies such as Docker and Kubernetes. More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks. More ❯
London, South East England, United Kingdom Hybrid / WFH Options
BondAval
on when needed. Nice to haves Experience leading technical discovery or architecture definition in a scaling SaaS or fintech environment. Familiarity with event-driven or streaming architectures (Kafka, Apache Flink, etc.). Practical exposure to AI/LLM orchestration frameworks or fine-tuning workflows. Experience designing developer tools, data platforms, or intelligent systems. Interest in or experience mentoring engineers More ❯
solution development. What You'll Do This role will allow you to master the three pillars of every organisation: Software Engineering, Infrastructure, and Data. Software Engineering: Develop microservices, libraries, Flink jobs, data pipelines, and Kubernetes controllers. Stakeholder Collaboration: Work closely with both technical and non-technical stakeholders to define, design, and implement solutions within our Data Platform. Trusted relationship More ❯
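As an illustration of the kind of Flink job this listing mentions, here is a minimal sketch using the Flink DataStream API in Java; the class name, the in-memory test elements, and the window size are hypothetical and not taken from the posting:

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

// Minimal Flink job sketch: count events per key over a 10-second processing-time window.
public class ClickCountJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("user-a", "user-b", "user-a", "user-c", "user-a") // stand-in for a real source
            .map(user -> Tuple2.of(user, 1))
            .returns(Types.TUPLE(Types.STRING, Types.INT))                 // lambdas lose generic type info
            .keyBy(t -> t.f0)                                              // partition by user id
            .window(TumblingProcessingTimeWindows.of(Time.seconds(10)))
            .sum(1)                                                        // aggregate the count field
            .print();                                                      // stand-in for a real sink

        env.execute("click-count");
    }
}
```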
London, South East England, United Kingdom Hybrid / WFH Options
Compare the Market
financial services. Exposure to Databricks, container orchestration (e.g. Kubernetes), or workflow engines (e.g. Argo, Airflow). Familiarity with real-time model deployment, streaming data, or event-driven systems (e.g. Kafka, Flink). Interest in MLOps, model governance, and responsible AI practices. Understanding of basic model evaluation, drift detection, and monitoring techniques. Why Join Us? You'll work on meaningful problems using More ❯
and constructive feedback to foster accountability, growth, and collaboration within the team. Who You Are Experienced with Data Processing Frameworks: Skilled with higher-level JVM-based frameworks such as Flink, Beam, Dataflow, or Spark. Comfortable with Ambiguity: Able to work through loosely defined problems and thrive in autonomous team environments. Skilled in Cloud-based Environments: Proficient with large-scale More ❯
Java, data structures and concurrency, rather than relying on frameworks such as Spring. You have built event-driven applications using Kafka and solutions with event-streaming frameworks at scale (Flink/Kafka Streams/Spark) that go beyond basic ETL pipelines. You know how to orchestrate the deployment of applications on Kubernetes, including defining services, deployments, stateful sets etc. More ❯
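For context on the event-streaming work this listing describes, a minimal Kafka Streams topology sketch in Java; the broker address, topic names, and filter predicate are assumptions for illustration only:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

// Minimal event-driven app sketch: read order events, keep the paid ones, forward them downstream.
public class OrderEventsApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-events-app");   // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");            // hypothetical input topic
        orders
            .filter((key, value) -> value != null && value.contains("\"status\":\"PAID\""))
            .to("paid-orders");                                               // hypothetical output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```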
London, South East England, United Kingdom Hybrid / WFH Options
Quant Capital
Java – if you haven’t been siloed in a big firm then don’t worry. Additional exposure to the following is desired: Tech stack you will learn: Hadoop and Flink; RUST, Javascript, React, Redux, Flow; Linux, Jenkins; Kafka, Avro, Kubernetes, Puppet. Involvement in the Java community. My client is based in London. Home working is encouraged but you will need More ❯
required in the role; we are happy to support your learning on the job, but prior experience is a plus: Experience with large-scale data processing frameworks (e.g., Spark, Flink). Experience with time series analysis, anomaly detection, or graph analytics in a security context. Proficiency in data visualization tools and techniques to effectively communicate complex findings. A basic More ❯
to cross-functional teams, ensuring best practices in data architecture, security and cloud computing. Proficiency in data modelling, ETL processes, data warehousing, distributed systems and metadata systems. Utilise Apache Flink and other streaming technologies to build real-time data processing systems that handle large-scale, high-throughput data. Ensure all data solutions comply with industry standards and government regulations … not limited to EC2, S3, RDS, Lambda and Redshift. Experience with other cloud providers (e.g., Azure, GCP) is a plus. In-depth knowledge and hands-on experience with Apache Flink for real-time data processing. Proven experience in mentoring and managing teams, with a focus on developing talent and fostering a collaborative work environment. Strong ability to engage with More ❯
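As a sketch of wiring Apache Flink to a streaming source for the kind of real-time processing described above, assuming the Flink Kafka connector; the broker address, topic, consumer group, and transformation are placeholders:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Minimal real-time pipeline sketch: consume a Kafka topic with Flink and apply a simple transformation.
public class RealtimeIngestJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> source = KafkaSource.<String>builder()
            .setBootstrapServers("localhost:9092")           // placeholder broker address
            .setTopics("raw-events")                         // placeholder topic
            .setGroupId("realtime-ingest")                   // placeholder consumer group
            .setStartingOffsets(OffsetsInitializer.latest())
            .setValueOnlyDeserializer(new SimpleStringSchema())
            .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source")
            .map(value -> "processed:" + value)              // stand-in for real enrichment logic
            .returns(Types.STRING)                           // declare output type for the lambda
            .print();                                        // stand-in for a real sink

        env.execute("realtime-ingest");
    }
}
```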
at scale. This is a hands-on engineering role that blends software craftsmanship with data architecture expertise. Key responsibilities: Design and implement high-throughput data streaming solutions using Kafka, Flink, or Confluent. Build and maintain scalable backend systems in Python or Scala, following clean code and testing principles. Develop tools and frameworks for data governance, privacy, and quality monitoring … data use cases. Contribute to an engineering culture that values testing, peer reviews, and automation-first principles. What You'll Bring Strong experience in streaming technologies such as Kafka, Flink, or Confluent. Advanced proficiency in Python or Scala, with a solid grasp of software engineering fundamentals. Proven ability to design, deploy, and scale production-grade data platforms and backend More ❯
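To illustrate the high-throughput Kafka side of such a platform, a minimal Java producer sketch tuned for batching and compression; the broker address, topic, payloads, and specific setting values are hypothetical:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Minimal producer sketch: batch and compress records to favour throughput.
public class EventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Throughput-oriented settings: wait briefly to fill larger batches and compress them.
        props.put(ProducerConfig.LINGER_MS_CONFIG, 20);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 64 * 1024);
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // durability; relax only if loss is acceptable

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 1_000; i++) {
                producer.send(new ProducerRecord<>("events", "key-" + i, "payload-" + i)); // hypothetical topic
            }
            producer.flush();
        }
    }
}
```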