software architecture design principles. Payment domain expertise and the ability to drive the team across integrations, migrations, and solution approaches. Experience with Angular and other front-end technologies is good to have. Understanding of Kafka, PCF, integration patterns, security standards, concurrency and multi-threading, collections, PostgreSQL, Azure, Docker, and Kubernetes. Hands-on, high-energy, detail-oriented, proactive, and able to function independently under pressure. Strong
lakehouse resources. Proven experience in building MLOps pipelines, tracking model lifecycle, and integrating with modern ML frameworks (e.g., scikit-learn, XGBoost, TensorFlow). Exposure to streaming data pipelines (e.g., Kafka, Structured Streaming) and real-time analytics architectures is a strong plus. Experience implementing robust DevOps practices for data engineering: versioning, testing frameworks, deployment automation, monitoring. Familiarity with data governance
City of London, Greater London, UK Hybrid / WFH Options
Areti Group | B Corp
experience. Experience with Palantir Foundry (full training provided). Familiarity with AI/ML Ops pipelines, real-time analytics, or edge deployments. Big Data stack knowledge (e.g., Hadoop, Spark, Kafka). GenAI/LLM experience (e.g., AWS Bedrock, LangChain). Why this is a great move. Mission & impact: Work on projects where data-driven decisions have real-world consequences.
City of London, Greater London, UK Hybrid / WFH Options
develop
a SaaS or cloud-first environment. Hands-on experience with Docker and Kubernetes. Familiarity with Domain-Driven Design (DDD) and CQRS patterns. Exposure to serverless architectures and event streaming (Kafka, SNS/SQS, etc.). What's on Offer: Salary up to £120,000 (DOE) + annual bonus. Hybrid working, typically 1 day per fortnight in the London office. 27 days holiday
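Several of these roles name Domain-Driven Design and CQRS. As a minimal illustration of the command/query split (all names here are hypothetical, and Python is used for brevity rather than the stacks listed above), commands mutate state through a write-side handler while queries read from a separate projection:

```python
from dataclasses import dataclass

# Write side: commands are explicit intent, handled in one place.
@dataclass
class CreateOrder:
    order_id: str
    amount: float

class CommandHandler:
    def __init__(self, event_log, read_model):
        self.event_log = event_log
        self.read_model = read_model

    def handle(self, cmd: CreateOrder):
        event = {"type": "OrderCreated", "order_id": cmd.order_id, "amount": cmd.amount}
        self.event_log.append(event)   # append-only write model
        self.read_model.apply(event)   # projection keeps the read side current

# Read side: a denormalised view optimised for queries, never mutated directly.
class OrderSummaryView:
    def __init__(self):
        self.totals = {}

    def apply(self, event):
        if event["type"] == "OrderCreated":
            self.totals[event["order_id"]] = event["amount"]

    def get_total(self, order_id):
        return self.totals.get(order_id)

view = OrderSummaryView()
handler = CommandHandler(event_log=[], read_model=view)
handler.handle(CreateOrder("A-1", 250.0))
print(view.get_total("A-1"))
```

In a production system the projection would typically be updated asynchronously from the event log, which is where the event-streaming experience (Kafka, SNS/SQS) these ads ask for comes in.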
City of London, Greater London, UK Hybrid / WFH Options
Futuria
Cohere, etc., or Hugging Face models/sentence-transformers. Solid understanding of data modeling, warehousing, and performance optimization. Experience with messaging middleware and streaming (e.g. NATS Jetstream, Redis Streams, Apache Kafka, or Pulsar). Hands-on experience with data lakes, lakehouses, or components of the modern data stack. Exposure to MLOps tools and best practices. Exposure to workflow orchestration frameworks
to operate in a rapidly evolving environment Desirable Skills: Financial markets knowledge, in particular fixed income. Exposure to interacting with C++ libraries. Familiarity with JavaScript. Messaging technologies such as Kafka or RabbitMQ.
or PostgreSQL. Knowledge of CI/CD principles and Agile/Scrum development practices. Excellent communication skills and a proactive, solution-focused mindset. Desirable extras: Experience with GraphQL, gRPC, Kafka, or Reactive Extensions (RX). Exposure to the energy, commodities, or financial services sectors. This is an opportunity to contribute to an ambitious, data-driven organisation where your work
cloud-native services. Solid SQL skills and scripting experience (Python preferred). Expertise in data modelling and mapping complex transformations. Familiarity with ETL/ELT tools and streaming technologies (Kafka, Kinesis). Experience with CI/CD pipelines and infrastructure as code (Terraform or CloudFormation). Excellent communication skills; able to translate business needs into technical solutions. A proactive
Built or deployed software used by enterprise customers. Experience with applied machine learning or MLOps (not just academic or notebook-based work). Exposure to event-driven architectures, streaming systems (Kafka, etc.), or real-time data pipelines. Frontend experience in React, TypeScript, or similar modern frameworks. What We Offer: Early-stage equity and the opportunity to shape foundational decisions. High
experience (RHEL preferred). Scripting experience in Python, Ruby, or Bash. BSc in Computer Science, Engineering, or related field. Bonus: Ansible, Chef, Puppet, GCP, Prometheus, Grafana, Kibana, Datadog, Elasticsearch, Logstash, Kafka, Kubernetes. This is a rare opportunity to join one of the most advanced systematic trading environments globally, building exposure to cutting-edge infrastructure, large-scale distributed systems, and next
City of London, Greater London, UK Hybrid / WFH Options
Bondaval
in using AI tools to improve workflow and product outcomes. A degree in Computer Science (or similar) from a good university is highly desirable. Nice to Have: Familiarity with message brokers (Kafka, SQS/SNS, RabbitMQ). Knowledge of real-time streaming (Kafka Streams, Apache Flink, etc.). Exposure to big-data or machine-learning frameworks (TensorFlow, PyTorch, Hugging Face)
City of London, Greater London, UK Hybrid / WFH Options
Formula Recruitment
teams. Key Skills: Experience with Azure cloud data lakes and services (Data Factory, Synapse, Databricks). Skilled in ETL/ELT pipeline development and big data tools (Spark, Hadoop, Kafka). Strong Python/PySpark programming and advanced SQL with query optimisation. Experience with relational, NoSQL, and graph databases. Familiar with CI/CD, version control, and infrastructure as code
SR2 | Socially Responsible Recruitment | Certified B Corporation
/Data Engineer & SysAdmin, you will: Operate and improve a critical hybrid setup spanning AWS services and on-premises resources. Maintain and enhance core data systems, including PostgreSQL databases, Kafka clusters, and data ingestion pipelines. Modernize infrastructure by migrating systems into AWS CDK-based TypeScript automation. Drive service reliability through improving monitoring and alerts for critical services feeding Kafka.
observability at global scale. What You'll Do: Own platform architecture for our next-gen ledger infrastructure. Scale multi-region Kubernetes environments across cloud & on-prem. Harden distributed systems (Kafka, Redis, CockroachDB) for global banking workloads. Lead our AI-powered SRE approach: observability, remediation, and auto-response. Enforce zero-trust, multi-tenant security and compliance (SOC2, ISO 27001). Define
Senior Java Developer - Java, Kotlin, Concurrency, Kafka, RDBMS, Unix, Linux, Front Office, Multi-threading Role Overview: I am seeking a Senior Java Developer to join a leading Investment Bank as part of their global front-office technology team. This is a hands-on position focused on working with a real-time, event-driven system that underpins our industry-leading … event processing and data management. Key Responsibilities: Develop, implement, and maintain highly performant, secure, and scalable Java applications. Java server-side concurrent programming with Spring Boot (Core Spring). Leverage Kafka or other message queue systems to handle asynchronous processing. Design, optimize, and maintain MongoDB (NoSQL) database schemas. Craft and optimize advanced SQL queries for relational databases. Create comprehensive unit … full ownership of feature development from design through to production deployment. Collaborate effectively with cross-functional teams, balancing independence and teamwork. Key Skills: Java; Kotlin; Concurrency; Message-driven architectures (Kafka, MQ); SQL queries; Unix/Linux scripting; Multithreading. This is a full-time role offering a salary of up to £130k. You will be required to attend the office
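The asynchronous processing this role describes, where publishing an event is decoupled from handling it, can be sketched with an in-memory queue standing in for a Kafka topic. This is a deliberate simplification (a real Kafka consumer polls a broker and commits offsets), and the payloads are invented, but the decoupling is the same idea:

```python
import queue
import threading

events = queue.Queue()   # stand-in for a Kafka topic
results = []

def consumer():
    # Worker drains the queue asynchronously, like a consumer poll loop.
    while True:
        msg = events.get()
        if msg is None:                  # sentinel to shut the worker down
            break
        results.append(msg.upper())      # the "event processing" step
        events.task_done()

t = threading.Thread(target=consumer)
t.start()

# Producer side: publishing returns immediately; processing happens elsewhere.
for payload in ["trade.booked", "trade.amended"]:
    events.put(payload)

events.put(None)
t.join()
print(results)
```

With a single consumer the processing order matches publish order, which mirrors Kafka's per-partition ordering guarantee; parallelism comes from adding consumers (partitions), at the cost of cross-partition ordering.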
City of London, Greater London, UK Hybrid / WFH Options
Fruition Group
Go Developer will design, develop, and implement data-intensive applications across the full engineering lifecycle. You'll architect and deliver microservices-based systems using Go (Golang), AWS, Kubernetes, Docker, and Kafka, working closely with cross-functional teams to build scalable, reliable, and resilient platforms. You'll also play a key role in optimising system performance, improving reliability, and ensuring scalability, while … including system design and architecture. Background in complex, large-scale, data-driven applications. Product-focused approach, ideally within fast-paced tech organisations (start-ups or scale-ups). Knowledge of Kafka, Cassandra, gRPC, and microservices is a strong advantage. Open-source contributions are beneficial. If you're a Senior Go Developer looking for a challenging 6-month contract with a forward
structured, and streaming data. Design and improve data models to support analytics and machine learning use cases. Develop and support real-time data workflows using technologies such as Apache Kafka. Monitor and ensure data quality, reliability, and security across the stack. Contribute to the evolution of data engineering practices, tooling, and automation. Collaborate with analytics, product, and data science … cloud-based data environments (e.g. AWS, GCP, or Azure). Exposure to modern data tools such as Airflow, dbt, or Snowflake. Experience or strong interest in streaming technologies like Apache Kafka. Interest in MLOps and modern data engineering best practices. Why join: You'll be part of a company with a clear mission and strong data culture, joining a team that
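Monitoring data quality on a real-time workflow, as this role asks for, usually means validating each record as it flows through rather than in a batch afterwards. A toy sketch (illustrative only; a production version would sit inside a Kafka consumer or stream processor, and the field names are invented):

```python
def validate_stream(records):
    """Yield (record, errors) pairs for a stream of event dicts."""
    required = {"event_id", "ts", "value"}
    for rec in records:
        errors = []
        missing = required - rec.keys()
        if missing:
            errors.append(f"missing fields: {sorted(missing)}")
        if "value" in rec and not isinstance(rec["value"], (int, float)):
            errors.append("value is not numeric")
        yield rec, errors   # downstream can route bad records to a dead-letter topic

stream = [
    {"event_id": 1, "ts": 1700000000, "value": 3.2},
    {"event_id": 2, "ts": 1700000001},   # missing "value"
]
for rec, errors in validate_stream(stream):
    print(rec["event_id"], errors or "ok")
```

Because the validator is a generator, it composes with other streaming stages without buffering the whole feed, which is the property that matters once the source is an unbounded Kafka topic.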