Data Engineer - Real-Time Streaming Systems

Overview

🏢 Company | Global trading & technology group

👤 Job | Data Engineer – Real-Time Streaming Systems

🎯 Impact | Low-latency data for decision-making, analytics & automated workflows

📏 Team | Small senior engineering group

🌟 Stack | Azure, Kafka, Snowflake, Python/Scala/Java

📍 Location | London

💻 Work model | Hybrid (office-based)

💰 Offer | Competitive salary + performance bonus

💎 Benefits | Strong learning budget, certifications, conferences, full health package

The Work

You enjoy building systems where speed and reliability actually matter.

Here, you’ll take fast-moving external feeds, normalise them, and turn them into high-quality real-time datasets used across engineering, analytics and automation. If you like low-latency engineering, repeatable pipelines and being the reason things “just work”, this will suit you.

You’ll get:

  • Real ownership over a streaming platform used across the business.
  • Direct collaboration with engineers, data teams, and technical end-users.
  • A modern environment: Azure, Kafka, Snowflake, Databricks, CI/CD, TDD.
  • Space to influence good engineering habits: automation, observability, data contracts.
  • Long-term investment in a growing technology function.

What you’ll be doing

You’ll:

  • Build and operate a high-availability, low-latency streaming pipeline.
  • Pull external feeds and standardise them into clean, structured datasets.
  • Add reliability features: retries, validation, redundancy, graceful failover (a minimal sketch follows this list).
  • Apply testing and automation across ingestion, transformation and storage.
  • Define APIs, schemas and platform patterns other teams can depend on.
  • Build monitoring for latency, quality and system health — and use it to drive improvements.
  • Work closely with engineers, data scientists and analysts to wire your data into models and systems.
  • Keep the platform documented and continually improved.
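
To make the day-to-day concrete, here is a minimal sketch of the kind of validating ingestion loop described above, assuming the Confluent Kafka Python client; the broker address, topic names and message fields are invented for the example:

```python
# Minimal sketch of a validating Kafka ingestion loop.
# Assumptions (illustrative only): broker address, topic names
# and the feed's JSON fields are invented for this example.
import json
import logging

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed broker
    "group.id": "feed-normaliser",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["raw-feed"])              # assumed source topic

REQUIRED_FIELDS = {"symbol", "price", "ts"}   # assumed schema


def normalise(raw: bytes) -> dict:
    """Validate and standardise one raw message; raise on bad data."""
    record = json.loads(raw)
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing fields: {missing}")
    record["price"] = float(record["price"])
    return record


try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            logging.error("consumer error: %s", msg.error())
            continue
        try:
            clean = normalise(msg.value())
            producer.produce("clean-feed", json.dumps(clean).encode())
        except (ValueError, json.JSONDecodeError):
            # Route bad records to a dead-letter topic rather than dropping them.
            producer.produce("raw-feed.dlq", msg.value())
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```

The dead-letter topic keeps bad records inspectable instead of silently dropped, which is the kind of reliability habit the bullet points above are describing.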

You’ll also help steer direction:

  • Prioritise data sources that meaningfully improve decision-making and analytics.
  • Evaluate new external providers and integrate them safely and at scale.
  • Contribute to new data products: curated datasets, feature stores, and real-time decision APIs (see the sketch after this list).
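
For a sense of what a real-time decision API could look like, here is an illustrative sketch; FastAPI, the endpoint path, the feature names and the toy decision rule are all assumptions, not the team's actual design:

```python
# Illustrative-only sketch of a real-time decision API.
# FastAPI is an assumption; the team's actual framework isn't specified.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Stand-in for a feature-store lookup; a real implementation would
# read from a low-latency store rather than a module-level dict.
FEATURES: dict[str, dict] = {"ABC": {"spread_bps": 1.2, "volume_z": 0.4}}


@app.get("/decision/{symbol}")
def decide(symbol: str) -> dict:
    features = FEATURES.get(symbol)
    if features is None:
        raise HTTPException(status_code=404, detail="unknown symbol")
    # Toy rule standing in for a real model call.
    action = "quote" if features["spread_bps"] < 2.0 else "hold"
    return {"symbol": symbol, "action": action, "features": features}
```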

What you’ll bring

You probably have:

  • 3+ years in data engineering or real-time systems.
  • Experience with high-frequency or event-driven pipelines.
  • Strong coding ability in Python, Scala or Java.
  • Streaming expertise with Kafka / Confluent.
  • Practical experience with Azure (ADLS Gen2, Event Hubs, Databricks/Synapse, Functions, Data Factory, Key Vault).
  • Solid experience with Snowflake in production.
  • Good engineering fundamentals: tests, CI/CD, automation, version control.

Nice to have:

  • Airflow or similar orchestration tools.
  • Data quality frameworks (e.g., Great Expectations; see the sketch after this list).
  • Terraform/Bicep for IaC.
  • Experience in environments where milliseconds and data accuracy matter.
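
As one example of the data-quality tooling mentioned above, here is a minimal gate using Great Expectations' classic pandas API (pre-1.0 releases; newer versions changed this interface, and the column names are invented for the example):

```python
# Hedged sketch of a data-quality gate with Great Expectations'
# classic pandas API (pre-1.0). Column names are illustrative only.
import great_expectations as ge
import pandas as pd

df = pd.DataFrame({"symbol": ["ABC", "XYZ"], "price": [101.5, 99.8]})
batch = ge.from_pandas(df)

checks = [
    batch.expect_column_values_to_not_be_null("symbol"),
    batch.expect_column_values_to_not_be_null("price"),
    batch.expect_column_values_to_be_between("price", min_value=0),
]

if not all(check.success for check in checks):
    raise RuntimeError("data-quality checks failed; halting the pipeline")
```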

You’re the type who:

  • Spots anomalies quickly and goes hunting for root causes.
  • Communicates clearly in technical and cross-functional discussions.
  • Stays calm under pressure and fixes systems properly, not temporarily.

📅 Interview Process | CV review → Intro call → Technical interview → Team conversations → Offer

If you want to build the backbone of real-time data for a global technology group, hit Apply or send your CV and we’ll set up a confidential chat.

Job Details

Company | Ocean Red Partners
Location | London, UK
Work model | Hybrid