Data Engineer

πŸš€ Are you a Data Engineer who enjoys building production-grade pipelines, optimising performance, and working with modern Python tooling (DuckDB/Polars) on time-series datasets?

I’m supporting a UK-based fintech in their search for a hands-on Python Data Engineer to help build and improve the data infrastructure powering a unified data + analytics API for financial markets participants.

You’ll sit in an engineering/analytics team and take ownership of pipelines end-to-end β€” from onboarding new datasets through to reliability, monitoring and data quality in production.

In this role, you’ll:

  • πŸ”§ Build, streamline and improve ETL/data pipelines (prototype β†’ production)
  • πŸ“ˆ Ingest and normalise high-velocity time-series datasets from multiple external sources
  • βš™οΈ Work heavily in Python with a modern stack including DuckDB and Polars (plus Parquet/PyArrow)
  • 🧩 Orchestrate workflows and improve reliability (they use Temporal β€” similar orchestration experience is fine)
  • βœ… Improve data integrity and visibility: validations, automated checks, backfills, monitoring/alerting
  • πŸ“Š Support downstream analytics and client-facing outputs (dashboards/PDF/Plotly β€” lowest priority)

What’s in it for you?

  • πŸ“Œ Modern data stack – DuckDB/Polars + Parquet/Arrow in a genuinely hands-on environment
  • πŸ“ˆ Ownership & impact – You’ll be close to the data flows and have real influence on performance and reliability
  • 🏦 Market data exposure – Work with complex financial datasets (experience helpful, interest is enough)
  • 🏒 Hybrid working – London preferred, with 2–3 days in the office
  • ⚑ Start ASAP – Interviewing now

What my client is looking for:

  • Strong Python + SQL fundamentals (data engineering / ETL / pipeline ownership)
  • Hands-on experience with DuckDB and/or Polars (DuckDB especially valuable)
  • Experience operating pipelines in production (monitoring, backfills, incident/RCA mindset, data quality)
  • Cloud experience with demonstrable production use (Azure preferred)
  • Clear communicator, comfortable working across engineering/analytics stakeholders

Nice to have:

  • Time-series data experience (market data, telemetry, pricing, events)
  • Streaming exposure (Kafka/Event Hubs/Kinesis)
  • Experience with Temporal (or similar orchestrators like Airflow/Dagster/Prefect)
  • Any exposure to AI agents / automation tooling

πŸ‘‰ Apply now!

Job Details

Company
Intellect Group
Location
City of London, London, United Kingdom
Posted