Senior Data Engineer

London - 2 days a week in the office

Up to 95k

A high-growth digital platform is transforming a multi-billion-dollar global industry through innovative online technologies. The platform connects a large network of customers and partners worldwide, powering billions in annual transactions. With significant revenue growth and international reach, the company is modernising one of the last sectors to fully embrace digital transformation, building a global, category-defining business in the process.

As the organisation scales its AI and machine learning capabilities across search, recommendations, and analytics, it is investing heavily in robust data infrastructure to enable rapid experimentation, reliable insights, and data-driven decision-making. The Senior Data Engineer will collaborate with Product Managers, ML Engineers, Analysts, and Software Engineers to design and maintain the data pipelines and infrastructure that power AI-driven features and business intelligence, handling millions of events and requests daily.

Key Responsibilities:

  • Design, build, and maintain reliable ETL/ELT pipelines to support analytics, ML models, and business intelligence
  • Develop scalable batch and streaming data pipelines to process millions of events and transactions daily
  • Implement workflow orchestration using Airflow, Dagster, or similar tools
  • Build data validation and quality monitoring frameworks to ensure data accuracy and reliability
  • Collaborate cross-functionally with Software, ML, and Analytics teams to deliver production-ready data products
  • Mentor junior engineers and contribute to engineering best practices

Required Skills & Experience:

  • 5+ years of experience building and maintaining data pipelines in production environments
  • Strong Python and SQL skills (Pandas, PySpark, query optimisation)
  • Cloud experience (AWS preferred) including S3, Redshift, Glue, Lambda
  • Familiarity with data warehousing (Redshift, Snowflake, BigQuery)
  • Experience with workflow orchestration tools (Airflow, Dagster, Prefect)
  • Understanding of distributed systems, batch and streaming data (Kafka, Kinesis)
  • Knowledge of IaC (Terraform, CloudFormation) and containerisation (Docker, Kubernetes)

Nice to have:

  • Experience with dbt, feature stores, or ML pipeline tooling
  • Familiarity with Elasticsearch or real-time analytics (Flink, Materialize)
  • Exposure to eCommerce, marketplace, or transactional environments

Company
Xcede
Location
United Kingdom, UK
Employment Type
Part-time
Posted