Data Engineer – Permanent
Location: London (Hybrid)
Salary: £65,000–£95,000 + Benefits (DOE)
Employment Type: Full-time, Permanent
Overview
We are seeking an experienced Data Engineer to join our growing technology and analytics function in London. This role is ideal for someone who thrives on building modern data platforms, shipping high-quality data pipelines, and collaborating closely with Data Science, Engineering, and Product teams to deliver scalable data solutions.
The successful candidate will play a key role in designing, developing, and maintaining the data infrastructure that powers analytics, reporting, automation, and AI/ML use cases across the organisation.
Key Responsibilities
- Design, build, and maintain scalable, secure, and reliable data pipelines (batch and streaming).
- Develop and optimise ETL/ELT workflows to ensure efficient data ingestion and transformation.
- Build and maintain data models, feature stores, and data marts to support analytics and AI initiatives.
- Work with cloud platforms (AWS, Azure, or GCP) to design and deploy data solutions.
- Implement data quality rules, validation processes, and observability tooling.
- Collaborate closely with Data Scientists, Analysts, and Software Engineers to deliver high-impact projects.
- Contribute to the evolution of the data architecture, including modernising legacy systems.
- Support CI/CD, Infrastructure-as-Code, and best-practice engineering standards.
- Ensure governance, security, and compliance across all data assets.
Essential Skills & Experience
- Strong experience with Python and/or SQL in a production environment.
- Hands-on experience with modern data engineering frameworks such as:
  - Apache Spark, Databricks
  - Airflow / Prefect / Dagster
  - Kafka / Azure Event Hubs / Kinesis
- Experience building data solutions on at least one major cloud platform (AWS, Azure, or GCP).
- Strong understanding of data warehousing concepts, dimensional modelling, and schema design.
- Experience working with CI/CD pipelines and Git-based workflows.
- Familiarity with Infrastructure-as-Code (Terraform, CloudFormation, ARM, etc.).
- Strong understanding of data quality, testing, and observability practices.
Desirable Skills
- Experience supporting Machine Learning pipelines or feature engineering.
- Knowledge of containerisation (Docker, Kubernetes).
- Experience with dbt for modelling and transformation.
- Background in financial services, payments, trading, or other data-rich environments.
- Experience with real-time analytics and low-latency data architectures.
Personal Attributes
- Problem-solver with strong analytical thinking.
- Comfortable working in cross-functional teams.
- Proactive, curious, and able to work with autonomy.
- Strong communication skills, able to translate technical concepts to non-technical stakeholders.
Benefits
- Competitive salary + bonus
- Pension & healthcare
- Learning & development budget
- Hybrid working (typically 2–3 days in the London office)
- Opportunities to work on high-impact data and AI projects