Data Architect
Role Title: Senior Data Architect
Location: London, UK
Work Model: Hybrid (2 days/week onsite)
Contract Duration: 6–12 Months
Role Overview
We are seeking a hands-on Senior Data Architect to lead the design and delivery of scalable, event-driven data platforms for high-volume transactional systems, with a strong focus on payments and financial data.
You will define the end-to-end As-Is to To-Be data architecture, build resilient streaming pipelines, and architect AWS-native lakehouse platforms capable of handling tens of millions of events per day. This role requires close collaboration with engineering, platform, and business stakeholders to deliver secure, observable, and high-performance data products.
Key Responsibilities
Data Architecture & Products
- Design and deliver high-performance data products, including:
  - Channel Operations Warehouse (short-term, low-latency layer)
  - Channel Analytics Data Lake (long-term historical layer)
- Define and expose status and statement APIs with clear SLAs.
- Architect AWS lakehouse solutions using S3, Glue, Athena, Iceberg, and Redshift (see the sketch after this list).
- Enable analytics and dashboards using Amazon QuickSight.
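To give a flavour of the lakehouse work involved, below is a minimal, illustrative sketch in Python: registering an Iceberg table for the analytics lake through Athena. The bucket, database, and column names are placeholders, not project specifics.

```python
# Illustrative only: bucket names, database, and columns are placeholders.
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

DDL = """
CREATE TABLE channel_analytics.payment_events (
    event_id    string,
    payment_ref string,
    status      string,
    amount      decimal(18, 2),
    currency    string,
    event_time  timestamp
)
PARTITIONED BY (day(event_time))
LOCATION 's3://example-channel-analytics-lake/payment_events/'
TBLPROPERTIES ('table_type' = 'ICEBERG')
"""

# Athena runs DDL asynchronously; poll the query ID for completion.
response = athena.start_query_execution(
    QueryString=DDL,
    QueryExecutionContext={"Database": "channel_analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(response["QueryExecutionId"])
```

Partitioning on the day(event_time) transform lets Athena prune historical scans by date without consumers needing to know the physical layout.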
Streaming & Event-Driven Design
- Build event-driven pipelines using Kafka (Confluent/Amazon MSK), Kinesis Data Streams, and Kinesis Data Firehose (see the producer sketch after this list).
- Implement CDC, connectors, partitioning, replay, retention, and idempotency patterns.
- Define event contracts using Avro/Protobuf with Schema Registry, compatibility rules, and versioning.
- Use Amazon EventBridge for AWS-native event routing and filtering.
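The sketch below ties two of the patterns above together: an Avro event contract registered with a Schema Registry, and an idempotent producer keyed by payment reference so per-payment ordering survives partitioning. The broker, registry endpoint, topic name, and schema are all placeholders.

```python
# Illustrative only: broker, registry URL, topic, and schema are placeholders.
import json

from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import MessageField, SerializationContext

# The event contract; the Schema Registry enforces compatibility on evolution.
SCHEMA_STR = json.dumps({
    "type": "record",
    "name": "PaymentStatusEvent",
    "fields": [
        {"name": "payment_ref", "type": "string"},
        {"name": "status", "type": "string"},
        {"name": "event_time_ms", "type": "long"},  # epoch millis
    ],
})

registry = SchemaRegistryClient({"url": "https://schema-registry.example.com"})
serialize = AvroSerializer(registry, SCHEMA_STR)

producer = Producer({
    "bootstrap.servers": "broker.example.com:9092",
    "enable.idempotence": True,  # broker de-duplicates producer retries
})

event = {"payment_ref": "PMT-0001", "status": "SETTLED", "event_time_ms": 1735689600000}
producer.produce(
    topic="payments.status.v1",
    # Keying by payment reference keeps all events for one payment in order
    # on a single partition.
    key=event["payment_ref"].encode("utf-8"),
    value=serialize(event, SerializationContext("payments.status.v1", MessageField.VALUE)),
)
producer.flush()
```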
Migration & Transformation
- Assess existing sources, including APIs, file feeds, SWIFT messages, Aurora PostgreSQL, and Kafka topics (see the CDC lag check after this list).
- Define migration waves, cutover strategies, and runbooks.
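As one example of the source assessment involved, the sketch below measures how far the logical replication slots feeding CDC lag behind the WAL head on an Aurora PostgreSQL cluster; sustained growth in that lag is a signal to revisit cutover timing. Connection details are placeholders, and the check assumes logical replication is already enabled.

```python
# Illustrative only: connection details are placeholders; assumes logical
# replication is enabled on the Aurora PostgreSQL cluster.
import psycopg2

SLOT_LAG_SQL = """
SELECT slot_name,
       active,
       pg_size_pretty(
           pg_wal_lsn_diff(pg_current_wal_lsn(), confirmed_flush_lsn)
       ) AS replication_lag
FROM pg_replication_slots
WHERE slot_type = 'logical';
"""

conn = psycopg2.connect(
    host="aurora-cluster.example.eu-west-2.rds.amazonaws.com",
    dbname="payments",
    user="cdc_assessor",
    password="...",  # placeholder; prefer IAM auth or Secrets Manager
)
with conn, conn.cursor() as cur:
    cur.execute(SLOT_LAG_SQL)
    for slot_name, active, lag in cur.fetchall():
        print(f"{slot_name}: active={active}, lag behind WAL head={lag}")
conn.close()
```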
Governance, Security & Quality
- Apply data mesh and data-as-a-product principles.
- Define data ownership, access controls, lineage, and retention.
- Implement security using IAM, KMS encryption, tokenization, and audit trails (see the sketch after this list).
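A small illustration of the expected security posture: landing an object on S3 with a customer-managed KMS key specified per write. The bucket and key ARN are placeholders.

```python
# Illustrative only: bucket and key ARN are placeholders.
import boto3

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-channel-ops-warehouse",
    Key="statements/2025/01/statement-0001.json",
    Body=b'{"payment_ref": "PMT-0001", "status": "SETTLED"}',
    ServerSideEncryption="aws:kms",  # encrypt at rest with KMS
    SSEKMSKeyId="arn:aws:kms:eu-west-2:111122223333:key/example-key-id",
)
```

In practice, a bucket policy denying unencrypted puts makes this enforceable rather than advisory, and CloudTrail records each use of the key for the audit trail.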
Observability & Performance
- Build monitoring using Grafana, Prometheus, and CloudWatch.
- Track KPIs such as throughput, consumer lag, success rate, and cost efficiency (see the metrics sketch after this list).
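For example, pipeline KPIs can be published as custom CloudWatch metrics and surfaced alongside Prometheus data in Grafana; the namespace, dimensions, and values below are placeholders.

```python
# Illustrative only: namespace, dimensions, and values are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")
cloudwatch.put_metric_data(
    Namespace="ChannelDataPlatform",
    MetricData=[
        {
            "MetricName": "EventsProcessed",  # throughput per reporting interval
            "Dimensions": [{"Name": "Pipeline", "Value": "payments-status"}],
            "Value": 125000,
            "Unit": "Count",
        },
        {
            "MetricName": "ConsumerLag",  # records behind the stream head
            "Dimensions": [{"Name": "Pipeline", "Value": "payments-status"}],
            "Value": 42,
            "Unit": "Count",
        },
    ],
)
```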
Hands-on Engineering
- Develop and review code in Python, Scala, and SQL.
- Build pipelines using Spark, AWS Glue, Lambda, and Step Functions (see the handler sketch after this list).
- Implement infrastructure as code with Terraform and CI/CD pipelines using GitLab CI or Jenkins.
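Representative of the day-to-day engineering: a minimal Lambda handler that decodes Kinesis records and skips malformed payloads instead of failing the whole batch. The process function is a hypothetical stub, not a real API.

```python
# Illustrative only: `process` is a hypothetical downstream stub.
import base64
import json


def handler(event, context):
    """Decode Kinesis records and forward well-formed payment events."""
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        try:
            payment_event = json.loads(payload)
        except json.JSONDecodeError:
            # Skip (and in practice, dead-letter) malformed records rather
            # than failing and re-driving the whole batch.
            print(f"skipping malformed record {record['kinesis']['sequenceNumber']}")
            continue
        process(payment_event)


def process(payment_event):
    # Placeholder for the real downstream write (warehouse, lake, or API).
    print(payment_event["payment_ref"], payment_event["status"])
```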
Must-Have Skills & Experience
- Total experience: 12+ years in data/engineering roles
- Architecture experience: 5+ years as a Data, Solution, or Streaming Architect
- Strong expertise in Kafka (Confluent/MSK) and Amazon Kinesis
- Hands-on experience with Schema Registry, schema evolution & governance
- Proven experience in ISO 20022 payments messaging (pain, pacs, camt; SWIFT MX)
- Strong background in AWS Lakehouse using S3, Glue, Athena, Iceberg, Redshift
- Experience using Amazon EventBridge for event routing/orchestration
- Solid understanding of data modeling, CQRS, event sourcing & DDD
- Strong AWS experience: Lambda, Step Functions, IAM, KMS
- Excellent communication and stakeholder management skills