HR Data Engineer (Contract)
The Role
A leading organisation in the aviation sector is undertaking a major HR data transformation programme, migrating from a legacy Oracle environment into Snowflake to create a unified, scalable HR data platform.
The core data model has already been designed. The next phase focuses on:
- Building logical data layers
- Developing scalable ETL pipelines
- Delivering a centralised HR data product
- Enabling Power BI reporting and improved organisational insight
This is a highly hands-on engineering role with significant influence over architectural decisions and delivery pace.
What You'll Be Doing
Core Responsibilities
- Designing and developing ETL pipelines within Snowflake
- Working at the logical data modelling layer
- Challenging and supporting existing data architecture
- Collaborating with architects, data engineers, and insight teams
- Driving delivery of the HR data product within tight 4-6 week timelines
- Acting as a technical sounding board for architecture decisions
- Providing guidance and support to other engineers
Key Skills & Experience
- Snowflake
- DBT
- Python (including Lambda for AWS workflows)
- AWS (data pipelines + infrastructure)
- Oracle (legacy system exposure beneficial)
- Power BI (understanding of downstream consumption)
- Experience working with HR / People data is highly desirable
- Strong understanding of HR data structures and business logic
- Ability to translate engineering work into real business impact
Project Scope
- Migration from Oracle to Snowflake
- Handling both current and historical datasets
- Supporting testing phases: CIT and UIT
Timeline
- Architectural design completing in 1-2 weeks
- Engineering build to begin immediately
- Initial HR core module delivery in 4-6 weeks
- Longer-term (up to 6 months):
- Feeding datasets into reporting
- Validation & analysis
- UAT beginning September
The End Goal
- A fully unified, centralised HR data platform
- Integrated seamlessly with Power BI
- Scalable, reliable pipelines for ongoing analytics
- Stronger insight and reporting for HR and People functions
Interview Process
Stage 1 - 30 mins: CV walkthrough and technical discussion with a Data Engineer
Stage 2 - 45 mins: competency and technical deep dive covering:
- DBT development & modelling
- Snowflake engineering
- AWS pipelines (Python / Lambda)
- Data migration approach
Onsite availability is required Wednesday/Thursday in week two.
Ideal Candidate Profile
- Strong hands-on experience with Snowflake + DBT
- Proven track record building AWS-based Python/Lambda pipelines
- Background in data migration projects
- Confident challenging architecture and taking ownership
- Experience with HR/People data strongly preferred
- Comfortable delivering at speed and under time pressure