Data Engineer
Job Title: Data Engineer
Location: Hybrid or remote
Term: 3-month fixed-term contract
Remuneration: £12,500 for 3 months

About Birchgrove
Birchgrove is the only build-to-rent operator in the UK exclusively for older adults. Our mission is to enrich the lives of our neighbours and add healthy years to their lives. We operate neighbourhoods rather than care homes, placing independence, dignity and community at the heart of what we do.

We’re a forward-thinking organisation using data to improve neighbour wellbeing, operational performance and long-term decision-making.

The Opportunity
We’re looking for an experienced Data Engineer to join Birchgrove on a 3-month contract to deliver several clearly defined, high-impact data integration projects.

This is a hands-on, delivery-focused role. You’ll design, build and document reliable, production-grade ETL/ELT pipelines that integrate operational systems into our cloud data warehouse, enabling improved reporting and analytics across the business.

You’ll be joining at an exciting stage in our data journey, helping us move from early foundations to a more connected, scalable and dependable data platform.

Key Project Deliverables
During the contract, you will deliver the following priority projects:

1) Fall detection system integration
• Ingest data from a fall detection platform using APIs and webhooks
• Land and model the data in Snowflake
• Implement reliability best practices: monitoring, alerting, logging, retries, and clear documentation

2) Resident management system integration
• Extract and ingest data from our resident management system
• Design robust data models to support reporting on neighbour wellbeing and operations
• Ensure maintainable transformations and clear data definitions

3) Facilities management systems integration
• Design and build an API-based integration between two facilities management systems
• Enable joined-up reporting across maintenance, safety and operational data
• Deliver clean, consistent datasets suitable for analytics and dashboards

4) Marketing automation platform integration
• Ingest data from our marketing platform using APIs
• Land and model the data in Snowflake

These projects will directly support improved insight, faster decision-making and better outcomes for our neighbours and team.

Tools & Technology Stack
You’ll work with and help establish best practice around the following tools:
• Snowflake (cloud data warehouse)
• Fivetran (managed ingestion)
• Airbyte (custom & API-based integrations)
• dbt (transformations, testing and documentation)
• Power BI (analytics and dashboards)

We’re particularly keen to speak with candidates who are highly confident with:
• API-driven pipeline design (authentication, pagination, rate limiting, incremental loads)
• Webhook ingestion patterns and event-driven data capture
• Building reliable, well-monitored pipelines with clear documentation and ownership

About You
• Proven experience as a Data Engineer, delivering pipelines end-to-end in modern cloud stacks
• Strong hands-on skills with APIs, webhooks, and pipeline-based ETL/ELT
• Confident using Python for data integration and automation
• Comfortable implementing practical reliability patterns (e.g., idempotency, retries, monitoring, alerting)
• Strong data modelling and transformation experience (ideally with dbt)
• Able to work independently, but collaborate closely with non-technical stakeholders
• Motivated by purpose-driven work and using data to improve real lives

How to Apply
If you’re an experienced Data Engineer looking for a short-term contract where you can deliver meaningful work with real-world impact, we’d love to hear from you.

REF-226 169