Senior Data Engineer – Google Cloud Platform (GCP)
Role Overview
We are seeking a highly skilled Senior Data Engineer – Google Cloud Platform (GCP) to design, build, and optimise scalable, secure, and high-performance data pipelines and analytics platforms on Google Cloud. This is a hands-on engineering role requiring strong expertise in cloud-native data engineering, distributed processing, analytics platforms, and modern CI/CD practices.
You will partner closely with product teams, analysts, platform engineers, and architects to translate business requirements into reliable, production-grade data solutions—while driving best practices in automation, data quality, observability, performance tuning, and cost optimisation.
Are you a hands-on Data Engineer who loves building scalable, secure, high-performance data platforms on Google Cloud? We’re looking for a Senior Data Engineer to design and optimise batch and streaming pipelines, enable analytics, and drive best practices across data engineering and CI/CD.
What you’ll do
- Build/operate pipelines using Dataflow (Apache Beam), Dataproc (Spark), Pub/Sub
- Model and optimise BigQuery (partitioning, clustering, cost & performance)
- Deliver ELT with dbt (models, incremental loads, testing, documentation/lineage)
- Implement CI/CD for data workloads + Terraform IaC
- Improve data quality, monitoring, alerting, reliability and operational runbooks
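To give a flavour of the BigQuery modelling work above, a date-partitioned, clustered table might look like the following sketch (dataset, table, and column names are hypothetical, not from this role):

```sql
-- Hypothetical example: a date-partitioned, clustered events table.
-- Partitioning on event_date limits scans (and cost) to the dates queried;
-- clustering on customer_id co-locates rows for common filter columns.
CREATE TABLE analytics.events (
  event_id    STRING,
  customer_id STRING,
  event_date  DATE,
  payload     JSON
)
PARTITION BY event_date
CLUSTER BY customer_id
OPTIONS (partition_expiration_days = 365);
```

Queries that filter on the partitioning column, e.g. `WHERE event_date >= '2024-01-01'`, prune unneeded partitions, which is the main lever for both performance and cost in this kind of design.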
Must-have skills
✅ GCP: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud Composer (Airflow)
✅ Strong SQL + Python (Spark/Java a plus)
✅ dbt for BigQuery + testing/data quality
✅ CI/CD (Jenkins/GitHub Actions/GitLab CI) + Terraform
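As a sketch of the dbt-on-BigQuery skills listed above, an incremental model with partitioning, clustering, and an incremental filter might look like this (model, source, and column names are hypothetical):

```sql
-- models/staging/stg_events.sql (hypothetical dbt model)
-- Incremental materialisation: on incremental runs, only rows newer than
-- the current max(event_ts) in the target table are processed.
{{
  config(
    materialized = 'incremental',
    partition_by = {'field': 'event_date', 'data_type': 'date'},
    cluster_by   = ['customer_id']
  )
}}

select
  event_id,
  customer_id,
  date(event_ts) as event_date,
  event_ts
from {{ source('raw', 'events') }}

{% if is_incremental() %}
  -- This filter is applied only on incremental runs, not full refreshes.
  where event_ts > (select max(event_ts) from {{ this }})
{% endif %}
```

Pairing a model like this with dbt tests (`unique`, `not_null`) and documented sources covers the testing and lineage expectations in this role.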
Nice to have
- GCP certs: Professional Data Engineer / Cloud Architect
- Governance/lineage/catalog exposure, enterprise-scale platforms
- IBM DataStage (or similar ETL)