GCP FinOps Engineer
Job Title: GCP FinOps Engineer
Location: Newport, UK (Hybrid)
Contract Duration: 6 Months
IR35 Status: Inside IR35
Role Overview
We are seeking an experienced GCP FinOps Engineer to optimise cloud spend, performance, and operational efficiency across large-scale data analytics and containerised workloads. You will work closely with engineering, data, and product teams to embed cost-efficient architectures, enforce financial governance, and implement best practices across Google Cloud environments.
Responsibilities
- Optimise large-scale data analytics workloads via partitioning, clustering, query rewrites, storage format improvements, and lifecycle policies.
- Tune containerised microservices by recalibrating CPU/memory requests, improving autoscaling efficiency, and restructuring workload placement on cost-efficient compute.
- Redesign workflow orchestration pipelines for parallel execution and higher concurrency, offloading heavy tasks to lower-cost execution environments.
- Analyse distributed data-processing pipelines to right-size worker types, adjust scaling thresholds, and adopt low-cost compute for batch workloads.
- Reduce log-processing and storage overhead through log-level standardisation, routing rules, exclusion filters, and retention optimisation.
- Implement storage-tiering strategies based on access patterns and enforce lifecycle rules to minimise cold-data retention costs.
- Improve relational database performance through index tuning, connection optimisation, and instance right-sizing.
- Enhance horizontally scalable database performance via autoscaling policies, index improvements, and mitigation of read/write hotspots.
- Build dashboards, budgets, alerts, and guardrails to drive ongoing cost governance and financial accountability.
- Collaborate with engineering teams to embed cost-efficient architecture patterns and operational best practices.
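As a flavour of the container right-sizing work described above, a minimal sketch of a GKE Deployment fragment with requests recalibrated to observed usage (the service name and figures are hypothetical; actual values would come from monitoring or VPA recommendations):

```yaml
# Hypothetical GKE Deployment fragment: CPU/memory requests lowered to
# match observed usage, keeping memory-limit headroom for bursts.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: billing-api   # hypothetical service name
spec:
  template:
    spec:
      containers:
        - name: app
          resources:
            requests:
              cpu: 250m      # was 1000m; observed p95 usage ~180m
              memory: 512Mi  # was 2Gi; observed peak ~400Mi
            limits:
              memory: 768Mi
```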
Key Skills / Knowledge
- 5+ years of hands-on Google Cloud experience.
- Strong understanding of GCP data services, particularly BigQuery: slots, partitioning, clustering, partition pruning, and indexing.
- Expert-level Kubernetes and GKE resource tuning.
- Hands-on experience with Dataflow pipelines and worker optimisation.
- Strong Airflow/Composer knowledge (DAG design, scheduling, KubernetesPodOperator).
- Deep understanding of Cloud Logging: routing, sinks, and exclusion filters.
- Experience with Cloud Spanner autoscaling, indexing, and schema optimisation.
- Cloud SQL performance tuning and indexing.
- Ability to analyse billing data and resource consumption, and to quantify cost savings.
- Experience with GCP cost tooling: Cloud Billing reports, the Recommender API, and Billing Export to BigQuery.
- Experience building dashboards, alerts, and budget guardrails for cost governance.
- Excellent communication and stakeholder management skills.
- Ability to collaborate across engineering, data, and product teams.
- Structured problem-solving mindset: ownership-driven, proactive, and independent.
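To illustrate the billing-analysis skill above, a minimal sketch of aggregating exported billing rows by service to surface the largest cost drivers. The field names mirror the shape of the GCP Billing Export to BigQuery (service, cost, credits), but the data here is invented for illustration:

```python
from collections import defaultdict

# Invented rows in roughly the shape of the GCP Billing Export to BigQuery:
# each row carries a service name, a cost, and any credits (negative values).
rows = [
    {"service": "BigQuery", "cost": 1200.0, "credits": -100.0},
    {"service": "Kubernetes Engine", "cost": 800.0, "credits": 0.0},
    {"service": "BigQuery", "cost": 300.0, "credits": 0.0},
    {"service": "Cloud Logging", "cost": 450.0, "credits": -50.0},
]

def net_cost_by_service(rows):
    """Sum cost plus credits (credits are negative) per service,
    sorted descending so the biggest cost drivers come first."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["service"]] += r["cost"] + r["credits"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for service, total in net_cost_by_service(rows):
    print(f"{service}: {total:,.2f}")
```

In practice the same aggregation would run as SQL against the billing export dataset; the sketch only shows the analysis shape (net cost = cost + credits, ranked by spend).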