Senior Data Engineer (GCP)

Job Ref: 842 | Sr. Data Engineer | GCP | Hybrid | Outside IR35

Apply via LinkedIn or email your CV to: HR@AGITCONSULTANCY.CO.UK

About the Role

We are seeking a highly experienced Senior Data Engineer to lead the design, development, and optimisation of modern, cloud-based data platforms across Azure and GCP environments.

This is a hands-on technical leadership role where you will take ownership of enterprise-scale data engineering initiatives, define architecture standards, and mentor engineers while delivering high-quality, scalable solutions.

You will play a pivotal role in shaping a strategic cloud data platform, enabling advanced analytics and data-driven decision-making across the organisation. Working within a high-performing cloud and data team, you’ll help drive innovation, introduce modern data capabilities, and ensure platforms are secure, resilient, and cost-efficient.

This is an end-to-end role with real ownership, impact, and influence on the data roadmap.

______________

Key Responsibilities

Data Architecture & Engineering Leadership

• Lead the design and implementation of scalable, secure, and robust data pipelines using ETL/ELT frameworks across GCP

• Define and promote best practices for data engineering, modular design, and orchestration

• Provide technical leadership, mentoring, and support across the data engineering team

Cloud Data Platform & Infrastructure

• Design and build enterprise-grade data solutions across Azure and GCP (Data Lake, Synapse, BigQuery, event streaming)

• Own infrastructure using Infrastructure as Code (Terraform/Bicep) and implement CI/CD pipelines

• Ensure platforms are highly available, fault-tolerant, and optimised for performance and cost

Pipeline Design & Optimisation

• Develop batch, real-time, and streaming pipelines using tools such as Airflow, ADF, dbt, and GCP-native services (Dataflow, Dataproc, Cloud Composer)

• Design and implement scalable GCP ETL pipelines using BigQuery, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage

• Monitor, troubleshoot, and optimise pipelines, storage, and query performance

Data Governance, Security & Compliance

• Embed security and privacy-by-design principles, including encryption, masking, and access controls

• Ensure compliance with GDPR and enterprise data governance standards

• Implement data lineage, classification, and audit logging

Integration & Interoperability

• Integrate with enterprise systems (ERP, APIs, SFTP, event streams)

• Build reusable ingestion frameworks for structured and semi-structured data (JSON, XML, Parquet, Avro)

Data Quality & Observability

• Implement data quality and testing frameworks (dbt, Great Expectations, or similar)

• Establish monitoring, alerting, and observability across pipelines and platforms

Collaboration & Delivery

• Work closely with engineers, analysts, and stakeholders to deliver scalable data solutions

• Contribute to agile planning, estimation, and delivery

• Document architecture, standards, and data models

______________

Essential Skills & Experience

• 7+ years’ experience in data engineering, including leading delivery in enterprise environments

• Expert-level SQL and Python for data transformation and automation

• Strong hands-on experience with GCP data services, including:

◦ BigQuery (data warehousing & ELT)

◦ Dataflow (Apache Beam for ETL processing)

◦ Pub/Sub (event streaming ingestion)

◦ Cloud Storage (data lake ingestion layers)

◦ Cloud Composer (Airflow orchestration)

• Proven experience building and optimising lakehouse architectures (Delta Lake, Databricks, Snowflake, BigQuery)

• Hands-on experience with Airflow, ADF, dbt, and data modelling (dimensional/star/snowflake)

• Experience with APIs and event streaming platforms (Kafka, Azure Event Hub, Pub/Sub)

• Strong understanding of data security, GDPR, encryption, and IAM

• Experience with CI/CD pipelines, version control, and DevOps practices

______________

Desirable Skills

• Experience in manufacturing or FMCG environments (ERP, MES, TPM)

• Familiarity with data governance/cataloguing tools (e.g., Microsoft Purview)

• Exposure to multi-cloud or hybrid architectures

• Experience with BI tools (Power BI, Tableau)

• Certifications such as:

◦ GCP Professional (Cloud Architect, Data Engineer, DevOps, Security)

◦ Azure Data Engineer (DP-203) / Azure DevOps Engineer

◦ AWS Professional certifications

Job Details

Company: AGIT
Location: City of London, London, United Kingdom
Posted