Remote Permanent GCP Data Engineer Job Vacancies

9 of 9 Remote Permanent GCP Data Engineer Jobs

Senior Data Engineer (GCP/Kafka)

Bristol, England, United Kingdom
Hybrid / WFH Options
Lloyds Bank plc
Senior Data Engineer (GCP/Kafka). Apply locations: Bristol Harbourside; London, 25 Gresham Street. Time type: full time. Posted yesterday. Time left to apply: end date 12 May 2025 (25 days left to apply). Job requisition ID: 111909. End date: Sunday 11 May 2025. Salary range: £68,202 – £75,780. Flexible working options. … real-time data applications. Spanning the full data lifecycle, and with experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server), you’ll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: good knowledge of containers (Docker, Kubernetes, etc.). Cloud: experience with GCP, AWS, or Azure; good understanding of cloud storage, networking, and resource provisioning. It would be great if you had: certification in GCP Professional Data Engineer …

Senior Data Engineer (GCP/Kafka)

London, England, United Kingdom
Hybrid / WFH Options
Lloyds Banking Group
You must have a legal right to work in the UK without requiring sponsorship to be considered for this position. About this opportunity: a great opportunity has arisen for a Data Engineer to join the product engineering cross-functional teams within the Personalised Experiences and Communications Platform. As a Data Engineer, your responsibilities … real-time data applications. Spanning the full data lifecycle, and with experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server), you’ll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: good knowledge of containers (Docker, Kubernetes, etc.). Cloud: experience with GCP, AWS, or Azure; good understanding of cloud storage, networking, and resource provisioning. It would be great if you had: certification in GCP Professional Data Engineer …

GCP Data Engineer

City of London, England, United Kingdom
Hybrid / WFH Options
Anson McCade
GCP Data Engineer. Location: London (remote-first). Salary: £80,000 – £130,000 depending on experience, plus 10% bonus. We’re looking for a highly skilled and innovation-focused GCP Data Engineer to join our AI engineering team. This is a remote-first role (with a London-based HQ) offering the … supporting workflows across LLMs, autonomous agents, semantic search, RAG pipelines, and memory-augmented reasoning systems. Key responsibilities: design and build scalable, secure data pipelines using Google Cloud Platform (GCP) services including BigQuery, Dataflow, Cloud Functions, Pub/Sub, and Vertex AI; support AI engineers by managing structured and unstructured data ingestion, embedding pipelines, and vector … dbt, Airflow, Terraform, Docker, GitHub Actions. AI frameworks: LangChain, LangGraph, LangFlow, CrewAI, OpenAI APIs. What we’re looking for: strong experience building and maintaining data systems on GCP; direct experience working on Google projects; experience with agentic AI; proficiency in Python and SQL; familiarity with vector databases, embedding models, and semantic search techniques; a background working alongside …
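The semantic-search requirement in this listing (vector databases, embedding models) reduces to ranking stored embeddings by similarity to a query embedding. A minimal sketch in plain Python, assuming toy three-dimensional vectors and invented document IDs — production systems use model-generated embeddings and a vector database rather than an in-memory dict:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, index, top_k=2):
    """Rank documents in `index` ({doc_id: embedding}) by similarity to the query."""
    scored = [(doc_id, cosine_similarity(query_vec, vec)) for doc_id, vec in index.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

# Toy "embeddings" for illustration only; real vectors have hundreds of dimensions.
index = {
    "doc_pricing": [0.9, 0.1, 0.0],
    "doc_support": [0.1, 0.9, 0.1],
    "doc_refunds": [0.8, 0.2, 0.1],
}
query = [1.0, 0.0, 0.0]  # stand-in for the embedding of a pricing question
print(semantic_search(query, index))
```

RAG pipelines mentioned in the ad use exactly this retrieval step, then feed the top-ranked documents to an LLM as context.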

GCP Data Engineer

South East London, England, United Kingdom
Hybrid / WFH Options
Anson McCade
GCP Data Engineer. Location: London (remote-first). Salary: £80,000 – £130,000 depending on experience, plus 10% bonus. We’re looking for a highly skilled and innovation-focused GCP Data Engineer to join our AI engineering team. This is a remote-first role (with a London-based HQ) offering the … supporting workflows across LLMs, autonomous agents, semantic search, RAG pipelines, and memory-augmented reasoning systems. Key responsibilities: design and build scalable, secure data pipelines using Google Cloud Platform (GCP) services including BigQuery, Dataflow, Cloud Functions, Pub/Sub, and Vertex AI; support AI engineers by managing structured and unstructured data ingestion, embedding pipelines, and vector … dbt, Airflow, Terraform, Docker, GitHub Actions. AI frameworks: LangChain, LangGraph, LangFlow, CrewAI, OpenAI APIs. What we’re looking for: strong experience building and maintaining data systems on GCP; direct experience working on Google projects; experience with agentic AI; proficiency in Python and SQL; familiarity with vector databases, embedding models, and semantic search techniques; a background working alongside …

Lead Data Engineer - GCP

Manchester, England, United Kingdom
Hybrid / WFH Options
CenterXchange Inc
We're seeking a Lead Data Engineer, experienced in GCP, to join our Technology team here at N Brown Group! This role is a balance of hands-on data engineering alongside technical leadership and coaching, working within an agile operating environment. What will you do as a Lead Data Engineer at N Brown? Lead a team of engineers in creating, maintaining, and extending our analytics platform. Data ETL: design patterns for ingesting, transforming, and exposing data. Drive adoption of strong CI/CD practices to reduce deployment risks. Coach your team in best practices and coding standards. Develop your team's software development capabilities … with the Google Cloud Platform stack (BigQuery, Composer, Dataplex, Dataflow, Cloud Functions, Cloud Run, Pub/Sub, GCS, Vertex AI, GKE) or similar cloud platforms. Familiarity with open-source data-stack tools (Airflow, dbt, Kafka, Great Expectations, etc.). Appreciation of the modern cloud data stack, headless BI, analytics engineering, Data Mesh, and Lakehouse. Although not …

Data Engineer-GCP

London, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
… for designing, building, and modernizing mission-critical systems that power some of the world's most vital operations. As part of their continued growth, they are seeking an experienced GCP Data Engineer to join their collaborative, cross-functional team in London. The Role – GCP Data Engineer: as a GCP Data Engineer, you will play a pivotal role in shaping data platforms across a range of cloud environments, with a strong focus on Google Cloud Platform (GCP). You’ll be involved in full-lifecycle data projects – from ingestion and transformation through to analytics and visualization – all while collaborating closely with data … work on multi-client, multi-cloud environments and drive innovation across complex data ecosystems. Key responsibilities: design and implement scalable, high-performance data platforms within GCP; develop and manage ETL pipelines, ensuring quality and consistency across the data lifecycle; collaborate with cross-functional teams to integrate data flows across multiple sources …
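The ETL responsibility named in this role — ingestion, transformation with quality checks, and loading — follows one generic shape regardless of platform. A minimal sketch in plain Python; the field names, currency, and in-memory "table" are hypothetical stand-ins for real sources and a warehouse such as BigQuery:

```python
def extract(rows):
    """Extract: yield raw records (stand-in for a Cloud Storage or database read)."""
    yield from rows

def transform(records):
    """Transform: normalise field names, cast amounts, and drop malformed rows."""
    for record in records:
        try:
            yield {
                "customer": record["Customer"].strip().lower(),
                "amount_gbp": round(float(record["Amount"]), 2),
            }
        except (KeyError, ValueError):
            continue  # quality gate: skip records that fail validation

def load(records):
    """Load: collect into a target table (stand-in for a warehouse write)."""
    return list(records)

raw = [
    {"Customer": " Alice ", "Amount": "19.991"},
    {"Customer": "Bob", "Amount": "not-a-number"},  # dropped by the quality gate
    {"Customer": "Carol", "Amount": "5"},
]
table = load(transform(extract(raw)))
print(table)
```

The generator-based stages mirror how real pipelines stream records instead of materialising each stage in memory.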

Data Engineer (GCP)

City of London, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Data Engineer (GCP), London (City of London). Client: Anson McCade. Location: London (City of London), United Kingdom. Job category: Other. EU work permit required: Yes. Job views: 3. Posted: 16.06.2025. Expiry date: 31.07.2025. Job description: Location: London (Hybrid … for designing, building, and modernizing mission-critical systems that power some of the world's most vital operations. As part of their continued growth, they are seeking an experienced GCP Data Engineer to join their collaborative, cross-functional team in London. The Role – GCP Data Engineer: as a GCP Data Engineer, you will play a pivotal role in shaping data platforms across a range of cloud environments, with a strong focus on Google Cloud Platform (GCP). You’ll be involved in full-lifecycle data projects – from ingestion and transformation through to analytics and visualization – all while collaborating closely with data …

Data Engineer GCP

Madrid, Spain
Hybrid / WFH Options
T-Systems Iberia
… products on ODE; operations of the data products on ODE. Activity description and concrete tasks: Infrastructure Deployment & Management: efficiently deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures (Shared VPC, hub-and-spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and load balancing. Data Processing & Transformation: utilize … Hadoop cluster with Hive for querying data, and PySpark for data transformations; implement job orchestration using Airflow. Core GCP Services Management: work extensively with services such as Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform. Application Implementation: develop and implement Python applications for various GCP services. CI/… Testing (TEST), and Production (PROD) environments. AI Solutions: implement AI solutions using Google's Vertex AI for building and deploying machine learning models. Certification desired: must be a certified GCP Cloud Architect or Data Engineer. Qualifications/skills required: strong proficiency in Google Cloud Platform (GCP); expertise in Terraform for infrastructure management; skilled in Python for application …
Employment Type: Permanent
Salary: EUR Annual
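The Airflow orchestration task in this listing boils down to running pipeline steps in dependency order. A dependency-resolution sketch using Python's standard-library graphlib — the task names are hypothetical, and Airflow layers scheduling, retries, and operators on top of this core idea:

```python
from graphlib import TopologicalSorter

# Hypothetical task graph for a daily pipeline: each key depends on the tasks
# in its set, exactly how an Airflow DAG declares upstream dependencies.
dag = {
    "transform": {"ingest"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# static_order() yields a valid execution order respecting every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Because this graph is a simple chain, the order is fully determined; with parallel branches, an orchestrator is free to run independent tasks concurrently.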

GCP Data Engineer (Java, Spark, ETL)

London, England, United Kingdom
Hybrid / WFH Options
Highnic
Join to apply for the GCP Data Engineer (Java, Spark, ETL) role at Good Chemical Science & Technology Co. Ltd. Responsibilities: develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery streaming. Design and develop ETL processes for data ingestion … and preparation. Work with GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Run. Utilize programming languages such as Python, Java, and PySpark. Use version control tools (Git, GitHub) and automated deployment tools. Apply knowledge of data orchestration tools like Google Cloud Platform Cloud Composer. Optionally obtain and leverage Google Cloud Platform certifications. Qualifications: proficiency in programming languages such as Python and Java; experience with SparkSQL, GCP BigQuery, and real-time data processing; understanding of event-driven architectures; familiarity with Unix/Linux platforms; deep understanding of real-time data processing and event-driven architectures; strong knowledge of Google Cloud Platform services and tools; Google Cloud Platform certification(s) …
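Real-time workflows on Dataflow and Pub/Sub, as described in this role, typically aggregate an event stream into fixed time windows. As an illustration with no GCP dependencies, a tumbling-window count over hypothetical (timestamp, event-type) pairs can be sketched in plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed-size windows and count per key —
    the same aggregation a streaming Dataflow/Beam job applies to a Pub/Sub feed."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {start: dict(counts) for start, counts in sorted(windows.items())}

# Hypothetical events: (seconds-since-start, event type).
events = [(3, "click"), (45, "click"), (61, "view"), (62, "click"), (130, "view")]
print(tumbling_window_counts(events))
```

A production streaming job adds what this sketch omits: watermarks for late data, triggers, and exactly-once sinks such as BigQuery streaming inserts.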
GCP Data Engineer salary benchmarks:
10th Percentile: £45,223
25th Percentile: £45,558
Median: £46,754
75th Percentile: £48,588
90th Percentile: £49,305