Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
Senior Data Engineer (GCP/Kafka) Apply locations: Bristol Harbourside, London 25 Gresham Street Time type: Full time Posted on: Yesterday Time left to apply: End Date: May 12, 2025 (25 days left to apply) Job requisition id: 111909 End Date: Sunday 11 May 2025 Salary Range: £68,202 - £75,780 Flexible Working Options … real-time data applications. Spanning the full data lifecycle and using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server), you’ll get to build capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes, etc.). Cloud: Experience with GCP, AWS, or Azure. Good understanding of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP “Professional Data Engineer” …
London, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
You must have a legal right to work in the UK without requiring sponsorship to be considered for this position. ABOUT THIS OPPORTUNITY A great opportunity has arisen for a Data Engineer to work within the Personalised Experiences and Communications Platform, joining product engineering cross-functional teams. As a Data Engineer, your responsibilities … real-time data applications. Spanning the full data lifecycle and using a mix of modern and traditional data platforms (e.g., Hadoop, Kafka, GCP, Azure, Teradata, SQL Server), you’ll get to build capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes, etc.). Cloud: Experience with GCP, AWS, or Azure. Good understanding of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP “Professional Data Engineer” …
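Since both Lloyds listings centre on building real-time data applications with Kafka alongside GCP, here is a minimal, hedged sketch (not taken from the posting) of what the consuming side of such an application can look like in Python with the kafka-python client. The topic name, broker address, consumer group, and event fields are illustrative assumptions only.

```python
# Minimal sketch of a real-time consumer, assuming a "customer-events" topic on a
# local broker; all names are illustrative, not taken from the job listing.
import json

from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "customer-events",
    bootstrap_servers=["localhost:9092"],
    group_id="personalisation-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# Each message becomes a small dict; downstream this might feed a feature store,
# a BigQuery streaming insert, or a real-time alerting rule.
for message in consumer:
    event = message.value
    if event.get("type") == "page_view":
        print(f"user={event.get('user_id')} viewed {event.get('page')}")
```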
City of London, England, United Kingdom Hybrid / WFH Options
Anson McCade
GCP Data Engineer Location: London (remote-first) Salary: £80,000 – £130,000 depending on experience + 10% bonus We’re looking for a highly skilled and innovation-focused GCP Data Engineer to join our AI engineering team. This is a remote-first role (with a London-based HQ) offering the … supporting workflows across LLMs, autonomous agents, semantic search, RAG pipelines, and memory-augmented reasoning systems. Key Responsibilities: Design and build scalable, secure data pipelines using Google Cloud Platform (GCP) services including BigQuery, Dataflow, Cloud Functions, Pub/Sub, and Vertex AI. Support AI engineers by managing structured and unstructured data ingestion, embedding pipelines, and vector … dbt, Airflow, Terraform, Docker, GitHub Actions AI Frameworks: LangChain, LangGraph, LangFlow, CrewAI, OpenAI APIs What We’re Looking For: Strong experience building and maintaining data systems on GCP Direct experience working on Google projects Experience with Agentic AI Proficiency in Python and SQL Familiarity with vector databases, embedding models, and semantic search techniques A background working alongside …
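As a concrete illustration of the first responsibility (scalable, secure pipelines on BigQuery, Dataflow, and Pub/Sub), the sketch below shows a minimal streaming Dataflow job written with the Apache Beam Python SDK: it reads JSON events from a Pub/Sub topic and appends them to a BigQuery table. Project, topic, table, and schema names are assumptions for illustration, not details from the listing.

```python
# Minimal sketch: streaming Beam/Dataflow pipeline, Pub/Sub -> BigQuery.
# All resource names below are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)  # Pub/Sub sources require streaming mode

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Run locally this uses the DirectRunner; submitting to Dataflow is a matter of passing the usual --runner=DataflowRunner, --project, --region, and staging options.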
Manchester, England, United Kingdom Hybrid / WFH Options
CenterXchange Inc
We're seeking a Lead Data Engineer, experienced in GCP, to join our Technology team here at N Brown Group! This role is a balance of hands-on data engineering alongside technical leadership and coaching, working within an agile operating environment. What will you do as a Lead Data Engineer at N Brown? Lead a team of engineers in creating, maintaining, and extending our analytics platform Data ETL - Design patterns for ingesting, transforming, and exposing data Drive adoption of strong CI/CD practices to reduce deployment risks Coach your team in best practices and coding standards Develop your team's software development capabilities … with the Google Cloud Platform stack (BigQuery, Composer, Dataplex, Dataflow, Cloud Functions, Cloud Run, Pub/Sub, GCS, Vertex AI, GKE) or similar cloud platforms Familiarity with open-source data-stack tools (Airflow, dbt, Kafka, Great Expectations, etc.) Appreciation of the modern cloud data stack, headless BI, analytics engineering, Data Mesh, and Lake House Although not …
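To illustrate the "Data ETL - design patterns for ingesting, transforming, and exposing data" point with the GCP stack named above, here is a hedged sketch of a small Composer/Airflow DAG: load a daily GCS extract into a raw BigQuery table, then materialise a curated table with a query. Bucket, project, dataset, and table names are placeholders rather than N Brown specifics.

```python
# Hedged sketch of a daily ingest-then-transform DAG on Composer/Airflow.
# All resource names are assumptions for illustration.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingest: land the day's JSON extract from GCS into a raw table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",
        source_objects=["orders/{{ ds }}/*.json"],
        destination_project_dataset_table="example-project.raw.orders",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform/expose: materialise a curated table from the raw layer.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": "SELECT * FROM `example-project.raw.orders` WHERE status = 'COMPLETE'",
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "curated",
                    "tableId": "orders",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```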
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
for designing, building, and modernizing mission-critical systems that power some of the world's most vital operations. As part of their continued growth, they are seeking an experienced GCP Data Engineer to join their collaborative, cross-functional team in London. The Role – GCP Data Engineer As a GCP … Engineer, you will play a pivotal role in shaping data platforms across a range of cloud environments, with a strong focus on Google Cloud Platform (GCP). You’ll be involved in full-lifecycle data projects – from ingestion and transformation through to analytics and visualization – all while collaborating closely with data … work on multi-client, multi-cloud environments and drive innovation across complex data ecosystems. Key Responsibilities Design and implement scalable, high-performance data platforms within GCP Develop and manage ETL pipelines, ensuring quality and consistency across the data lifecycle Collaborate with cross-functional teams to integrate data flows across multiple sources …
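For a sense of what "develop and manage ETL pipelines, ensuring quality and consistency across the data lifecycle" can look like on GCP, the sketch below uses the google-cloud-bigquery client to load a CSV extract from Cloud Storage and run a simple row-count check before downstream steps are allowed to proceed. All resource names are hypothetical.

```python
# Illustrative sketch only (names assumed): one ingest-and-check step of an ETL
# pipeline built on the BigQuery Python client.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Load: append today's extract from GCS into a raw table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(
    "gs://example-landing/customers/2025-06-01.csv",
    "example-project.raw.customers",
    job_config=job_config,
)
load_job.result()  # block until the load job completes

# Quality check: fail fast if the load produced no rows.
rows = client.query(
    "SELECT COUNT(*) AS n FROM `example-project.raw.customers`"
).result()
count = next(iter(rows)).n
if count == 0:
    raise ValueError("raw.customers is empty after load, aborting downstream steps")
print(f"Loaded table now holds {count} rows")
```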
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Data Engineer (GCP), London (City of London) Client: Anson McCade Location: London (City of London), United Kingdom Job Category: Other - EU work permit required: Yes Posted: 16.06.2025 Expiry Date: 31.07.2025 Job Description: Location: London (Hybrid … for designing, building, and modernizing mission-critical systems that power some of the world's most vital operations. As part of their continued growth, they are seeking an experienced GCP Data Engineer to join their collaborative, cross-functional team in London. The Role – GCP Data Engineer As a GCP … Engineer, you will play a pivotal role in shaping data platforms across a range of cloud environments, with a strong focus on Google Cloud Platform (GCP). You’ll be involved in full-lifecycle data projects – from ingestion and transformation through to analytics and visualization – all while collaborating closely with data …
products on ODE, Operations of the data products on ODE Activity description and concrete tasks: Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP) including network architectures (Shared VPC, Hub-and-Spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and Load Balancing. Data Processing & Transformation: Utilize … Hadoop cluster with Hive for querying data, and PySpark for data transformations. Implement job orchestration using Airflow. Core GCP Services Management: Work extensively with services like Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform. Application Implementation: Develop and implement Python applications for various GCP services. CI/… Testing (TEST), and Production (PROD) environments. AI Solutions: Implement AI solutions using Google's Vertex AI for building and deploying machine learning models. Certification Desired: Must be a certified GCP Cloud Architect or Data Engineer. Qualifications Skills Required: Strong proficiency in Google Cloud Platform (GCP) Expertise in Terraform for infrastructure management Skilled in Python for application …
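The "Hive for querying data, and PySpark for data transformations" task might, for example, look like the sketch below: read a Hive-managed table, derive a cleaned daily aggregate, and write it back partitioned by date, ready for Airflow to schedule. Table and column names are assumptions for illustration.

```python
# Hedged sketch of a PySpark transformation over Hive-managed tables.
# Table and column names are illustrative, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-daily-aggregate")
    .enableHiveSupport()
    .getOrCreate()
)

orders = spark.table("raw.orders")  # Hive-managed source table

# Clean and aggregate: completed orders rolled up per customer per day.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "customer_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write back as a partitioned, queryable table for downstream consumers.
(
    daily_totals.write
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("curated.daily_order_totals")
)
```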
London, England, United Kingdom Hybrid / WFH Options
Highnic
Join to apply for the GCP Data Engineer (Java, Spark, ETL) role at Good Chemical Science & Technology Co. Ltd. Responsibilities Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming. Design and develop ETL processes for data ingestion … and preparation. Work with GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Run. Utilize programming languages such as Python, Java, and PySpark. Use version control tools (Git, GitHub) and automated deployment tools. Apply knowledge of data orchestration tools like Google Cloud Composer. Possibly obtain and leverage Google Cloud Platform certifications. Qualifications … Proficiency in programming languages such as Python and Java. Experience with SparkSQL, GCP BigQuery, and real-time data processing. Understanding of event-driven architectures. Familiarity with Unix/Linux platforms. Deep understanding of real-time data processing and event-driven architectures. Strong knowledge of Google Cloud Platform services and tools. Google Cloud Platform certification(s) …
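To make the real-time, event-driven requirement concrete, here is a minimal sketch of one common GCP pattern this listing implies: a Pub/Sub-triggered Cloud Function (1st gen, Python runtime) that decodes an incoming event and streams it into BigQuery. Function, topic, and table names are assumed, and error handling is reduced to the essentials.

```python
# Minimal sketch of an event-driven ingest step; all names are illustrative.
import base64
import json

from google.cloud import bigquery

bq = bigquery.Client()
TABLE_ID = "example-project.analytics.events"  # assumed destination table


def handle_event(event, context):
    """Background Cloud Function entry point for a Pub/Sub trigger."""
    # Pub/Sub delivers the message payload base64-encoded in event["data"].
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    row = {
        "event_id": payload.get("event_id"),
        "user_id": payload.get("user_id"),
        "event_type": payload.get("type"),
        "ts": payload.get("timestamp"),
    }

    errors = bq.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        # Raising makes Pub/Sub redeliver the message for another attempt.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```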