backend systems and APIs Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load into Google Cloud environments Data Transformation: Implement data transformation processes including cleansing, normalization, and aggregation to ensure data quality and consistency Data Modeling: Develop and maintain data models and schemas to support efficient data storage and retrieval on Google Cloud platforms Data Integration: Integrate data from multiple sources (on-prem and cloud-based) using Cloud Composer or other tools Data Lakes: Build data lakes using Google Cloud services such as BigQuery Performance Optimization: Optimize data … engineering with strong focus on GCP-based solutions Proficiency in GCP Data & AI services (BigQuery, DataProc, Cloud SQL, DataFlow, Pub/Sub, Cloud Data Fusion, Cloud Composer, Python, SQL) Experience designing, developing, and deploying scalable, secure GCP solutions Ability to translate business requirements into technical specifications Expertise in GCP services …
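As an illustrative companion to the ingestion duties described above, here is a minimal sketch of loading a file from Cloud Storage into BigQuery with the Python client library. The project, dataset, table and bucket names are placeholders rather than details taken from the advert.

```python
from google.cloud import bigquery

def load_csv_to_bigquery(project: str, dataset: str, table: str, gcs_uri: str) -> None:
    """Load a CSV export from Cloud Storage into a BigQuery table (append mode)."""
    client = bigquery.Client(project=project)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,               # skip the header row
        autodetect=True,                   # infer the schema from the file
        write_disposition="WRITE_APPEND",  # append to the existing table
    )
    table_id = f"{project}.{dataset}.{table}"
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job completes
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")

# Example usage (placeholder names):
# load_csv_to_bigquery("my-project", "raw_zone", "orders", "gs://my-bucket/exports/orders.csv")
```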
GCP Data Engineer - London - £75k + bonus Please note - this role will require you to attend the London-based office 2-3 days per week. To be considered for this role you must have the unrestricted right to work in the UK - this organisation cannot offer sponsorship. Are you a skilled Data Engineer … and help shape the future of their cloud data platform. Key Responsibilities: * Design, build, and maintain scalable data pipelines and ETL processes in Google Cloud Platform (GCP). * Lead and contribute to a major cloud migration project from Azure to GCP, ensuring seamless data integration and minimal disruption. * Collaborate with data … more junior members of staff. Required Skills & Experience: * Proven experience as a Data Engineer in a commercial environment. * Strong hands-on experience with Google Cloud Platform (GCP) services (e.g., BigQuery, Dataflow, Pub/Sub). * Solid understanding of Azure data services and hybrid cloud environments. * Advanced SQL skills and proficiency in Python for …
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
Senior Data Engineer (GCP/Kafka) Apply locations: Bristol Harbourside, London 25 Gresham Street Time type: Full time Posted on: Yesterday Time left to apply: End Date: May 12, 2025 (25 days left to apply) Job requisition id: 111909 End Date: Sunday 11 May 2025 Salary Range: £68,202 - £75,780 Flexible Working Options … real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you’ll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes etc). Cloud: Experience with GCP, AWS, or Azure. Good understanding of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP “Professional Data Engineer” …
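Roles like this that pair Kafka with GCP often involve bridging on-prem Kafka topics into Pub/Sub before downstream processing. The sketch below is one hedged way to do that in Python with confluent-kafka and the Pub/Sub client; the broker address, topic names and project ID are illustrative assumptions, not details from the advert.

```python
from confluent_kafka import Consumer
from google.cloud import pubsub_v1

# Placeholder configuration -- broker, topics and project are assumptions.
KAFKA_CONF = {
    "bootstrap.servers": "broker:9092",
    "group.id": "gcp-bridge",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,   # commit manually, only after a successful publish
}
PROJECT_ID = "my-project"
PUBSUB_TOPIC = "payments-events"

def run_bridge() -> None:
    consumer = Consumer(KAFKA_CONF)
    consumer.subscribe(["payments"])
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, PUBSUB_TOPIC)
    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                print(f"Kafka error: {msg.error()}")
                continue
            # Forward the raw Kafka record to Pub/Sub, tagging the source topic.
            publisher.publish(topic_path, msg.value(), source_topic=msg.topic()).result()
            consumer.commit(msg)   # commit the offset once Pub/Sub has acknowledged
    finally:
        consumer.close()
```

Committing the Kafka offset only after the publish resolves keeps the bridge at-least-once; any duplicates are then handled downstream.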
London, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
have a legal right to work in the UK without requiring sponsorship, to be considered for this position. ABOUT THIS OPPORTUNITY A great opportunity has arisen for a Data Engineer to work within the Personalised Experiences and Communications Platform to join product engineering cross-functional teams. As a Data Engineer, your responsibilities … real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g., Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you’ll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes, etc.). Cloud: Experience with GCP, AWS, or Azure. Good understanding of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP “Professional Data Engineer” …
City of London, England, United Kingdom Hybrid / WFH Options
Anson McCade
GCP Data Engineer Location: London (remote-first) Salary: £80,000 – £130,000 depending on experience + 10% bonus We’re looking for a highly skilled and innovation-focused GCP Data Engineer to join our AI engineering team. This is a remote-first role (with a London-based HQ) offering the … supporting workflows across LLMs, autonomous agents, semantic search, RAG pipelines, and memory-augmented reasoning systems. Key Responsibilities: Design and build scalable, secure data pipelines using Google Cloud Platform (GCP) services including BigQuery, Dataflow, Cloud Functions, Pub/Sub, and Vertex AI. Support AI engineers by managing structured and unstructured data ingestion, embedding pipelines, and vector … dbt, Airflow, Terraform, Docker, GitHub Actions AI Frameworks: LangChain, LangGraph, LangFlow, CrewAI, OpenAI APIs What We’re Looking For: Strong experience building and maintaining data systems on GCP Direct experience working on Google projects Experience with Agentic AI Proficiency in Python and SQL Familiarity with vector databases, embedding models, and semantic search techniques A background working alongside …
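For context on the embedding-pipeline and semantic-search work this listing mentions, a minimal sketch using the Vertex AI Python SDK might look like the following. The model name, batch size and region are assumptions made for illustration, not requirements from the role.

```python
import vertexai
from vertexai.language_models import TextEmbeddingModel

def embed_documents(project: str, location: str, texts: list[str]) -> list[list[float]]:
    """Return one embedding vector per input text, ready to load into a vector store."""
    vertexai.init(project=project, location=location)
    # Model name is an assumption; swap in whichever embedding model the platform standardises on.
    model = TextEmbeddingModel.from_pretrained("textembedding-gecko@003")
    vectors: list[list[float]] = []
    for i in range(0, len(texts), 5):  # the embeddings API caps how many texts go in one request
        batch = model.get_embeddings(texts[i : i + 5])
        vectors.extend(e.values for e in batch)
    return vectors

# Example usage (placeholder values):
# vectors = embed_documents("my-project", "europe-west2", ["policy wording...", "claims summary..."])
```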
Manchester, England, United Kingdom Hybrid / WFH Options
CenterXchange Inc
We're seeking a Lead Data Engineer, experienced in GCP, to join our Technology team here at N Brown Group! This role is a balance of hands-on data engineering alongside technical leadership and coaching, working within an agile operating environment. What will you do as a Lead Data Engineer at N Brown? Lead a team of engineers in creating, maintaining, and extending our analytics platform. Data ETL - Design patterns for ingesting, transforming, and exposing data. Drive adoption of strong CI/CD practices to reduce deployment risks. Coach your team in best practices and coding standards. Develop your team's software development capabilities … with Google Cloud Platform stack (BigQuery, Composer, Dataplex, Dataflow, Cloud Functions, Cloud Run, Pub/Sub, GCS, Vertex AI, GKE) or similar cloud platforms. Familiarity with open-source data-stack tools (Airflow, dbt, Kafka, Great Expectations, etc.). Appreciation of modern cloud data stack, headless BI, analytics engineering, Data Mesh, and Lake House. Although not …
Join to apply for the Senior Data Engineer (GCP/Kafka) role at Lloyds Banking Group This range is provided by Lloyds Banking Group. Your actual pay will be based on your skills and … scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you'll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes etc). Cloud: Experience with GCP, AWS or Azure. Good understanding of cloud storage, networking and resource provisioning. It would be great if you had... Certification in GCP "Professional Data Engineer" …
JOB TITLE: Senior Data Engineer SALARY: Bristol £70,930 - £86,690 per annum | London £82,000 - £100,200 per annum LOCATION(S): Bristol or London HOURS: Full-time - 35 hours per week WORKING PATTERN: Our work style is hybrid, which involves spending at least two days per week currently, or 40% of our time, at our … scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you'll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in adopting best … as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes etc). Cloud: Experience with GCP, AWS or Azure. Good understanding of cloud storage, networking and resource provisioning. It would be great if you had... Certification in GCP "Professional Data Engineer" …
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
for designing, building, and modernizing mission-critical systems that power some of the world's most vital operations. As part of their continued growth, they are seeking an experienced GCP Data Engineer to join their collaborative, cross-functional team in London. The Role – GCP Data Engineer As a GCP … Engineer, you will play a pivotal role in shaping data platforms across a range of cloud environments, with a strong focus on Google Cloud Platform (GCP). You’ll be involved in full-lifecycle data projects – from ingestion and transformation through to analytics and visualization – all while collaborating closely with data … work on multi-client, multi-cloud environments and drive innovation across complex data ecosystems. Key Responsibilities: Design and implement scalable, high-performance data platforms within GCP. Develop and manage ETL pipelines, ensuring quality and consistency across the data lifecycle. Collaborate with cross-functional teams to integrate data flows across multiple sources …
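As a small illustration of the "quality and consistency" side of ETL work like this, the sketch below runs a null-count gate against a BigQuery table after a load. Table and column names are placeholders, and the check assumes they come from trusted configuration rather than user input.

```python
from google.cloud import bigquery

def null_check(project: str, table: str, required_columns: list[str]) -> dict[str, int]:
    """Return the columns (with counts) that contain NULLs, as a simple post-load quality gate."""
    client = bigquery.Client(project=project)
    checks = ", ".join(f"COUNTIF({col} IS NULL) AS {col}_nulls" for col in required_columns)
    row = list(client.query(f"SELECT {checks} FROM `{table}`").result())[0]
    return {col: row[f"{col}_nulls"] for col in required_columns if row[f"{col}_nulls"] > 0}

# Example usage (placeholder names):
# failures = null_check("my-project", "my-project.raw_zone.orders", ["order_id", "customer_id"])
# if failures:
#     raise ValueError(f"Quality check failed: {failures}")
```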
Our client is seeking a Senior GCP Data Engineer to join their team in London. This is an exciting opportunity to provide top-tier data capabilities, leveraging your engineering skills and embracing the possibilities offered by cloud technology. You will be part of a purpose-led transformation company that supports businesses in making … engineering processes, data flows, and system configurations for future reference. What you bring: Experience in data engineering with a strong focus on Google Cloud Platform (GCP)-based solutions. Proficiency in the GCP platform, particularly in Data & AI services (e.g., BigQuery, DataProc, Cloud SQL). Ability to translate business requirements into technical … specifications. Experience implementing and managing GCP networking configurations. Knowledge of IaC tools like Terraform for infrastructure provisioning and management. Experience implementing security best practices to protect cloud infrastructure and data. Experience identifying and resolving performance bottlenecks. About the job: Contract Type: Permanent Workplace Type: Hybrid Experience Level: Associate Location: London Specialism: Technology & Digital Focus: Data Analysis & Business …
consulting company, for a Google Cloud AI/ML Data Engineer. This role focuses on building AI-driven marketing automation solutions using the latest Google Cloud Platform (GCP) and Google Marketing Platform (GMP) technologies. You’ll design and implement machine learning pipelines, manage data ingestion, and drive campaign optimisation for high-impact marketing projects. Why … apply? Collaborate with cross-functional teams in a fast-paced, innovative environment. Lead the development of AI/ML solutions that directly influence marketing strategy. Access to the latest GCP and MarTech tools. This is a contract role, and will require you to work 3 days a week from their London-based office. If you’re passionate about AI … you will be responsible for: Design and develop AI/ML pipelines using Vertex AI for forecasting, targeting, and creative optimisation. Lead data ingestion and processing using GCP tools (BigQuery, Cloud Functions, Cloud Run, Pub/Sub). Build and maintain scalable APIs for ML endpoints and campaign automation. Manage the full MLOps lifecycle, including CI/…
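One common shape behind the "BigQuery, Cloud Functions, Cloud Run, Pub/Sub" ingestion stack named here is a Pub/Sub-triggered Cloud Function that streams events into BigQuery. The sketch below shows that pattern; the table name and event payload are assumptions made for illustration.

```python
import base64
import json

import functions_framework
from google.cloud import bigquery

BQ_TABLE = "my-project.marketing.campaign_events"  # placeholder destination table
bq_client = bigquery.Client()

@functions_framework.cloud_event
def ingest_campaign_event(cloud_event):
    """Pub/Sub-triggered Cloud Function: decode the event and stream it into BigQuery."""
    payload = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
    row = json.loads(payload)  # assumes the publisher sends one JSON object per message
    errors = bq_client.insert_rows_json(BQ_TABLE, [row])  # streaming insert
    if errors:
        # Raising lets Pub/Sub redeliver the message, assuming retries are enabled on the trigger.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```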
Future Talent Pool - GCP Data Engineer, London, hybrid role - digital Google Cloud transformation programme … Proficiency in programming languages such as Python, PySpark and Java; develop ETL processes for data ingestion & preparation; SparkSQL; Cloud Run, Dataflow, Cloud Storage; GCP BigQuery; Google Cloud Platform Data Studio; Unix/Linux platform; version control tools (Git, GitHub), automated deployment tools; Google Cloud Platform services, Pub/Sub, BigQuery Streaming and related technologies. Deep understanding of … real-time data processing and event-driven architectures. Familiarity with data orchestration tools such as Google Cloud Platform Cloud Composer. Google Cloud Platform certification(s) is a strong advantage. Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming. 6 months initial, likely long-term extensions.
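A minimal sketch of the real-time workflow these contract adverts describe - reading from Pub/Sub and writing to BigQuery with Apache Beam, the SDK behind Dataflow - is shown below. Project, subscription, table and schema values are placeholders, and runner options are omitted.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

def run() -> None:
    options = PipelineOptions()                      # add --runner=DataflowRunner etc. when deploying
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )

if __name__ == "__main__":
    run()
```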
London, England, United Kingdom Hybrid / WFH Options
Highnic
Join to apply for the GCP Data Engineer (Java, Spark, ETL) role at Good Chemical Science & Technology Co. Ltd. Responsibilities: Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming. Design and develop ETL processes for data ingestion … and preparation. Work with GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Run. Utilize programming languages such as Python, Java, and PySpark. Use version control tools (Git, GitHub) and automated deployment tools. Apply knowledge of data orchestration tools like Google Cloud Platform Cloud Composer. Possibly obtain and leverage Google Cloud Platform certifications. Qualifications … Proficiency in programming languages such as Python and Java. Experience with SparkSQL, GCP BigQuery, and real-time data processing. Understanding of event-driven architectures. Familiarity with Unix/Linux platforms. Deep understanding of real-time data processing and event-driven architectures. Strong knowledge of Google Cloud Platform services and tools. Google Cloud Platform certification(s) …
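For the Spark-on-GCP ETL work this role describes, a PySpark job (for example on Dataproc) would typically read and write BigQuery through the spark-bigquery connector. The sketch below assumes that connector is available on the cluster and uses placeholder project, dataset and bucket names.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read the raw table via the spark-bigquery connector (assumed to be on the cluster classpath).
raw = spark.read.format("bigquery").option("table", "my-project.raw_zone.orders").load()

cleaned = (
    raw.filter(F.col("order_value").isNotNull())           # drop incomplete records
       .withColumn("order_date", F.to_date("order_ts"))    # derive a partition-friendly date
       .groupBy("order_date", "customer_id")
       .agg(F.count("*").alias("order_count"),
            F.sum("order_value").alias("total_value"))
)

(
    cleaned.write.format("bigquery")
    .option("table", "my-project.curated.daily_orders")
    .option("temporaryGcsBucket", "my-temp-bucket")         # the connector stages writes through GCS
    .mode("overwrite")
    .save()
)
```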
Terraform. Experience with CI/CD pipelines and associated tools/frameworks. Containerisation: Good knowledge of container technologies such as Docker and Kubernetes. Cloud: Experience with cloud platforms like GCP, AWS, or Azure. Good understanding of cloud storage, networking, and resource management.
GCP Data Engineer (Java, Spark, ETL), London (City of London) Client: Staffworx Location: London (City of London), United Kingdom Job Category: Other EU work permit required: Yes Posted: 16.06.2025 Expiry Date: 31.07.2025 Job Description: Proficiency in programming languages such as Python, PySpark, and … Java; SparkSQL; GCP BigQuery; version control tools (Git, GitHub), automated deployment tools; Google Cloud Platform services, Pub/Sub, BigQuery Streaming, and related technologies. Deep understanding of real-time data processing and event-driven architectures. Familiarity with data orchestration tools like Google Cloud Platform Cloud Composer. Google Cloud Platform certification(s) is a strong advantage … Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming. 6 months initial, likely long-term extensions. This advert was posted by Staffworx Limited - a UK-based recruitment consultancy supporting the global E-commerce, software & consulting sectors. Services advertised by Staffworx are those of an …
Are you a skilled Data Engineer with expertise in GCP, Java, and Spark? Ready to take on a challenging role in a dynamic environment? This could be the perfect opportunity for you. We are looking for a GCP Data Engineer to work from London three days a week. You will … and fine-tuning data applications. Explore new capabilities and components, contributing to innovation through PoC assessments. Responsibilities: Design and develop applications using Java and Spark on the GCP platform. Tune performance and optimize existing data pipelines. Conduct PoCs for assessing new capabilities and components. Debug and troubleshoot data processing and application performance issues. … Skills/Must Have: Strong expertise in the GCP platform. Experience in Java and Spark for processing unstructured data. Proficient in Composer, BigQuery, and SQL. Strong knowledge of designing, developing, and performance tuning data applications. Proven debugging and troubleshooting skills. Good to Have: Experience in GCP platform development and deployment. Familiarity with Jira, Agile, Sonar, and GitHub …
A specialist reinsurance broker is seeking a Data Engineer with Google Cloud Platform (GCP) experience to join their growing team in Central London. This is a hybrid role, requiring 2-3 days per week in the office. This is a modern, digitally native business with no legacy systems - a rare opportunity to work in a … joining a dynamic and fast-paced team that values innovation, adaptability, and continuous improvement. Key responsibilities will include: Lead and support the migration from Azure to Google Cloud Platform (GCP), including BigQuery. Collaborate with stakeholders to understand data needs and ensure smooth transition and integration. Become the internal GCP subject matter expert, with direct support from … legacy constraints. Requirements: Proven experience as a Data Engineer. Hands-on experience with cloud migrations (on-prem to cloud or cloud-to-cloud). Strong knowledge of GCP, including BigQuery. Experience in Financial Services is desirable but not essential. Exposure to AI technologies is a plus. Excellent communication and stakeholder management skills. Comfortable working in a dynamic …