London, England, United Kingdom Hybrid / WFH Options
Noir
Engineer, you’ll be responsible for building and maintaining scalable, efficient, and reliable data pipelines. You’ll work across a modern tech stack with a strong focus on Google Cloud Platform (GCP) and collaborate with various teams to ensure data flows securely and accurately throughout the organisation. Key Responsibilities: Design, build, and maintain robust data pipelines. Work with … Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services, including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data quality and compliance with governance standards.
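The responsibilities above pair Python and SQL for processing and transformation. A minimal sketch of that extract-transform-load pattern, using SQLite purely as a local stand-in for a warehouse such as BigQuery (the table and column names are illustrative, not from the posting):

```python
import sqlite3

def run_etl(rows):
    """Minimal extract-transform-load sketch: stage raw rows in SQL,
    transform with a query, and return the cleaned result."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)", rows)
    # Transform: aggregate per user, filtering out invalid amounts.
    cur = conn.execute(
        """SELECT user_id, SUM(amount) AS total
           FROM raw_events
           WHERE amount > 0
           GROUP BY user_id
           ORDER BY user_id"""
    )
    result = cur.fetchall()
    conn.close()
    return result

print(run_etl([("a", 10.0), ("b", -5.0), ("a", 2.5), ("b", 4.0)]))
# → [('a', 12.5), ('b', 4.0)]
```

In a real pipeline the same SQL would run inside the warehouse and an orchestrator such as Cloud Composer would schedule it; the shape of the step is the same.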
London, England, United Kingdom Hybrid / WFH Options
BI:PROCSI
Remote-first, with 2 days at the office per month (Oxford Street, London). Overview: We are seeking a highly experienced DevOps Engineer with a strong background in Google Cloud Platform (GCP) and a proven track record of delivering complex data analytics projects for clients. In this full-time, permanent role, you will be responsible for designing, implementing, and … deployment processes that drive successful client engagements. You will work as part of a consultancy team, ensuring that each client engagement benefits from a robust, scalable, and secure cloud environment. Responsibilities: Design and implement scalable, reliable GCP infrastructures tailored to each client's unique project requirements, ensuring high performance, availability, and security. Work closely with client stakeholders, full … to ensure successful project delivery, adhering to client timelines and quality standards. Implement and manage real-time and batch data processing frameworks (e.g., Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs. Build and maintain robust monitoring, logging, and alerting systems for client projects, ensuring system health and performance are continuously optimised and cost-efficient.
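The real-time and batch frameworks named above (Kafka, Spark, Dataproc) all build on windowed aggregation of event streams. A framework-free, pure-Python sketch of a tumbling (fixed, non-overlapping) window count, with illustrative event data:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key -- the core primitive behind the
    streaming aggregations that frameworks like Spark or Kafka provide."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(0, "click"), (3, "click"), (5, "view"), (12, "click")]
print(tumbling_window_counts(events, 10))
# → {0: {'click': 2, 'view': 1}, 10: {'click': 1}}
```

Production systems add late-data handling, watermarks, and distributed state, but the windowing arithmetic is the same.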
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
strong focus on automation and self-service, we make data easy and safe to use. We are a large enterprise building a cutting-edge data mesh architecture on Google Cloud Platform (GCP) to enable decentralised, self-serve data products across teams. We're crafting a golden path for development by integrating with GCP APIs and leveraging its robust … engineering projects within large, distributed systems. Strong programming skills in languages such as Python, Java, or Go, with a focus on building scalable, production-grade solutions. Experience with cloud platforms (AWS, GCP, or Azure). Hands-on knowledge of API development, automation, and Infrastructure as Code (IaC) tools (e.g., Terraform). Proven experience in containerisation and orchestration (e.g., Kubernetes) … experience of these would be useful: Experience with data mesh concepts (e.g., domain-driven ownership and data product thinking). Expertise in GCP services, including BigQuery, Data Fusion, Cloud Composer, IAM, and others. Ability to define objectives and measure metrics. Relevant certifications in software engineering, cloud platforms, or related areas. About working for us …
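Data product thinking, listed above as useful experience, typically means enforcing a schema contract wherever one domain publishes data for another. A minimal sketch under that assumption (the contract format, field names, and types are hypothetical, not from the posting):

```python
def validate_contract(record, contract):
    """Check a record against a data-product schema contract:
    every required field present and of the declared type."""
    errors = []
    for field, expected_type in contract.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors

# Hypothetical contract published by an "orders" domain.
contract = {"order_id": str, "amount": float}
print(validate_contract({"order_id": "o-1", "amount": 9.99}, contract))  # → []
print(validate_contract({"order_id": 42}, contract))
# → ['bad type for order_id: int', 'missing field: amount']
```

On GCP the same idea is usually expressed as a BigQuery table schema plus checks in the pipeline; the dictionary form here just makes the contract explicit.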
you'll be doing: Designing, building and maintaining robust, cloud-native data pipelines. Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow. Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake. Implementing Lakehouse architectures using tools like Databricks or Snowflake. Collaborating closely with engineers … talk to you if: You've got solid experience working with Python, SQL, Spark, and data pipeline tools such as dbt or Airflow. You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer). You have a good understanding of data modelling, data …
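Orchestration tools such as Airflow and dbt, mentioned above, work by resolving a dependency graph of tasks into a valid execution order. The same idea in a few lines with Python's standard-library graphlib (the task names are illustrative, not a real DAG):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the tasks it depends on,
# the way an Airflow DAG or dbt model graph does.
dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_users"},
    "extract_orders": set(),
    "extract_users": set(),
}

# Resolve dependencies into one valid linear execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)  # extracts first, then transform, then load_warehouse
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks, but dependency resolution is the core of what they do.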
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
you'll be doing: Designing, building and maintaining robust, cloud-native data pipelines. Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow. Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake. Implementing Lakehouse architectures using tools like Databricks or Snowflake. Collaborating closely with engineers … talk to you if: You’ve got solid experience working with Python, SQL, Spark, and data pipeline tools such as dbt or Airflow. You’re comfortable working across cloud platforms – especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer). You have a good understanding of data modelling, data …
Manchester, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
need: Proven experience in software engineering. Strong programming skills in languages such as Python, Java, or Go, with a focus on building scalable, production-grade solutions. Experience with cloud platforms (AWS, GCP, or Azure). Hands-on knowledge of API development, automation, and Infrastructure as Code (IaC) tools (e.g., Terraform). Experience in containerisation and orchestration (e.g., Kubernetes) … with ITIL processes and incident management. Ability to triage and prioritise incidents based on severity, ensuring timely resolution. Any experience of these would be really useful: Expertise in Google Cloud Platform (GCP) services, including BigQuery, Data Fusion, Cloud Composer, IAM, and others. Hands-on experience with Change Data Capture (CDC) architectural patterns and tools …
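Change Data Capture, listed above as useful experience, is usually log-based in production (reading a database's transaction log via dedicated tooling), but the events it emits can be illustrated with a simple snapshot diff; the row shapes below are hypothetical:

```python
def capture_changes(old_snapshot, new_snapshot):
    """Snapshot-diff sketch of Change Data Capture: compare two versions
    of a keyed table and emit insert/update/delete events."""
    events = []
    for key, row in new_snapshot.items():
        if key not in old_snapshot:
            events.append(("insert", key, row))
        elif old_snapshot[key] != row:
            events.append(("update", key, row))
    for key in old_snapshot:
        if key not in new_snapshot:
            events.append(("delete", key, None))
    return events

old = {1: {"status": "new"}, 2: {"status": "paid"}}
new = {1: {"status": "shipped"}, 3: {"status": "new"}}
print(capture_changes(old, new))
# → [('update', 1, {'status': 'shipped'}), ('insert', 3, {'status': 'new'}), ('delete', 2, None)]
```

Log-based CDC avoids the full-table comparison and captures intermediate states a periodic snapshot would miss; the downstream event stream has the same insert/update/delete shape.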
London, England, United Kingdom Hybrid / WFH Options
SODA
sources that build our AI/ML user experience and help run our client robustly and at scale. Responsibilities: Build and maintain scalable data pipelines for BigQuery using Cloud Composer and Airflow. Provide observability and monitoring using Monte Carlo and Looker, as well as operational tools such as New Relic and Splunk, driving reliability, data quality, data …
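Observability platforms like Monte Carlo automate data-quality checks of the kind sketched below; this is a hand-rolled stand-in, with the column names and threshold chosen purely for illustration:

```python
def quality_report(rows, required, max_null_rate=0.1):
    """Minimal data-quality check of the kind observability tools automate:
    flag required columns whose null rate exceeds a threshold."""
    failures = {}
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        # An empty batch is treated as fully null, i.e. always a failure.
        rate = nulls / len(rows) if rows else 1.0
        if rate > max_null_rate:
            failures[col] = round(rate, 2)
    return failures

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": None, "email": "c@x.com"},
    {"id": 4, "email": "d@x.com"},
]
print(quality_report(rows, ["id", "email"], max_null_rate=0.2))
# → {'id': 0.25, 'email': 0.25}
```

Real deployments add freshness, volume, and distribution checks, and route failures to alerting rather than returning a dictionary.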
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
aPriori Technologies
product and domain experts to build business intelligence platform tools and interfaces, leveraging your expertise to help customers extract meaningful insights from their data. Utilising dbt, BigQuery, Airflow, Cloud Composer and Looker within a multi-cloud (AWS + GCP) architecture, you will be responsible for driving cross-domain collaboration through a rapidly scalable data … candidate will have a strong technical background and proven experience managing software engineering teams, with a solid understanding of data quality and experience with one or more modern cloud data platforms. Location: Belfast (hybrid; 2–3 days per week in office). Responsibilities: Lead and manage the data engineering team through design, development and delivery of a modern data … ’s goals. Requirements: Proven ability to lead software engineering teams. Proficiency in SQL and programming languages such as Python, Java or Scala. Expertise leading teams and delivering in cloud data platforms (GCP, AWS). Familiarity with data modelling, warehousing, and database optimisation. Strong understanding of data governance, security, and compliance frameworks (GDPR, ISO 27001, etc.). Education and Experience: Bachelor …
London, England, United Kingdom Hybrid / WFH Options
Third Republic
Technical Lead – Google Cloud projects, London
Client: Third Republic
Location: London, United Kingdom
EU work permit required: Yes
Job Reference: bf8971f95d7c
Posted: 02.06.2025 | Expiry Date: 17.07.2025
Job Description: Position: Can offer fully Remote or London … to 30%). Company: Leading Consultancy. Job Overview: As a GCP Technical Lead, you’ll play a pivotal role as a subject matter expert, driving innovative solutions on the Google Cloud Platform. You’ll guide project outcomes, ensuring the successful delivery of key objectives while taking ownership of the technical direction. You’ll leverage your architectural expertise, confidently leading … in this role, you’ll need a strong knowledge of the following, with deep expertise in one or more categories: VPC Networks and Service Perimeters. Data Solutions: BigQuery, Cloud SQL, Cloud Spanner, Firestore. Data Analytics Services: Dataproc, Dataflow, Data Fusion, Cloud Composer. Security: Cloud IAM, PAM, VPC Service Controls, Security …
GCP Data Engineer (Java, Spark, ETL). Future Talent Pool: GCP Data Engineer, London, hybrid role; new workstreams on a digital Google Cloud transformation programme. Proficiency in programming languages such as Python and Java. Programming languages PySpark and Java to develop ETL processes for data ingestion and preparation. SparkSQL, Cloud Run, Dataflow, Cloud Storage, GCP BigQuery, Google Cloud Platform, Data Studio, Unix/Linux platform. Version control tools (Git, GitHub), automated deployment tools. Google Cloud Platform services: Pub/Sub, BigQuery Streaming, and related technologies. Deep understanding of real-time data processing and event-driven architectures. Familiarity with data orchestration tools such as Google Cloud Platform Cloud Composer. Google Cloud Platform certification(s) is a strong advantage. … Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming. 6 months initial, likely long-term extensions. This advert was posted by Staffworx Limited – a UK-based recruitment consultancy supporting the global e-commerce, software & consulting sectors. Services advertised by Staffworx are those of an …
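Event-driven architectures of the kind described above decouple producers from consumers through named topics. A toy in-memory broker showing the shape of the pattern (the real Pub/Sub service delivers asynchronously over the network with acknowledgements; this class and its methods are illustrative only):

```python
from collections import defaultdict

class InMemoryPubSub:
    """Toy publish/subscribe broker illustrating the event-driven pattern
    that GCP Pub/Sub provides as a managed service."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive every message on a topic."""
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to all subscribers of the topic."""
        for callback in self.subscribers[topic]:
            callback(message)

received = []
bus = InMemoryPubSub()
bus.subscribe("orders", received.append)
bus.publish("orders", {"order_id": "o-1"})
bus.publish("payments", {"ignored": True})  # no subscriber, silently dropped
print(received)  # → [{'order_id': 'o-1'}]
```

The key property is that the producer never references its consumers; in a Dataflow-plus-Pub/Sub pipeline the streaming job is just another subscriber.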
London, England, United Kingdom Hybrid / WFH Options
Highnic
for the GCP Data Engineer (Java, Spark, ETL) role at Good Chemical Science & Technology Co. Ltd. Responsibilities: Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming. Design and develop ETL processes for data ingestion and preparation. Work with GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Run. Utilize programming languages such as Python, Java, and PySpark. Use version control tools (Git, GitHub) and automated deployment tools. Apply knowledge of data orchestration tools like Google Cloud Platform Cloud Composer. Possibly obtain and leverage Google Cloud Platform certifications. Qualifications: Proficiency in programming … data processing. Understanding of event-driven architectures. Familiarity with Unix/Linux platforms. Deep understanding of real-time data processing and event-driven architectures. Strong knowledge of Google Cloud Platform services and tools. Google Cloud Platform certification(s) is a strong advantage. Additional Information: Employment type: Full-time. Seniority level: Entry level. Job function: Information Technology.