Cloud Composer Jobs in London

20 of 20 Cloud Composer Jobs in London

Senior GCP Data Engineer

London, England, United Kingdom
Stack
Development: Design, develop, and maintain robust and scalable backend systems and APIs
Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load into Google Cloud environments
Data Transformation: Implement data transformation processes, including cleansing, normalization, and aggregation, to ensure data quality and consistency
Data Modeling: Develop and maintain data models and schemas to … support efficient data storage and retrieval on Google Cloud platforms
Data Integration: Integrate data from multiple sources (on-prem and cloud-based) using Cloud Composer or other tools
Data Lakes: Build data lakes using Google Cloud services such as BigQuery
Performance Optimization: Optimize data pipelines and queries for improved performance
…
Experience
Experience with data analytics and Big Data technologies
Knowledge of cloud security best practices and compliance standards
Experience with agile development methodologies
GCP Certifications (e.g., Google Cloud Certified Professional Cloud Developer)
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information Technology
Industries: Software Development
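The transformation stage described above (cleansing, normalization, aggregation) can be sketched in plain Python. The records, field names, and rules below are hypothetical illustrations, not taken from any listing:

```python
from collections import defaultdict

# Hypothetical raw records, as they might land from an ingestion pipeline.
raw = [
    {"region": " London ", "revenue": "1200.50"},
    {"region": "london", "revenue": "800"},
    {"region": "Leeds", "revenue": None},  # dirty row: missing value
]

def cleanse(rows):
    # Drop rows with missing revenue.
    return [r for r in rows if r["revenue"] is not None]

def normalize(rows):
    # Trim and lowercase keys so ' London ' and 'london' aggregate together.
    return [
        {"region": r["region"].strip().lower(), "revenue": float(r["revenue"])}
        for r in rows
    ]

def aggregate(rows):
    # Sum revenue per region.
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["revenue"]
    return dict(totals)

result = aggregate(normalize(cleanse(raw)))
```

In a real GCP pipeline each stage would typically be a Dataflow transform or a BigQuery SQL step orchestrated by Cloud Composer; the chaining of cleanse, normalize, and aggregate is the same idea.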

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Noir
Engineer, you’ll be responsible for building and maintaining scalable, efficient, and reliable data pipelines. You’ll work across a modern tech stack with a strong focus on Google Cloud Platform (GCP) and collaborate with various teams to ensure data flows securely and accurately throughout the organisation.
Key Responsibilities
Design, build, and maintain robust data pipelines.
Work with … Python and SQL for data processing, transformation, and analysis.
Leverage a wide range of GCP services, including: Cloud Composer (Apache Airflow), BigQuery, Dataflow, Pub/Sub, Cloud Functions, IAM.
Design and implement data models and ETL processes.
Apply infrastructure-as-code practices using tools like Terraform.
Ensure data quality and compliance with governance standards.
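Most of these roles centre on Cloud Composer, Google's managed Apache Airflow service. A minimal sketch of the kind of DAG definition such work involves, assuming the apache-airflow package is installed; the DAG id, schedule, and task commands are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A minimal daily extract -> transform -> load pipeline definition.
# In Cloud Composer this file would be dropped into the environment's
# DAGs bucket; task commands here are placeholders.
with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    transform = BashOperator(task_id="transform", bash_command="echo transforming")
    load = BashOperator(task_id="load", bash_command="echo loading")

    # Declare task ordering: transform waits on extract, load on transform.
    extract >> transform >> load
```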

Data Engineer - Leading Fashion Company - London

London, England, United Kingdom
Hybrid / WFH Options
Noir
Engineer, you'll be responsible for building and maintaining scalable, efficient, and reliable data pipelines. You'll work across a modern tech stack with a strong focus on Google Cloud Platform (GCP) and collaborate with various teams to ensure data flows securely and accurately throughout the organisation.
Key Responsibilities
Design, build, and maintain robust data pipelines.
Work with … Python and SQL for data processing, transformation, and analysis.
Leverage a wide range of GCP services, including: Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, IAM.
Design and implement data models and ETL processes.
Apply infrastructure-as-code practices using tools like Terraform.
Ensure data quality and …

GCP Cloud Architect

London, UK
Infoplus Technologies UK Limited
Role: Cloud Data Architect
Location: London, UK
Duration: Contract
Job Description:
Key Responsibilities
Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives.
Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Define and implement real-time and batch processing pipelines for complex use …
… degree in Computer Science, Data Engineering, or related technical field.
12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs.
5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer.
Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
Scala …

GCP Cloud Architect

City of London, London, United Kingdom
Infoplus Technologies UK Limited
Role: Cloud Data Architect
Location: London, UK
Duration: Contract
Job Description:
Key Responsibilities
Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives.
Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Define and implement real-time and batch processing pipelines for complex use …
… degree in Computer Science, Data Engineering, or related technical field.
12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs.
5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer.
Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
Scala …

Cloud Architect

London Area, United Kingdom
iXceed Solutions
Sr Cloud Data Architect | UK
Location: United Kingdom (Work from Office – Onsite Presence is required)
Language Requirement: Fluent in English (Spoken & Written)
Key Responsibilities
Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives.
Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Define …
… degree in Computer Science, Data Engineering, or related technical field.
12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs.
5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer.
Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
Scala …

Cloud Architect

City of London, London, United Kingdom
iXceed Solutions
Sr Cloud Data Architect | UK
Location: United Kingdom (Work from Office – Onsite Presence is required)
Language Requirement: Fluent in English (Spoken & Written)
Key Responsibilities
Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives.
Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Define …
… degree in Computer Science, Data Engineering, or related technical field.
12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs.
5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer.
Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
Scala …

Cloud Architect

South East London, England, United Kingdom
iXceed Solutions
Sr Cloud Data Architect | UK
Location: United Kingdom (Work from Office – Onsite Presence is required)
Language Requirement: Fluent in English (Spoken & Written)
Key Responsibilities
Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives.
Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Define …
… degree in Computer Science, Data Engineering, or related technical field.
12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs.
5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer.
Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
Scala …

Senior DevOps Engineer - London Office, Oxford St

London, England, United Kingdom
Hybrid / WFH Options
Biprocsi Ltd
Remote-first, with 2 days at the office per month (Oxford Street, London)
Overview
We are seeking a highly experienced DevOps Engineer with a strong background in Google Cloud Platform (GCP) and a proven track record in delivering complex data analytics projects for clients. In this full-time, permanent role, you will be responsible for designing, implementing, and … deployment processes that drive successful client engagements. You will work as part of a consultancy team, ensuring that each client engagement benefits from a robust, scalable, and secure cloud environment.
Responsibilities
Design and implement scalable, reliable GCP infrastructures tailored to each client's unique project requirements, ensuring high performance, availability, and security.
Work closely with client stakeholders, full … to ensure successful project delivery, adhering to client timelines and quality standards.
Implement and manage real-time and batch data processing frameworks (e.g., Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs.
Build and maintain robust monitoring, logging, and alerting systems for client projects, ensuring system health and performance are continuously optimised and cost-efficient.

Senior Data Engineer

London, United Kingdom
Hybrid / WFH Options
CreateFuture
you'll be doing:
Designing, building and maintaining robust, cloud-native data pipelines
Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow
Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake
Implementing Lakehouse architectures using tools like Databricks or Snowflake
Collaborating closely with engineers …
… talk to you if:
You've got solid experience working with Python, SQL, Spark, and data pipeline tools such as dbt or Airflow
You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer)
You have a good understanding of data modelling, data …
Employment Type: Permanent
Salary: GBP Annual

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
SODA
sources that build our AI/ML user experience and help run our client robustly and at scale.
Responsibilities:
Build and maintain scalable data pipelines for BigQuery using Cloud Composer and Airflow.
Provide observability and monitoring using Monte Carlo and Looker, as well as operational tools such as New Relic and Splunk, driving reliability, data quality, data …

Backend Engineer

London, England, United Kingdom
BENTLEY SYSTEMS, INC
The team at Bentley Systems is looking for a Backend Engineer. You will:
Develop and support AI pipelines with computer vision ML-derived geospatial assets, leveraging Kubernetes cloud services in GCP and Azure.
Build massive data ingestion pipelines to ensure a continuous and reliable flow of image/video/GIS data into cloud-based datastores.
… FastAPI) to enable seamless access to data for internal teams and external consumers, ensuring high availability and security.
Workflow Orchestration: Implement and manage data workflows using Apache Airflow/Google Cloud Composer/Azure Data Factory to automate, schedule, and monitor data pipelines, ensuring efficient data processing and timely delivery.
Data Storage and Management: Architect and manage scalable and efficient cloud data storage solutions. Experience with open-source and cloud-native databases and data stores such as Postgres, Bigtable, and Google Cloud Storage.
Data Integration: Work closely with data scientists, operational analysts, and other stakeholders to integrate data from various sources, including video/GIS, ensuring data consistency, quality, and reliability.
Collaboration …

Software Engineer - C++

London, England, United Kingdom
Quality Control Specialist - Pest Control
Preferred Qualifications
* Designing and implementing real-time pipelines.
* Designing and implementing data pipelines for CV/ML systems.
* Experience with workflow management engines (e.g., Airflow, Luigi, Prefect, Dagster, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M).
* Experience with data quality and validation.
* Experience querying massive datasets using Spark, Presto, Hive, Impala …

Technical Lead - Google Cloud projects

London, England, United Kingdom
Hybrid / WFH Options
Third Republic
Technical Lead - Google Cloud projects, London
Client: Third Republic
Location: London, United Kingdom
Job Category: Other
EU work permit required: Yes
Job Reference: bf8971f95d7c
Job Views: 5
Posted: 02.06.2025
Expiry Date: 17.07.2025
Job Description:
Position: Can offer fully Remote or London … to 30%)
Company: Leading Consultancy
Job Overview: As a GCP Technical Lead, you’ll play a pivotal role as a subject matter expert, driving innovative solutions on the Google Cloud Platform. You’ll guide project outcomes, ensuring the successful delivery of key objectives while taking ownership of the technical direction. You’ll leverage your architectural expertise, confidently leading …
… in this role, you’ll need a strong knowledge of the following, with deep expertise in one or more categories:
VPC Networks and Service Perimeters
Data Solutions: BigQuery, Cloud SQL, Cloud Spanner, Firestore
Data Analytics Services: Dataproc, Dataflow, Data Fusion, Cloud Composer
Security: Cloud IAM, PAM, VPC Service Controls, Security …

GCP Data Engineer (Java, Spark, ETL)

City of London, England, United Kingdom
JR United Kingdom
Posted: 16.06.2025
Expiry Date: 31.07.2025
Job Description:
Proficiency in programming languages such as Python, PySpark, and Java
SparkSQL
GCP BigQuery
Version control tools (Git, GitHub), automated deployment tools
Google Cloud Platform services, Pub/Sub, BigQuery Streaming, and related technologies
Deep understanding of real-time data processing and event-driven architectures
Familiarity with data orchestration tools like Google Cloud Platform Cloud Composer
Google Cloud Platform certification(s) is a strong advantage
Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming
6 months initial, likely long-term extensions
This advert was posted by Staffworx Limited …
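The "event-driven architectures" these requirements refer to come down to decoupling producers from consumers via named topics, which Cloud Pub/Sub provides as a managed service. A minimal in-memory sketch of the pattern; the topic name and message contents are hypothetical:

```python
from collections import defaultdict

class Broker:
    """Tiny in-process publish/subscribe dispatcher (illustration only)."""

    def __init__(self):
        # topic name -> list of subscribed handler callables
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        # Deliver the message to every handler subscribed to the topic;
        # producers never reference consumers directly.
        for handler in self._subs[topic]:
            handler(message)

broker = Broker()
received = []
broker.subscribe("orders", received.append)      # downstream consumer
broker.publish("orders", {"id": 1, "amount": 42})  # upstream producer
```

With Cloud Pub/Sub the broker is the managed service, delivery is asynchronous and at-least-once, and consumers (e.g. a Dataflow streaming job) pull from subscriptions rather than registering callbacks in-process.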

GCP Data Engineer (Java, Spark, ETL)

London, UK
Staffworx
Future Talent Pool - GCP Data Engineer, London, hybrid role - digital Google Cloud transformation programme
Proficiency in programming languages such as Python, PySpark, and Java
Develop ETL processes for data ingestion & preparation
SparkSQL
Cloud Run, Dataflow, Cloud Storage
GCP BigQuery
Google Cloud Platform Data Studio
Unix/Linux Platform
Version control tools (Git, GitHub), automated deployment tools
Google Cloud Platform services, Pub/Sub, BigQuery Streaming, and related technologies.
Deep understanding of real-time data processing and event-driven architectures.
Familiarity with data orchestration tools like Google Cloud Platform Cloud Composer.
Google Cloud Platform certification(s) is a strong advantage.
Develop, implement, and optimize real-time data processing workflows using Google Cloud …

GCP Data Engineer (Java, Spark, ETL)

City of London, Greater London, UK
Staffworx
Future Talent Pool - GCP Data Engineer, London, hybrid role - digital Google Cloud transformation programme
Proficiency in programming languages such as Python, PySpark, and Java
Develop ETL processes for data ingestion & preparation
SparkSQL
Cloud Run, Dataflow, Cloud Storage
GCP BigQuery
Google Cloud Platform Data Studio
Unix/Linux Platform
Version control tools (Git, GitHub), automated deployment tools
Google Cloud Platform services, Pub/Sub, BigQuery Streaming, and related technologies.
Deep understanding of real-time data processing and event-driven architectures.
Familiarity with data orchestration tools like Google Cloud Platform Cloud Composer.
Google Cloud Platform certification(s) is a strong advantage.
Develop, implement, and optimize real-time data processing workflows using Google Cloud …

GCP Data Engineer (Java, Spark, ETL)

London, United Kingdom
Hybrid / WFH Options
Staffworx Limited
Future Talent Pool - GCP Data Engineer, London, hybrid role; new workstreams on a digital Google Cloud transformation programme
Proficiency in programming languages such as Python and Java
Programming languages: PySpark & Java
Develop ETL processes for data ingestion & preparation
SparkSQL
Cloud Run, Dataflow, Cloud Storage
GCP BigQuery
Google Cloud Platform Data Studio
Unix/Linux Platform
Version control tools (Git, GitHub), automated deployment tools
Google Cloud Platform services, Pub/Sub, BigQuery Streaming, and related technologies.
Deep understanding of real-time data processing and event-driven architectures.
Familiarity with data orchestration tools like Google Cloud Platform Cloud Composer.
Google Cloud Platform certification(s) is a strong advantage.
Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming.
6 months initial, likely long-term extensions
This advert was posted by Staffworx Limited - a UK based recruitment consultancy supporting the global E-commerce, software & consulting sectors. Services advertised by Staffworx are those of an …
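The real-time workflows described in these listings typically aggregate an unbounded event stream over fixed time windows, which Dataflow (Apache Beam) provides as built-in windowing. A plain-Python sketch of a tumbling window; the 60-second window size and event timestamps are illustrative:

```python
WINDOW_SECONDS = 60  # illustrative tumbling-window size

def window_counts(event_timestamps):
    """Count events per tumbling window, keyed by the window's start time.

    Each event falls into exactly one window: the one starting at
    (timestamp // WINDOW_SECONDS) * WINDOW_SECONDS.
    """
    counts = {}
    for ts in event_timestamps:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[window_start] = counts.get(window_start, 0) + 1
    return counts

# Events at t=3s and t=59s share the [0, 60) window; t=61s and t=130s
# land in the [60, 120) and [120, 180) windows respectively.
counts = window_counts([3, 59, 61, 130])
```

A streaming engine adds what this sketch omits: watermarks to decide when a window is complete despite late or out-of-order events, and fault-tolerant state, which is why managed services like Dataflow are used rather than hand-rolled loops.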
Employment Type: Contract, Work From Home

GCP Data Engineer (Java, Spark, ETL)

London, England, United Kingdom
Hybrid / WFH Options
Highnic
for the GCP Data Engineer (Java, Spark, ETL) role at Good Chemical Science & Technology Co. Ltd.
Responsibilities
Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming.
Design and develop ETL processes for data ingestion and preparation.
Work with GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Run.
Utilize programming languages such as Python, Java, and PySpark.
Use version control tools (Git, GitHub) and automated deployment tools.
Apply knowledge of data orchestration tools like Google Cloud Platform Cloud Composer.
Possibly obtain and leverage Google Cloud Platform certifications.
Qualifications
Proficiency in programming … data processing.
Understanding of event-driven architectures.
Familiarity with Unix/Linux platforms.
Deep understanding of real-time data processing and event-driven architectures.
Strong knowledge of Google Cloud Platform services and tools.
Google Cloud Platform certification(s) is a strong advantage.
Additional Information
Employment type: Full-time
Seniority level: Entry level
Job function: Information Technology …

GCP Data Engineer (Java, Spark, ETL)

London, England, United Kingdom
JR United Kingdom
Date: 15.07.2025
Job Description:
Proficiency in programming languages such as Python, PySpark, and Java
SparkSQL
GCP BigQuery
Version control tools (Git, GitHub), automated deployment tools
Google Cloud Platform services, Pub/Sub, BigQuery Streaming, and related technologies.
Deep understanding of real-time data processing and event-driven architectures.
Familiarity with data orchestration tools like Google Cloud Platform Cloud Composer.
Google Cloud Platform certification(s) is a strong advantage.
Develop, implement, and optimize real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming.
6 months initial, likely long-term extensions
This advert was posted by Staffworx Limited - a UK based …
Cloud Composer salary benchmarks, London:
25th Percentile: £82,500
Median: £85,000
75th Percentile: £87,500