robust and scalable backend systems and APIs
Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments
Data Transformation: Implement data transformation processes, including cleansing, normalization, and aggregation, to ensure data quality and consistency
Data Modeling: Develop and maintain … data models and schemas to support efficient data storage and retrieval on Google Cloud platforms
Data Integration: Integrate data from multiple sources (on-prem and cloud-based) using Cloud Composer or other tools
Data Lakes: Build data lakes using Google Cloud … and Big Data technologies
Knowledge of cloud security best practices and compliance standards
Experience with agile development methodologies
GCP certifications (e.g., Google Cloud Certified Professional Cloud Developer)
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information
sector. In addition to a global, extensive and diverse energy data library, TGS offers specialized services such as advanced processing and analytics alongside cloud-based data applications and solutions. We are seeking a highly motivated and experienced Senior Cloud Platform Engineer to join our Research & Technology - Architecture … team. In this position you will play a pivotal role in driving and defining the technical vision and roadmap for the company's multi-cloud platform (GCP, AWS, and Azure). This position involves designing, building, and maintaining core cloud platform components, with a primary focus on … needs of the business and adheres to security and compliance standards. Required Competencies: Advanced GCP Knowledge: Advanced knowledge and hands-on experience with Google Cloud Platform (GCP) services. Deep Kubernetes Expertise: Advanced knowledge and experience with Kubernetes, specifically Google Kubernetes Engine (GKE). This includes deployment, management, scaling
team of passionate software and data specialists is dedicated to building smart, data-driven solutions that make a real difference. As a Premier Google Cloud Partner, we use the latest technology to help our customers solve challenges and unlock new opportunities. We believe that the foundation for a … feedback. Experience and qualifications: Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proficiency in Core Tools: Cloud Run, Firestore … Apigee, Cloud Composer/Apache Airflow, Pub/Sub, Dataform (including dbt), and Terraform (infrastructure as code). Familiarity with Google Cloud Services: App Engine, Endpoints, BigQuery, Cloud Storage, Cloud Dataflow, and Cloud Datastore. Programming languages: Java/Python, SQL
London, England, United Kingdom Hybrid / WFH Options
Third Republic
Technical Lead - Google Cloud projects, London. Client: Third Republic. Location: London, United Kingdom. Posted: 02.06.2025. Job Overview: As a GCP Technical Lead, you’ll play a pivotal role as a subject matter expert, driving innovative solutions on the Google Cloud Platform. You’ll guide project outcomes, ensuring the successful delivery of key objectives while taking ownership of the technical direction. You’ll leverage … need a strong knowledge of the following, with deep expertise in one or more categories: VPC Networks and Service Perimeters. Data Solutions: BigQuery, Cloud SQL, Cloud Spanner, Firestore. Data Analytics Services: Dataproc, Dataflow, Data Fusion, Cloud Composer. Security: Cloud IAM
Job Description: Proficiency in programming languages such as Python, PySpark, and Java; Spark SQL; GCP BigQuery; version control tools (Git, GitHub) and automated deployment tools; Google Cloud Platform services, Pub/Sub, BigQuery Streaming, and related technologies. Deep understanding of real-time data processing and event-driven architectures. Familiarity with data orchestration tools like Google Cloud Platform Cloud Composer. Google Cloud Platform certifications are a strong advantage. Develop, implement, and optimize real-time data processing workflows using GCP services such as Dataflow, Pub/Sub, and BigQuery Streaming. 6 months initial, likely
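The event-driven pattern this posting describes (messages in from Pub/Sub, streaming inserts out to BigQuery) can be sketched as a small micro-batching buffer. The flush callback is injected so the batching logic stays testable offline; in a real pipeline it would wrap the BigQuery streaming insert API, which is deliberately not used here.

```python
import json
from typing import Callable

class StreamingBuffer:
    """Micro-batches incoming events before a streaming insert.

    A simplified illustration: in production the flush callback
    would call the BigQuery client; here it is a plain function
    so the logic runs without any cloud dependency.
    """

    def __init__(self, flush: Callable[[list[dict]], None], batch_size: int = 3):
        self.flush = flush
        self.batch_size = batch_size
        self.pending: list[dict] = []

    def on_message(self, data: bytes) -> None:
        # A Pub/Sub message typically carries JSON-encoded bytes.
        self.pending.append(json.loads(data))
        if len(self.pending) >= self.batch_size:
            self.flush(self.pending)
            self.pending = []
```

Batching like this trades a little latency for far fewer insert calls, which is the usual tuning knob in streaming loads.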
Southend-on-Sea, England, United Kingdom Hybrid / WFH Options
TN United Kingdom
Client: Focus Cloud. Posted: 08.05.2025. Job Description: Position: GCP Data Engineer Employment Type … -based data infrastructure. Key skills: 5+ years of experience in data engineering, with at least 1–2 years hands-on with Google Cloud Platform. Strong experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow. Proficient in SQL, Python, and Apache Beam. Familiarity with … Rights to work in the UK is a must (No Sponsorship available). Responsibilities: Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP). Develop ETL processes using tools like Cloud Dataflow, Apache Beam, BigQuery, and Cloud Composer. Collaborate
Description Data System Reliability Engineer (dSRE) Role Overview: A crucial role in CME's Cloud data transformation, the data SRE will be aligned to data product pods, ensuring that our data infrastructure is reliable, scalable, and efficient as the GCP data footprint expands rapidly. Accountabilities: Automate data tasks … infrastructure management. Proficiency in data technologies such as relational databases, data warehousing, big data platforms (e.g., Hadoop, Spark), data streaming (e.g., Kafka), and cloud services (e.g., AWS, GCP, Azure). Programming skills in Python, Java, or Scala, with automation and scripting experience. Experience with containerization and orchestration tools like … skills with a proactive approach to issues. Excellent communication and collaboration skills. Background in cloud computing and data-intensive applications, especially Google Cloud Platform. 3+ years in data engineering or data science. Experience with data quality assurance and testing. Knowledge of GCP data services (BigQuery, Dataflow
Description Data System Reliability Engineer (dSRE) Role Overview: A crucial role in CME's Cloud data transformation, the data SRE will be aligned to data product pods, ensuring that our data infrastructure is reliable, scalable, and efficient as the GCP data footprint expands rapidly. Accountabilities: Automate data tasks … in a team-oriented environment. Ideally a background in cloud computing and data-intensive applications and services, with a focus on Google Cloud Platform. 3+ years of experience in data engineering or data science. Experience with data quality assurance and testing. Ideally knowledge of GCP data services (BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage). Understanding of logging and monitoring using tools such as Cloud Logging, ELK Stack, AppDynamics, New Relic, Splunk, etc. Proven ability to learn new technologies, including open source and
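The "data quality assurance and testing" requirement above can be illustrated with a minimal check for two common quality rules, completeness and key uniqueness; the column names are hypothetical.

```python
def check_quality(rows: list[dict], required: list[str], key: str) -> list[str]:
    """Return human-readable data quality issues for a batch of rows.

    Checks completeness (no required column empty) and uniqueness
    of the given key column. A toy stand-in for the checks a data
    SRE would automate; real suites cover types, ranges, and freshness.
    """
    issues: list[str] = []
    seen: set = set()
    for i, row in enumerate(rows):
        missing = [c for c in required if row.get(c) in (None, "")]
        if missing:
            issues.append(f"row {i}: missing {missing}")
        k = row.get(key)
        if k in seen:
            issues.append(f"row {i}: duplicate {key}={k!r}")
        seen.add(k)
    return issues
```

Checks like this are typically wired into the pipeline itself so bad batches fail loudly instead of loading silently.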
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
English Role: We are seeking an experienced GCP Data Architect to lead the design and development of cloud data architectures on Google Cloud Platform. This role is crucial in transforming our client’s enterprise data landscape to support scalability, security, and innovation. As a Data Architect … Key Skills: 7+ years of experience in data engineering or data architecture roles. 3+ years of hands-on experience architecting data solutions on Google Cloud Platform. Strong knowledge of BigQuery, Dataflow, Cloud Composer, Pub/Sub, and Cloud Functions. Proficient in data … with Infrastructure as Code (e.g., Terraform, Deployment Manager). Deep understanding of data governance, security, and compliance frameworks. GCP Professional Data Engineer or Professional Cloud Architect certification. Experience with hybrid or multi-cloud architectures. Familiarity with streaming platforms like Apache Kafka or Google Pub/Sub. Background
London, England, United Kingdom Hybrid / WFH Options
Noir
for building and maintaining scalable, efficient, and reliable data pipelines. You'll work across a modern tech stack with a strong focus on Google Cloud Platform (GCP) and collaborate with various teams to ensure data flows securely and accurately throughout the organisation. Key Responsibilities: Design, build, and maintain … robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and
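Cloud Composer, named above, is managed Apache Airflow, which runs pipeline tasks in dependency order as a DAG. A minimal sketch of that core idea using only the standard library (the task names are invented, and this is not the Airflow API itself):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract must finish before transform,
# and transform before both the load and the quality check.
deps = {
    "transform": {"extract"},
    "load": {"transform"},
    "quality_check": {"transform"},
}

def run_order(deps: dict[str, set[str]]) -> list[str]:
    """Return one valid execution order, as a scheduler would compute."""
    return list(TopologicalSorter(deps).static_order())
```

In real Airflow the same shape is expressed with operators and `>>` dependencies, and the scheduler also handles retries, backfills, and parallelism.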
Bath, England, United Kingdom Hybrid / WFH Options
Future
Apply testing principles to ensure code quality and fitness for purpose. Experience that will put you ahead of the curve: Experience using Python on Google Cloud Platform for Big Data projects, including BigQuery, DataFlow (Apache Beam), Cloud Run Functions, Cloud Run, Cloud Workflows, Cloud
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
self-service we make data easy and safe to use. We are a large enterprise building a cutting-edge data mesh architecture on Google Cloud Platform (GCP) to enable decentralised, self-serve data products across teams. We're crafting a golden path for development by integrating with GCP … systems. Strong programming skills in languages such as Python, Java, or Go, with a focus on building scalable, production-grade solutions. Experience with cloud platforms (AWS, GCP, or Azure). Hands-on knowledge of API development, automation, and Infrastructure as Code (IaC) tools (e.g., Terraform). Proven experience in … useful: Experience with data mesh concepts (e.g., domain-driven ownership and data product thinking). Expertise in GCP services, including BigQuery, Data Fusion, Cloud Composer, IAM, and others. Ability to define objectives and measure metrics. Relevant certifications in software engineering, cloud platforms, or related
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
building and maintaining robust, cloud-native data pipelines. Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow. Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake. Implementing Lakehouse architectures using tools like Databricks or … ve got solid experience working with Python, SQL, Spark, and data pipeline tools such as dbt or Airflow. You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer). You have a good
is looking for a Backend Engineer. You will: Develop and support AI pipelines with computer vision ML-derived geospatial assets, leveraging Kubernetes cloud services in GCP and Azure. Build massive data ingestion pipelines to ensure a continuous and reliable flow of image/video/GIS data into … to data for internal teams and external consumers, ensuring high availability and security. Workflow Orchestration: Implement and manage data workflows using Apache Airflow / Google Cloud Composer / Azure Data Factory to automate, schedule, and monitor data pipelines, ensuring efficient data processing and timely delivery. Data Storage and … data storage solutions. Experience with open-source and cloud-native databases and data stores such as Postgres, Bigtable, and Google Cloud Storage. Data Integration: Work closely with data scientists, operational analysts, and other stakeholders to integrate data from various sources, including video/GIS
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
aPriori Technologies
build business intelligence platform tools and interfaces, leveraging your expertise to help customers extract meaningful insights from their data. Utilising dbt, BigQuery, Airflow, Cloud Composer, and Looker within a multi-cloud (AWS + GCP) architecture, you will be responsible for driving cross-domain collaboration … technical background and proven experience managing software engineering teams, with a solid understanding of data quality and experience with one or more modern cloud data platforms. Location: Belfast (Hybrid - 2-3 days per week in office). Responsibilities: Lead and manage the data engineering team through design, development and … to lead software engineering teams. Proficiency in SQL and programming languages such as Python, Java, or Scala. Expertise leading teams and delivering in cloud data platforms (GCP, AWS). Familiarity with data modelling, warehousing, and database optimisation. Strong understanding of data governance, security, and compliance frameworks (GDPR, ISO
London, England, United Kingdom Hybrid / WFH Options
SODA
/ML user experience and help run our client robustly and at scale. Responsibilities: Build and maintain scalable data pipelines for BigQuery using Cloud Composer and Airflow. Provide observability and monitoring using Monte Carlo and Looker, as well as operational tools such as New Relic and Splunk
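A rough sketch of the kind of freshness check an observability setup like the one above might rely on; the SLA and table names are invented, and dedicated tools such as Monte Carlo do far more than this.

```python
from datetime import datetime, timedelta, timezone

def freshness_breaches(last_loaded: dict[str, datetime],
                       sla: timedelta,
                       now: datetime) -> list[str]:
    """Return tables whose most recent load is older than the SLA.

    A toy stand-in for a data freshness monitor: compare each
    table's last successful load time against an allowed staleness
    window and report the breaches, sorted for stable output.
    """
    return sorted(t for t, ts in last_loaded.items() if now - ts > sla)
```

In practice the load timestamps would come from pipeline metadata (e.g., Airflow task instances or warehouse audit tables) and breaches would page or alert rather than just be returned.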
Required Skills: Hands-on experience with GCP services (BigQuery, Cloud Storage, Dataflow, Pub/Sub, etc.). Proficient in SQL and Python. Experience with data modeling, ETL tools, and workflow orchestration (e.g., Apache Airflow, Cloud Composer). Familiarity with CI/CD practices and version control (e.g., Git). Good understanding of data governance and security best practices.