Permanent Cloud Composer Job Vacancies

1 to 25 of 39 Permanent Cloud Composer Jobs

Senior GCP Data Engineer

London, England, United Kingdom
Stack
Development: Design, develop, and maintain robust and scalable backend systems and APIs. Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load into Google Cloud environments. Data Transformation: Implement data transformation processes including cleansing, normalization, and aggregation to ensure data quality and consistency. Data Modeling: Develop and maintain data models and schemas to … support efficient data storage and retrieval on Google Cloud platforms. Data Integration: Integrate data from multiple sources (on-prem and cloud-based) using Cloud Composer or other tools. Data Lakes: Build data lakes using Google Cloud services such as BigQuery. Performance Optimization: Optimize data pipelines and queries for improved performance … Experience: Experience with data analytics and Big Data technologies. Knowledge of cloud security best practices and compliance standards. Experience with agile development methodologies. GCP Certifications (e.g., Google Cloud Certified Professional Cloud Developer). Seniority level: Mid-Senior level. Employment type: Contract. Job function: Information Technology. Industries: Software Development.
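
As an illustration of the ingestion responsibilities described above, the sketch below loads raw CSV files from Cloud Storage into BigQuery with the google-cloud-bigquery client. It is a minimal example, not taken from the listing; the project, bucket, and table names are hypothetical placeholders.

```python
# Minimal sketch: load CSV files from Cloud Storage into BigQuery.
# Assumes the google-cloud-bigquery package is installed and application
# default credentials are available; all resource names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.raw_layer.orders"           # hypothetical target table
source_uri = "gs://my-ingest-bucket/orders/*.csv"  # hypothetical source files

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # wait for the load job to complete

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```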

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Noir
Engineer, you’ll be responsible for building and maintaining scalable, efficient, and reliable data pipelines. You’ll work across a modern tech stack with a strong focus on Google Cloud Platform (GCP) and collaborate with various teams to ensure data flows securely and accurately throughout the organisation. Key Responsibilities: Design, build, and maintain robust data pipelines. Work with … Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data quality and compliance with governance standards.
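
For context on what orchestration with Cloud Composer typically looks like, here is a minimal Airflow DAG that runs a daily BigQuery transformation. It is an illustrative sketch assuming a Composer environment on Airflow 2.x with the Google provider installed; the DAG id, schedule, and SQL are hypothetical, not details from the listing.

```python
# Illustrative Cloud Composer (Airflow) DAG: run a daily BigQuery transformation.
# Requires the apache-airflow-providers-google package (preinstalled in Composer).
# Airflow 2.x accepts schedule_interval; newer releases prefer the schedule= kwarg.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_transform",     # hypothetical DAG id
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example"],
) as dag:
    # Cleanse and aggregate raw data into a reporting table (placeholder SQL).
    transform_orders = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.reporting.daily_orders` AS
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-project.raw_layer.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```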

Data Reliability Engineer (Data SRE)

Belfast, Northern Ireland, United Kingdom
CME Group
Description: Data System Reliability Engineer (dSRE). Role Overview: A crucial role in CME's Cloud data transformation, the data SRE will be aligned to data product pods, ensuring our data infrastructure is reliable, scalable, and efficient as the GCP data footprint expands rapidly. Accountabilities: Automate data tasks on GCP. Work with data domain owners, data scientists and … collaboration skills to work effectively in a team-oriented environment. Ideally a background in cloud computing and data-intensive applications and services, with a focus on Google Cloud Platform. 3+ years of experience in data engineering or data science. Experience with data quality assurance and testing. Ideally knowledge of GCP data services (BigQuery; Dataflow; Data Fusion … Dataproc; Cloud Composer; Pub/Sub; Google Cloud Storage). Understanding of logging and monitoring using tools such as Cloud Logging, ELK Stack, AppDynamics, New Relic, Splunk, etc. Proven ability to learn new technologies, including open source and cloud-native offerings. Knowledge of AI and ML tools is a plus. Google …
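
A data-reliability role like this often involves automated checks of the kind sketched below: a BigQuery freshness probe that could be scheduled and wired into alerting. The table, timestamp column, and threshold are hypothetical placeholders, not details from the posting.

```python
# Sketch of an automated data-freshness check, the kind of reliability probe a
# data SRE might schedule and feed into alerting. Table name, timestamp column,
# and threshold are hypothetical placeholders.
from datetime import timedelta

from google.cloud import bigquery

FRESHNESS_THRESHOLD = timedelta(hours=2)   # maximum acceptable staleness (placeholder)
TABLE = "my-project.analytics.events"      # hypothetical monitored table

def check_freshness(client: bigquery.Client) -> None:
    """Raise if the newest row in TABLE is older than the threshold."""
    query = f"""
        SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(event_ts), MINUTE) AS lag_minutes
        FROM `{TABLE}`
    """
    row = next(iter(client.query(query).result()))
    if row.lag_minutes is None or timedelta(minutes=row.lag_minutes) > FRESHNESS_THRESHOLD:
        # In a real setup this would raise an alert via Cloud Monitoring, PagerDuty, etc.
        raise RuntimeError(f"{TABLE} looks stale (lag_minutes={row.lag_minutes})")
    print(f"{TABLE} is fresh ({row.lag_minutes} minutes behind)")

if __name__ == "__main__":
    check_freshness(bigquery.Client())
```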

Data Engineer - Leading Fashion Company - London

London, England, United Kingdom
Hybrid / WFH Options
Noir
Engineer, you'll be responsible for building and maintaining scalable, efficient, and reliable data pipelines. You'll work across a modern tech stack with a strong focus on Google Cloud Platform (GCP) and collaborate with various teams to ensure data flows securely and accurately throughout the organisation. Key Responsibilities: Design, build, and maintain robust data pipelines. Work with … Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data quality and …

Senior DevOps Engineer

London, England, United Kingdom
Hybrid / WFH Options
BI:PROCSI
Remote-first, with 2 days at the office per month (Oxford Street, London). Overview: We are seeking a highly experienced DevOps Engineer with a strong background in Google Cloud Platform (GCP) and a proven track record in delivering complex data analytics projects for clients. In this full-time, permanent role, you will be responsible for designing, implementing, and … deployment processes that drive successful client engagements. You will work as part of a consultancy team, ensuring that each client engagement benefits from a robust, scalable, and secure cloud environment. Responsibilities: Design and implement scalable, reliable GCP infrastructures tailored to each client's unique project requirements, ensuring high performance, availability, and security. Work closely with client stakeholders, full … to ensure successful project delivery, adhering to client timelines and quality standards. Implement and manage real-time and batch data processing frameworks (e.g., Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs. Build and maintain robust monitoring, logging, and alerting systems for client projects, ensuring system health and performance are continuously optimised and cost-efficient.
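
To illustrate the batch-processing side of such engagements, the sketch below is a small PySpark job of the sort that might run on Dataproc: read raw files from Cloud Storage, aggregate them, and write the result to BigQuery. All paths and table names are placeholders, and writing to BigQuery assumes the spark-bigquery connector bundled with Dataproc images.

```python
# Sketch of a batch PySpark job of the sort that might run on Dataproc:
# read raw CSVs from Cloud Storage, aggregate, and write to BigQuery.
# Paths, table names, and the temporary bucket are hypothetical; the BigQuery
# write relies on the spark-bigquery connector available on Dataproc images.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-batch-aggregation").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("gs://my-ingest-bucket/clickstream/*.csv")  # placeholder input path
)

daily_counts = (
    raw.groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

(
    daily_counts.write.format("bigquery")
    .option("table", "my-project.analytics.daily_event_counts")  # placeholder table
    .option("temporaryGcsBucket", "my-temp-bucket")               # placeholder bucket
    .mode("overwrite")
    .save()
)

spark.stop()
```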

GCP Cloud Architect

London, UK
Infoplus Technologies UK Limited
Role: Cloud Data Architect Location: London, UK Duration: Contract Job Description: Key Responsibilities: Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives. Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Define and implement real-time and batch processing pipelines for complex use … degree in Computer Science, Data Engineering, or related technical field. 12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala …
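
Since the role calls for Dataflow (Apache Beam) pipelines, here is a minimal Beam pipeline in Python for orientation: it parses CSV lines from Cloud Storage and writes rows to BigQuery. This is a hedged sketch, not the employer's code; the bucket, table, and schema are hypothetical, and submitting it to Dataflow would require DataflowRunner pipeline options.

```python
# Minimal Apache Beam pipeline (the model behind Dataflow): parse CSV lines from
# Cloud Storage and write rows to BigQuery. Runs on the DirectRunner by default;
# add --runner=DataflowRunner, --project, --region, etc. to submit to Dataflow.
# Bucket, table, and schema are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    """Turn a line like 'order_id,amount' into a BigQuery-compatible dict."""
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

options = PipelineOptions()

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadLines" >> beam.io.ReadFromText(
            "gs://my-ingest-bucket/orders.csv", skip_header_lines=1)
        | "ParseCsv" >> beam.Map(parse_line)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```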

GCP Cloud Architect

City of London, London, United Kingdom
Infoplus Technologies UK Limited
Role: Cloud Data Architect Location: London, UK Duration: Contract Job Description: Key Responsibilities: Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives. Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Define and implement real-time and batch processing pipelines for complex use … degree in Computer Science, Data Engineering, or related technical field. 12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala …

Cloud Architect

City of London, London, United Kingdom
iXceed Solutions
Sr Cloud Data Architect | UK Location: United Kingdom (Work from Office – Onsite Presence is required) Language Requirement: Fluent in English (Spoken & Written) Key Responsibilities: Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives. Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Define … degree in Computer Science, Data Engineering, or related technical field. 12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala …

Cloud Architect

London Area, United Kingdom
iXceed Solutions
Sr Cloud Data Architect | UK Location: United Kingdom (Work from Office – Onsite Presence is required) Language Requirement: Fluent in English (Spoken & Written) Key Responsibilities: Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives. Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Define … degree in Computer Science, Data Engineering, or related technical field. 12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala …

Cloud Architect

South East London, England, United Kingdom
iXceed Solutions
Sr Cloud Data Architect | UK Location: United Kingdom (Work from Office – Onsite Presence is required) Language Requirement: Fluent in English (Spoken & Written) Key Responsibilities: Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives. Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Define … degree in Computer Science, Data Engineering, or related technical field. 12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience … with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala …

LEAD SRE

Southlake, Texas, United States
Hybrid / WFH Options
Futran Tech Solutions Pvt. Ltd
colleagues, and communities thrive in an ever-changing world. For additional information, visit . Job Description: Location - Southlake, TX (Hybrid Role). Job Type - Full Time. Position: An experienced Cloud Lead (GCP, AWS, Azure) able to review processes and resolve technical problems in the cloud. Should have the expertise to act as a cloud consultant for … applications migrating to the cloud and for planning disaster recovery mechanisms. Experienced in designing and implementing IaC using Terraform and in suggesting better approaches to maintaining the infrastructure. Skills: Must have: GKE, Compute, BigQuery, Cloud Run, Composer, Dataflow, Storage, Cloud SQL, IAM, Network, Cloud Build, IaC: Terraform, CSP (AWS, GCP, Azure), Terraform … GKE (Google Kubernetes Engine) and Compute Engine. Implement and maintain Infrastructure as Code (IaC) using Terraform. Data Engineering & Analytics: Build and optimize data pipelines using Dataflow, Cloud Composer, and BigQuery. Manage data storage and access using Cloud Storage and Cloud SQL. Application Deployment: Deploy and manage containerized applications …
Employment Type: Permanent
Salary: USD Annual
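
As a small illustration of the Cloud Run item in the stack above, the sketch below is a minimal containerisable HTTP service in Python. It assumes Flask and relies on the PORT environment variable that Cloud Run injects; the endpoints and service behaviour are hypothetical, not taken from the listing.

```python
# Minimal HTTP service of the kind deployed to Cloud Run. Cloud Run injects the
# PORT environment variable; the routes and responses here are illustrative.
import os

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/healthz")
def healthz():
    # Lightweight health endpoint for load balancers and uptime checks.
    return jsonify(status="ok")

@app.route("/")
def index():
    return jsonify(message="hello from a containerized service")

if __name__ == "__main__":
    # Cloud Run sets PORT; default to 8080 for local runs.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```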

LEAD SRE

Southlake, Texas, United States
Hybrid / WFH Options
Wipro
colleagues, and communities thrive in an ever-changing world. For additional information, visit . Job Description: Location - Southlake, TX (Hybrid Role). Job Type - Full Time. Position: An experienced Cloud Lead (GCP, AWS, Azure) able to review processes and resolve technical problems in the cloud. Should have the expertise to act as a cloud consultant for … applications migrating to the cloud and for planning disaster recovery mechanisms. Experienced in designing and implementing IaC using Terraform and in suggesting better approaches to maintaining the infrastructure. Skills: Must have: GKE, Compute, BigQuery, Cloud Run, Composer, Dataflow, Storage, Cloud SQL, IAM, Network, Cloud Build, IaC: Terraform, CSP (AWS, GCP, Azure), Terraform … GKE (Google Kubernetes Engine) and Compute Engine. Implement and maintain Infrastructure as Code (IaC) using Terraform. Data Engineering & Analytics: Build and optimize data pipelines using Dataflow, Cloud Composer, and BigQuery. Manage data storage and access using Cloud Storage and Cloud SQL. Application Deployment: Deploy and manage containerized applications …
Employment Type: Permanent
Salary: USD Annual

Lead Software Engineer

Bristol, England, United Kingdom
Hybrid / WFH Options
Lloyds Banking Group
strong focus on automation and self-service, we make data easy and safe to use. We are a large enterprise building a cutting-edge data mesh architecture on Google Cloud Platform (GCP) to enable decentralised, self-serve data products across teams. We're crafting a golden path for development by integrating with GCP APIs and leveraging its robust … engineering projects within large, distributed systems. Strong programming skills in languages such as Python, Java, or Go, with a focus on building scalable, production-grade solutions. Experience with cloud platforms (AWS, GCP or Azure). Hands-on knowledge of API development, automation, and Infrastructure as Code (IaC) tools (e.g., Terraform). Proven experience in containerisation and orchestration (e.g., Kubernetes … experience of these would be useful: Experience with data mesh concepts (e.g., domain-driven ownership and data product thinking). Expertise in GCP services, including BigQuery, Data Fusion, Cloud Composer, IAM, and others. Ability to define objectives and measure metrics. Relevant certifications in software engineering, cloud platforms or related areas. About working for us …
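
One concrete way "integrating with GCP APIs" can look in a data-mesh setting is programmatic, self-serve provisioning of domain-owned resources. The sketch below creates a labelled BigQuery dataset for a data product; it is an illustrative assumption, not Lloyds' actual tooling, and the project, domain, and label values are placeholders.

```python
# Sketch of self-serve provisioning via GCP APIs: create a BigQuery dataset for
# a domain-owned data product and tag it with ownership labels. The project,
# domain, product, and location values are hypothetical placeholders.
from google.cloud import bigquery

def create_data_product_dataset(project: str, domain: str, product: str) -> bigquery.Dataset:
    client = bigquery.Client(project=project)

    dataset = bigquery.Dataset(f"{project}.{domain}_{product}")
    dataset.location = "EU"
    dataset.labels = {"domain": domain, "data_product": product}

    # exists_ok=True keeps the call idempotent for repeated self-serve runs.
    return client.create_dataset(dataset, exists_ok=True)

if __name__ == "__main__":
    ds = create_data_product_dataset("my-project", "payments", "transactions")
    print(f"Provisioned dataset {ds.full_dataset_id} with labels {ds.labels}")
```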

Analytics Engineer

Rotterdam, Zuid-Holland, Netherlands
Crystalloids - Google Cloud Premier Partner
founded in 2006 and consists of qualified software and data specialists who help organizations innovate and grow their business by building data-driven solutions. We are a Premier Google Cloud partner. We believe that the foundation for a successful solution is based on three dimensions: the people, the processes and the technology. At Crystalloids we always take these … analytics. This means not only transforming raw data into meaningful insights using tools like dbt, Python, and SQL, but also contributing to core data engineering tasks within the Google Cloud Platform (GCP) ecosystem. Are you passionate about creating well-modeled, tested, and documented datasets that drive business value? Do you enjoy collaborating with both technical and business stakeholders … Experience: Over 3 years of hands-on experience and proficiency in Data Modeling, Data Engineering, and Data Analytics. Cloud Expertise: Hands-on experience with the Google Cloud Platform (GCP) ecosystem. Proficiency in Core Tools: dbt, BigQuery, Cloud Storage, Cloud Run, Cloud Composer/Apache Airflow, Dataform, and …
Employment Type: Permanent
Salary: EUR Annual
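
dbt and Dataform models are written in SQL, but the Python side of analytics engineering in this stack often means pulling modelled BigQuery data into a DataFrame for analysis, as sketched below. The table and columns are hypothetical, and the snippet assumes pandas and db-dtypes are installed alongside google-cloud-bigquery.

```python
# Sketch of the Python/BigQuery side of analytics-engineering work: query a
# modelled table (e.g., one built by dbt or Dataform) into a pandas DataFrame.
# Requires pandas and db-dtypes; the table and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT order_date, total_amount
    FROM `my-project.marts.daily_orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    ORDER BY order_date
"""

df = client.query(query).to_dataframe()
print(df.describe())  # quick sanity check of the last 30 days
```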

Senior Site Reliability Engineer (Remote)

United, Pennsylvania, United States
Hybrid / WFH Options
Experian
You will divide your time between system operations responsibilities, developing software and tools that help increase system reliability and performance, and leading projects to improve the health of Cloud operations. You will collaborate with software engineering teams and use similar technologies related to their software deliverables' design, deployment, and continued operations. You will report to the Manager … managers and other partners. Mentor teammates and lead technical interviews. Develop infrastructure abstraction layers to empower engineering teams. Tech Stack You'll Work With: Cloud Platforms: Google Cloud Platform (GCP), Amazon Web Services (AWS). Infrastructure as Code: Terraform, Atlantis. CI/CD & Artifact Management: GitHub Actions, Harness, Nexus. Containerization & Orchestration: Kubernetes, Helm. Workflow Orchestration: Airflow, Cloud Composer, GKE. Data & Messaging: BigQuery, CloudSQL, Pub/Sub. ML Pipelines & Serverless: Kubeflow Pipelines, Cloud Run, Cloud Functions. Monitoring & Visualization: Google Cloud Logging (Stackdriver), Looker. Languages: Python, Golang, Scala. 5+ years in a cloud-based infrastructure role with development and automation experience, at least 2 years with GCP …
Employment Type: Permanent
Salary: USD Annual
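
For the data and messaging portion of the stack above, the sketch below publishes a small JSON event to a Pub/Sub topic with the google-cloud-pubsub client. The project, topic, and event payload are hypothetical placeholders.

```python
# Sketch of publishing a JSON event to Pub/Sub. Requires the google-cloud-pubsub
# package and application default credentials; project, topic, and payload are
# hypothetical placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "pipeline-events")  # placeholder

event = {"pipeline": "daily_orders", "status": "succeeded"}
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    origin="sre-tooling",   # attributes are optional string key/value pairs
)
print(f"Published message id: {future.result()}")
```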

Senior Data Engineer

London, United Kingdom
Hybrid / WFH Options
CreateFuture
you'll be doing: Designing, building and maintaining robust, cloud-native data pipelines. Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow. Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake. Implementing Lakehouse architectures using tools like Databricks or Snowflake. Collaborating closely with engineers … talk to you if: You've got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow. You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer). You have a good understanding of data modelling, data …
Employment Type: Permanent
Salary: GBP Annual

Senior Data Engineer

Edinburgh, Scotland, United Kingdom
Hybrid / WFH Options
CreateFuture
you'll be doing: Designing, building and maintaining robust, cloud-native data pipelines. Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow. Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake. Implementing Lakehouse architectures using tools like Databricks or Snowflake. Collaborating closely with engineers … talk to you if: You’ve got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow. You’re comfortable working across cloud platforms – especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer). You have a good understanding of data modelling, data …

Senior Software Engineer

Manchester, England, United Kingdom
Hybrid / WFH Options
Lloyds Banking Group
need: Proven experience in software engineering. Strong programming skills in languages such as Python, Java or Go with a focus on building scalable, production-grade solutions. Experience with cloud platforms (AWS, GCP, or Azure). Hands-on knowledge of API development, automation, and Infrastructure as Code (IaC) tools (e.g., Terraform). Experience in containerisation and orchestration (e.g. Kubernetes … with ITIL processes and incident management. Ability to triage and prioritise incidents based on severity, ensuring timely resolution. Any experience of these would be really useful: Expertise in Google Cloud Platform (GCP) services, including BigQuery, Data Fusion, Cloud Composer, IAM, and others. Hands-on experience with Change Data Capture (CDC) architectural patterns and tools …

Senior Software Engineer

Manchester, England, United Kingdom
Lloyds Banking Group
need: Proven experience in software engineering. Strong programming skills in languages such as Python, Java or Go with a focus on building scalable, production-grade solutions. Experience with cloud platforms (AWS, GCP, or Azure). Hands-on knowledge of API development, automation, and Infrastructure as Code (IaC) tools (e.g., Terraform). Experience in containerisation and orchestration (e.g. Kubernetes … with ITIL processes and incident management. Ability to triage and prioritise incidents based on severity, ensuring timely resolution. Any experience of these would be really useful: Expertise in Google Cloud Platform (GCP) services, including BigQuery, Data Fusion, Cloud Composer, IAM, and others. Hands-on experience with Change Data Capture (CDC) architectural patterns and tools …

Software Developer

Zuid-Holland, Netherlands
Crystalloids - Google Cloud Premier Partner
and innovate since 2006. Our team of passionate software and data specialists is dedicated to building smart, data-driven solutions that make a real difference. As a Premier Google Cloud Partner, we use the latest technology to help our customers solve challenges and unlock new opportunities. We believe that the foundation for a successful solution is based on … Cloud Run, Firestore, Apigee, Cloud Composer/Apache Airflow, Pub/Sub, Dataform (deincludbt), and Terraform (infrastructure as code). Familiarity with Google Cloud Services : AppEngine, Endpoints, BigQuery, Cloud Storage, Cloud Dataflow, and Cloud Datastore. Programming languages: Java/Python, SQL Methodologies : Experience with Agile development More ❯
Employment Type: Permanent
Salary: EUR Annual
Posted:

Software Developer

Rotterdam, Zuid-Holland, Netherlands
Crystalloids - Google Cloud Premier Partner
and innovate since 2006. Our team of passionate software and data specialists is dedicated to building smart, data-driven solutions that make a real difference. As a Premier Google Cloud Partner, we use the latest technology to help our customers solve challenges and unlock new opportunities. We believe that the foundation for a successful solution is based on … Cloud Run, Firestore, Apigee, Cloud Composer/Apache Airflow, Pub/Sub, Dataform (deincludbt), and Terraform (infrastructure as code). Familiarity with Google Cloud Services : AppEngine, Endpoints, BigQuery, Cloud Storage, Cloud Dataflow, and Cloud Datastore. Programming languages: Java/Python, SQL Methodologies : Experience with Agile development More ❯
Employment Type: Permanent
Salary: EUR Annual
Posted:

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
SODA
sources that build our AI/ML user experience and help run our client robustly and at scale. Responsibilities: Build and maintain scalable data pipelines for BigQuery using Cloud Composer and Airflow; Provide observability and monitoring using Monte Carlo and Looker as well as operational tools such as NewRelic and Splunk, driving reliability, data quality, data …

Head of Innovation

Zuid-Holland, Netherlands
Crystalloids - Google Cloud Premier Partner
About Crystalloids: Founded in 2006, Crystalloids is a Premier Google Cloud Partner specialized in building data-driven, cloud-native solutions. Our talented team of software and data specialists empowers leading organizations such as Rituals, Swiss Sense, FD Mediagroep, and Body & Fit to grow and innovate using the power of cloud technology. At Crystalloids, we … Innovation Management & Delivery: Build frameworks for ideation, experimentation, and validation. Facilitate idea generation across departments and shepherd MVPs into operational success. Contribute technically to innovation projects, especially using Google Cloud technologies. Productize repeatable solutions and manage the full product lifecycle. Champion internal education and mentor tech leads on innovation best practices. Commercial Enablement: Partner with Sales and Marketing … Experience and qualifications: Education: Master's degree in Computer Science, Informatics, Software Engineering, or a related technical discipline. Cloud Expertise: Proven hands-on experience with the Google Cloud Platform (GCP) ecosystem; strong understanding of cloud-native solution patterns and modern data architectures. Product Management Experience: Over 7 years of experience in product management, solution …
Employment Type: Permanent
Salary: EUR Annual

Head of Innovation

Rotterdam, Zuid-Holland, Netherlands
Crystalloids - Google Cloud Premier Partner
About Crystalloids: Founded in 2006, Crystalloids is a Premier Google Cloud Partner specialized in building data-driven, cloud-native solutions. Our talented team of software and data specialists empowers leading organizations such as Rituals, Swiss Sense, FD Mediagroep, and Body & Fit to grow and innovate using the power of cloud technology. At Crystalloids, we … Innovation Management & Delivery: Build frameworks for ideation, experimentation, and validation. Facilitate idea generation across departments and shepherd MVPs into operational success. Contribute technically to innovation projects, especially using Google Cloud technologies. Productize repeatable solutions and manage the full product lifecycle. Champion internal education and mentor tech leads on innovation best practices. Commercial Enablement: Partner with Sales and Marketing … Experience and qualifications: Education: Master's degree in Computer Science, Informatics, Software Engineering, or a related technical discipline. Cloud Expertise: Proven hands-on experience with the Google Cloud Platform (GCP) ecosystem; strong understanding of cloud-native solution patterns and modern data architectures. Product Management Experience: Over 7 years of experience in product management, solution …
Employment Type: Permanent
Salary: EUR Annual

Software Engineering Manager, Data

Belfast, Northern Ireland, United Kingdom
Hybrid / WFH Options
aPriori Technologies
product and domain experts to build business intelligence platform tools and interfaces, leveraging your expertise to help customers extract meaningful insights from their data. Utilising dbt, BigQuery, Airflow, Cloud Composer and Looker within a multi-cloud (AWS + GCP) architecture, you will be responsible for driving cross-domain collaboration through a rapidly scalable data … candidate will have a strong technical background and proven experience managing software engineering teams with a solid understanding of data quality, and experience with one or more modern cloud data platforms. Location: Belfast (hybrid – 2-3 days per week in office). Responsibilities: Lead and manage the data engineering team through design, development and delivery of a modern data … s goals. Requirements: Proven ability to lead software engineering teams. Proficiency in SQL and programming languages such as Python, Java or Scala. Expertise leading teams and delivering in cloud data platforms (GCP, AWS). Familiarity with data modelling, warehousing, and database optimisation. Strong understanding of data governance, security, and compliance frameworks (GDPR, ISO 27001, etc.). Education and Experience: Bachelor …
Cloud Composer Salary Percentiles (Permanent Jobs)
10th Percentile: £71,337
25th Percentile: £102,420
Median: £131,666
75th Percentile: £142,438
90th Percentile: £146,300