The Site Reliability Engineering (SRE) team at Pendo is responsible for provisioning and maintaining cloud infrastructure from development through production for all product initiatives, and for working with developers and product managers to ensure that our products are not only reliable and performant but also cost-efficient. Our platform … is built on Google Kubernetes Engine (GKE) and uses several other Google technologies such as Memorystore, Cloud Datastore, Pub/Sub, Cloud Functions, BigQuery, and Vertex AI, as well as services from other vendors such as Amazon SES. In the development process, SREs provide developers with … pipelines and development environments to facilitate frequent delivery of new product features. In production, SREs perform Tier 1 on-call and incident management functions, supporting a high-throughput platform that processes more than 15 billion events per day. To ensure the reliability of this environment for our customers …
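For a sense of what feeding an event pipeline like this looks like, here is a minimal sketch of publishing a product event to Pub/Sub using the google-cloud-pubsub client library. The project ID, topic name, and event fields are hypothetical placeholders, not details from the listing.

```python
# Minimal Pub/Sub publish sketch. "my-project" and "product-events" are
# hypothetical names; the client API is from google-cloud-pubsub.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "product-events")

event = {"type": "feature_used", "account_id": "abc123", "ts": "2024-01-01T00:00:00Z"}

# Pub/Sub payloads are raw bytes, so the event is JSON-encoded first.
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(f"Published message {future.result()}")
```

At 15 billion events per day, real publishers would batch and publish asynchronously rather than block on each `future.result()`; the blocking call here just keeps the sketch self-contained.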
Required Qualifications: 12+ years of experience in data architecture, cloud computing, and real-time data processing. Hands-on experience with Apache Kafka (Confluent), Cassandra, and related technologies. Strong expertise in GCP, including real-time services experience with GCP services such as Pub/Sub, Cloud Functions, Datastore, and Cloud Spanner. Experience with message queues (e.g., RabbitMQ) and event-driven patterns. Hands-on experience with data serialization formats (e.g., Avro, Parquet, JSON) and schema registries. Strong understanding of DevOps and CI/CD pipelines for data streaming solutions. Familiarity with containerization and orchestration …
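To illustrate the Avro-plus-schema-registry combination the qualifications call for, here is a hedged sketch of producing an Avro-encoded record to Kafka via a Confluent Schema Registry, using the confluent-kafka Python client. The broker and registry URLs, topic name, and schema are illustrative assumptions.

```python
# Sketch: Avro serialization against a Confluent Schema Registry.
# URLs, topic, and schema below are assumed for illustration only.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import MessageField, SerializationContext

SCHEMA = """
{
  "type": "record",
  "name": "PageView",
  "fields": [
    {"name": "user_id", "type": "string"},
    {"name": "url", "type": "string"}
  ]
}
"""

registry = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumed registry
serializer = AvroSerializer(registry, SCHEMA)

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker
topic = "page-views"  # hypothetical topic

record = {"user_id": "u-42", "url": "/pricing"}

# The serializer registers/looks up the schema and prepends its registry ID
# to the Avro payload (the Confluent wire format), so consumers can decode it.
producer.produce(
    topic=topic,
    value=serializer(record, SerializationContext(topic, MessageField.VALUE)),
)
producer.flush()
```

The registry lookup is what makes schema evolution safe: producers and consumers agree on the schema by ID rather than by convention.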
Runcorn, Cheshire, North West, United Kingdom Hybrid / WFH Options
Forward Role
… their products + much more.

Key Responsibilities: Develop and maintain ETL/ELT data pipelines using Python and SQL. Work with enterprise-level cloud platforms, ideally GCP (BigQuery, Airflow, Cloud Functions). Integrate APIs and process data from multiple sources. Design and optimise data … dashboards using tools like Tableau or Power BI.

Required Skills: Strong experience with Python and SQL for data manipulation and automation. Expertise in cloud platforms, particularly GCP or similar enterprise-level cloud environments (AWS/Azure). Solid experience with data warehousing and ETL/ELT processes. …
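As a concrete example of the responsibilities above, here is a minimal sketch of an Airflow DAG that pulls rows from an API and loads them into BigQuery on a daily schedule. The endpoint URL and the `analytics.orders` dataset/table are hypothetical placeholders, and the DAG assumes the target table already exists.

```python
# Sketch of a daily API-to-BigQuery ETL pipeline using Airflow's TaskFlow API.
# The API URL and BigQuery dataset/table names are assumed placeholders.
from datetime import datetime

import requests
from airflow.decorators import dag, task
from google.cloud import bigquery


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def api_to_bigquery():
    @task
    def extract() -> list[dict]:
        # Hypothetical source endpoint returning a JSON array of row dicts.
        resp = requests.get("https://api.example.com/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(rows: list[dict]) -> None:
        client = bigquery.Client()
        # Streaming insert into an existing table; returns a list of row errors.
        errors = client.insert_rows_json("analytics.orders", rows)
        if errors:
            raise RuntimeError(f"BigQuery insert errors: {errors}")

    load(extract())


api_to_bigquery()
```

For larger volumes, a batch load job (or an ELT pattern that lands raw files in Cloud Storage and transforms in SQL inside BigQuery) would usually replace the streaming insert shown here.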