performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or … similar big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus.
expertise in designing and deploying data architectures for high-velocity, high-throughput systems. Strong proficiency in real-time data streaming technologies such as Kafka, Pulsar, and RabbitMQ. Extensive experience with high-performance databases, including PostgreSQL, ClickHouse, Cassandra, and Redis. In-depth knowledge of ETL/ELT pipelines, data transformation
in live production systems. Experience with Messaging Systems: You have experience with distributed systems that use some form of messaging system (e.g., RabbitMQ, Kafka, or Pulsar). The role focuses on RabbitMQ, and you will have time to acquire deep knowledge of it. Programming Proficiency: You have some
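To illustrate the messaging model behind brokers like RabbitMQ, here is a toy in-memory sketch of direct-exchange routing (a publisher sends to an exchange, which delivers to every queue bound with a matching routing key). This is an invented illustration of the concept only, not RabbitMQ's actual client API; all class and variable names here are hypothetical.

```python
from collections import defaultdict, deque

class DirectExchange:
    """Toy direct exchange: routes each message to every queue bound
    with a matching routing key (mirrors AMQP direct-exchange semantics)."""
    def __init__(self):
        self.bindings = defaultdict(list)  # routing_key -> list of queues

    def bind(self, queue, routing_key):
        self.bindings[routing_key].append(queue)

    def publish(self, routing_key, body):
        # Deliver a copy of the message to every bound queue.
        for q in self.bindings[routing_key]:
            q.append(body)

# Usage: one producer, two queues with overlapping bindings.
exchange = DirectExchange()
orders, audit = deque(), deque()
exchange.bind(orders, "order.created")
exchange.bind(audit, "order.created")    # fan-out to a second consumer
exchange.bind(audit, "order.cancelled")  # audit also sees cancellations

exchange.publish("order.created", {"id": 1})
exchange.publish("order.cancelled", {"id": 2})
```

In a real deployment the exchange and queues live inside the broker process, consumers pull (or are pushed) messages over AMQP connections, and durability and acknowledgements decide what survives a crash; the routing semantics, however, follow this shape.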
and projects - and depending on your strengths and interests, you'll have the opportunity to move between them. Technologies we use: Java, Kotlin, Kubernetes, Apache Kafka, GCP, BigQuery, Spark. Our Culture: While we're looking for professional skills, culture is just as important to us. We understand that everyone … Knowledge of data structures. Experience with Kubernetes or Docker. Preferred Qualifications: Experience with at least one cloud platform. Experience with message brokers (Kafka, RabbitMQ, Pulsar, etc.). Experience in setting up data platforms and standards, not just pipelines. Experience with distributed data processing frameworks (e.g., Spark or Flink).