Processing & Optimization: Build efficient, high-performance systems by leveraging techniques such as data denormalization, partitioning, caching, and parallel processing. Develop stream-processing applications using Apache Kafka and optimize performance for large-scale datasets. Enable data enrichment and correlation across primary, secondary, and tertiary sources.

Cloud, Infrastructure, and Platform Engineering: … Collaborate with platform teams to ensure scalability, resilience, and observability of data pipelines.

Database Engineering: Write and optimize complex SQL queries on relational (Redshift, PostgreSQL) and NoSQL (MongoDB) databases. Work with the ELK stack (Elasticsearch, Logstash, Kibana) for search, logging, and real-time analytics. Support Lakehouse architectures and hybrid data … technologies and automation strategies.

Required Skills & Qualifications: 8+ years of experience in data engineering within a production environment. Advanced knowledge of Python and Linux shell scripting for data manipulation and automation. Strong expertise in SQL/NoSQL databases such as PostgreSQL and MongoDB. Experience building stream-processing systems.
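To make the optimization techniques above concrete, here is a minimal, hedged sketch combining three of them — partitioning, caching, and parallel processing — using only the Python standard library. In a production system the rows would typically arrive via a Kafka consumer and the enrichment would hit a real lookup service; the function and field names here (`enrich`, `category_name`) are illustrative only, not from the posting.

```python
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache

@lru_cache(maxsize=None)
def enrich(category: str) -> str:
    # Hypothetical expensive lookup (e.g. a dimension-table fetch).
    # lru_cache ensures each distinct key is resolved only once.
    return category.upper()

def partition(rows, n):
    """Split rows into roughly n equal chunks for parallel workers."""
    size = -(-len(rows) // n)  # ceiling division
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def process_chunk(chunk):
    # Denormalize: attach the enriched category to each row so
    # downstream consumers need no join.
    return [{**row, "category_name": enrich(row["category"])} for row in chunk]

def process_parallel(rows, workers=4):
    chunks = partition(rows, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process_chunk, chunks)
    return [row for chunk in results for row in chunk]

rows = [{"id": i, "category": "ab"[i % 2]} for i in range(8)]
out = process_parallel(rows)
```

The same chunk-then-fan-out shape carries over to real stream processors: each Kafka partition plays the role of a chunk, and each consumer in a consumer group plays the role of a worker.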
… optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack, including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues.

Qualifications: Bachelor's/Master's degree in Computer Science, Information Technology, or an equivalent subject. Strong proficiency in Python. Confident using Linux systems and tools. Excellent problem-solving abilities and the capacity to work autonomously and adapt to a flexible, evolving environment.
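As a flavor of the infrastructure-as-code duty mentioned above, a Terraform resource for pipeline infrastructure might look like the following sketch. This is an illustrative fragment, not the company's actual configuration; the resource and bucket names are invented.

```terraform
# Hypothetical example: an S3 bucket for pipeline staging data,
# declared in code so every change goes through version control and review.
resource "aws_s3_bucket" "pipeline_staging" {
  bucket = "example-pipeline-staging"

  tags = {
    team    = "data-engineering"
    managed = "terraform"
  }
}
```

Keeping such resources in Terraform is what lets the team "optimize automation processes": environments become reproducible, and cost-relevant settings are auditable in one place.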