with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team.
have 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations. Mastery of SQL, Python, and ETL using big data tools (Hive/Presto, Redshift). Previous experience with web frameworks for Python such as Django/Flask is a plus. Experience writing data pipelines using Airflow.
with streaming and batch compute frameworks such as Spring Kafka, Kafka Streams, Flink, Spark Streaming, and Spark. Experience with large-scale computing platforms such as Hadoop, Hive, Spark, and NoSQL stores. Exposure to UI development of visualisations in modern web apps (e.g. Streamlit, Retool) is nice to have. Experience with AWS …
Disaster recovery processes/tools. Experience in troubleshooting and problem resolution. Experience in System Integration. Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming. Experience of ETL tools incorporating Big Data. Shell Scripting, Python. Beneficial Skills: Understanding of LAN, WAN, VPN, and …
/Spring web application development. Experience with Test-Driven Development and Agile methodologies; Behavior-Driven Development is a plus. Knowledge of Hadoop, Big Data, Hive, Pig, and NoSQL is a plus, though most engineers with this background may have limited REST experience. Additional Information: All your information will be kept confidential.
proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one …
in software architecture, object-oriented design principles, and data structures. Extensive experience in developing microservices using Java and Python. Experience with distributed computing frameworks such as Hive/Hadoop and Apache Spark. Good experience in Test-Driven Development and automating test cases using Java/Python. Experience in SQL/NoSQL … following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda. Working experience with Terraform. Experience in creating workflows for Apache Airflow. About Roku: Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and …