on the team to elevate technology and consistently apply best practices. Qualifications for Software Engineer: Hands-on experience working with technologies like Hadoop, Hive, Pig, Oozie, MapReduce, Spark, Sqoop, Kafka, Flume, etc. Strong DevOps focus and experience building and deploying infrastructure with cloud deployment technologies like Ansible and Chef …
development methodologies. Prior work on cloud computing platforms. Hands-on experience with other big data tools such as Oozie, YARN, Spark, SparkSQL, Flume, Sqoop2, Pig, Drill, Kafka, Elastic. Familiar with the financial services industry and/or regulatory environments. Able to demonstrate active participation in the big data, analytics …
products and a good understanding of Digital Marketing and Marketing Technologies. Have experience working with Big Data technologies (such as Hadoop, MapReduce, Hive/Pig, Cassandra, MongoDB, etc.). An understanding of web technologies such as JavaScript, Node.js and HTML. Some level of understanding or experience in AI/ML.
of Elasticsearch and understanding of the Hadoop ecosystem. Experience working with large datasets and distributed computing tools such as Map/Reduce, Hadoop, Hive, Pig, etc. Advanced skills in Excel for analytical purposes. An MSc or PhD in Data Science, an analytical subject (Physics, Mathematics, Computing), or other quantitative …
Our team values continuous learning, knowledge sharing, and creating inclusive solutions that make a difference.
Key Responsibilities
- Support customers with big data services including Apache Spark, Hive, Presto, and other Hadoop ecosystem components
- Develop and share technical solutions through various communication channels
- Contribute to improving support processes and customer …
… work week schedule, which may include weekends on rotation.
BASIC QUALIFICATIONS
- Good depth of understanding of Hadoop administration, support, and troubleshooting (any two applications: Apache Spark, Apache Hive, Presto, MapReduce, ZooKeeper, HBase, HDFS and Pig)
- Good understanding of Linux and networking concepts
- Intermediate programming/scripting skills