Hadoop Distributed File System (HDFS) Jobs

3 Hadoop Distributed File System (HDFS) Jobs

Appian Software Engineer

Chicago, Illinois, United States
Hybrid / WFH Options
Request Technology - Robyn Honquest
… (required)
- Experience with distributed message brokers using Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required)
- Experience working with various types of databases, e.g. relational, NoSQL, object-based, graph …
Employment Type: Permanent
Salary: USD 145,000 Annual
Posted:

Associate Principal, Appian Development

Dallas, Texas, United States
Request Technology
… solutions
- Embrace industry best practices like continuous integration, continuous deployment, automated testing, TDD, etc.
- Follow agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, and audit requirements, and that security rules are upheld.
… (required)
- Experience with distributed message brokers using Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required)
- Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI …
Employment Type: Permanent
Salary: USD 150,000 Annual
Posted:

Associate Principal, Appian Development

Chicago, Illinois, United States
Request Technology
… solutions
- Embrace industry best practices like continuous integration, continuous deployment, automated testing, TDD, etc.
- Follow agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, and audit requirements, and that security rules are upheld.
… (required)
- Experience with distributed message brokers using Kafka (required)
- Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required)
- Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI …
Employment Type: Permanent
Salary: USD 150,000 Annual
Posted: