Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience with the following software/tools: big data tools such as Hadoop, Spark, and Kafka; relational SQL and NoSQL databases, including Postgres and Cassandra; and data pipeline and workflow management tools such as Azkaban and Luigi …
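To give a flavour of the pipeline work this kind of posting describes, here is a minimal sketch using Spark's Java API to read raw events, aggregate them, and load the result into Postgres over JDBC. The input path, column and table names, and connection details are illustrative assumptions, not requirements from any listing, and the Postgres JDBC driver is assumed to be on the classpath.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class EventsToPostgres {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("events-to-postgres")
                .getOrCreate();

        // Read raw JSON events (path and schema are assumed for illustration).
        Dataset<Row> events = spark.read().json("hdfs:///data/events/");

        // Simple aggregation: count events per type.
        Dataset<Row> counts = events.groupBy("event_type").count();

        // Write the result to a Postgres table over JDBC (connection details assumed).
        counts.write()
                .format("jdbc")
                .option("url", "jdbc:postgresql://localhost:5432/analytics")
                .option("dbtable", "event_counts")
                .option("user", "analytics")
                .option("password", "change-me")
                .mode("overwrite")
                .save();

        spark.stop();
    }
}
```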
automation & configuration management Ansible (plus Puppet, Saltstack), Terraform, CloudFormation NodeJS, REACT/MaterialUI (plus Angular), Python, JavaScript Big data processing and analysis, e.g. ApacheHadoop (CDH), Apache Spark RedHat Enterprise Linux, CentOS, Debian or Ubuntu Java 8, Spring framework (preferably Spring boot), AMQP - RabbitMQ Open source technologies Experience of More ❯
other engineers on the team to elevate technology and consistently apply best practices. Qualifications for Software Engineer: hands-on experience working with technologies like Hadoop, Hive, Pig, Oozie, MapReduce, Spark, Sqoop, Kafka, Flume, etc.; strong DevOps focus and experience building and deploying infrastructure with cloud deployment technologies like …
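For readers less familiar with the Hadoop stack listed above, the classic MapReduce word count (Java API) below illustrates the map/reduce model these roles assume: the mapper emits (word, 1) pairs and the reducer sums them. Input and output paths are supplied on the command line; this is a standard illustrative example rather than anything specific to the posting.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            // Emit (token, 1) for every whitespace-separated token in the line.
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum the counts emitted for each word.
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```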
understanding of Java and its ecosystems, including experience with popular Java frameworks (e.g. Spring, Hibernate). Familiarity with big data technologies and tools (e.g. Hadoop, Spark, NoSQL databases). Strong experience with Java development, including design, implementation, and testing of large-scale systems. Experience working on public sector projects …
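As a small illustration of the Spring/Hibernate familiarity these Java roles ask for, the sketch below defines a minimal JPA entity that Hibernate could map to a table. The class name, fields, and implied table are invented for the example, and it assumes the jakarta.persistence API (Hibernate 6 / Spring Boot 3).

```java
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;

// Minimal JPA/Hibernate entity; table and fields are hypothetical.
@Entity
public class CaseRecord {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String reference;
    private String status;

    protected CaseRecord() {
        // No-arg constructor required by JPA.
    }

    public CaseRecord(String reference, String status) {
        this.reference = reference;
        this.status = status;
    }

    public Long getId() { return id; }
    public String getReference() { return reference; }
    public String getStatus() { return status; }
}
```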
Degree. Required technical and professional expertise: design, develop, and maintain Java-based applications for processing and analyzing large datasets, utilizing frameworks such as Apache Hadoop, Spark, and Kafka. Collaborate with cross-functional teams to define, design, and ship data-intensive features and services. Optimize existing data processing pipelines for … Information Technology, or a related field, or equivalent experience. Experience in Big Data Java development. In-depth knowledge of Big Data frameworks, such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing …
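Finally, as an illustration of the Java-plus-Kafka data processing this listing describes, the sketch below is a minimal Kafka consumer that polls a topic and is where downstream ETL logic (parsing, enrichment, loading) would plug in. The broker address, topic, and consumer group are assumed values, not details from the posting.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ClickstreamConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "clickstream-etl");         // assumed consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("clickstream")); // assumed topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // In a real pipeline, parsing, enrichment and loading into a
                    // warehouse or data lake would happen here.
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}
```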