real-time systems, and business management information systems) based upon documented requirements for the Data Transport System (DTS)
• DTS products include but are not limited to:
Cloud storage: Apache Accumulo (Apache ZooKeeper, Apache Hadoop), Oracle DBMS
Real-time streaming: Storm
Distributed in-memory data cache/storage: Redis
Graph compute engine/query interface: Apache TinkerPop …/Gremlin
Rules engines: JBoss Drools, Apache Camel
Spring Framework: used extensively to standardize/simplify configuration, logic control, data access, security, Web tier, etc.
Candidates will:
o Analyze user requirements to derive software design and performance requirements
o Debug existing software and correct defects
o Design and code new software or modify existing software to add new features … DoD) and other federal partners
• The DTS portfolio encompasses transport of streams, messages, and files with content sizes ranging from bytes to terabytes
• Candidates should have experience writing analytics using Apache Hadoop, HDFS, and MapReduce
• Experience processing large data sets or high-volume data ingest is a plus
• Experience monitoring, maintaining, and troubleshooting Apache Accumulo, Apache Hadoop, and More ❯
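The analytics experience called out above (Hadoop, HDFS, MapReduce) is usually demonstrated with the canonical word-count job. Below is a minimal sketch of the map and reduce phases in plain Python; the function names and the in-memory shuffle step are illustrative stand-ins for what Hadoop does across a cluster, not Hadoop APIs.

```python
from collections import defaultdict
from itertools import groupby

def map_phase(line):
    # Emit (word, 1) pairs, as a Hadoop streaming mapper would write to stdout.
    for word in line.lower().split():
        yield word, 1

def reduce_phase(word, counts):
    # Sum all counts for a single key, as a streaming reducer would.
    return word, sum(counts)

def run_job(lines):
    # Simulate the shuffle/sort step Hadoop performs between map and reduce.
    pairs = sorted(kv for line in lines for kv in map_phase(line))
    return dict(reduce_phase(k, (c for _, c in g))
                for k, g in groupby(pairs, key=lambda kv: kv[0]))

print(run_job(["big data big ideas", "data pipelines"]))
# → {'big': 2, 'data': 2, 'ideas': 1, 'pipelines': 1}
```

In a real MapReduce job the mapper and reducer run as separate processes over HDFS splits; the sorted-`groupby` line here is only a single-machine stand-in for the distributed shuffle.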
tolerant, and real-time distributed system able to process millions of transactions daily. Design, develop, and release high-quality, scalable, and maintainable code. Learn about open-source technologies like Apache Kafka, Apache Cassandra, Apache ZooKeeper, and Docker. Get an up-close view of the global financial markets while solving challenging real-world problems. Your work More ❯
development Relational Databases (Amazon RDS, PostgreSQL, MySQL, SQL, Percona), HAProxy, MongoDB, Java/Spring Boot troubleshooting (error logs), Kafka Message Broker, Hazelcast Memory Cache, NiFi maintenance and configuration, Kubernetes, ZooKeeper, Nginx Web Server, Monitoring and Alerting, Agile software development practices, Jira and Confluence. Salary Range: $150,000 - $220,000. The BlueHalo, an AV Company, pay range for this job More ❯
experience with RESTful APIs Experience developing and performing ETL tasks in a Linux environment Preferred Qualifications: Experience with Hadoop, HBase, MapReduce Experience with Elasticsearch Experience with NiFi, Kafka, and ZooKeeper Clearance Requirements: An active TS/SCI with Polygraph Physical Requirements: Use hands to operate a computer and other office productivity machinery, such as a calculator, copy machine and More ❯
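The ETL qualification above is routinely probed in interviews with a small extract-transform-load exercise. A minimal sketch in Python follows; the field names, the whitespace/upper-casing transform, and the newline-delimited-JSON target are arbitrary examples for illustration, not part of any listed stack.

```python
import csv
import io
import json

def extract(csv_text):
    # Extract: parse raw CSV into dicts keyed by the header row.
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    # Transform: trim whitespace and upper-case the (hypothetical) name field.
    return [{**row, "name": row["name"].strip().upper()} for row in rows]

def load(rows):
    # Load: serialize to newline-delimited JSON, a common ingest format.
    return "\n".join(json.dumps(row, sort_keys=True) for row in rows)

raw = "name,qty\n alice ,3\nbob,5"
print(load(transform(extract(raw))))
```

On Linux this pattern typically runs under cron or a workflow scheduler, reading from and writing to files or pipes rather than in-memory strings.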
Services, Microservices (Docker & Kubernetes), REST, XML, UML Development tools: Eclipse or NetBeans Deployment architectures and CI/CD pipelines using Docker, Git/JIRA, Kubernetes, Jenkins, Conductor, Kafka/Zookeeper, Consul, CMDB Big data technologies, including: Data ingest: JSON, Kafka, Microservices, Elasticsearch Analytics: Hive, Spark, R, Pig, Oozie workflows Hadoop ecosystem: Hive data, Oozie, Spark, Pig, Impala, Hue COTS … integration: Knowi, MongoDB, Oracle, MySQL RDS, Elastic, Logstash, Kibana, Zookeeper, Consul, Hadoop/HDFS Containerization/configuration tools: Docker, Chef North Point Technology is THE BEST place to work for curious-minded engineers motivated to support our country's most crucial missions! We focus on long term projects, leveraging the latest technology in support of innovative solutions to solve More ❯
Web services, Microservices leveraging Docker & Kubernetes, REST, XML, UML, Eclipse or NetBeans, Deployment architectures including Continuous Integration Pipeline build concepts leveraging: Docker, Git/JIRA, Kubernetes, Jenkins, Conductor, Kafka/ZooKeeper, Consul, CMDB. Big data technologies to include: Data Ingest (JSON, Kafka, Microservices, Elasticsearch), Analytics (Hive, Spark, R, Pig, Oozie workflows), Elasticsearch, Hadoop (Hive data, Oozie, Spark, Pig, Impala … Hue), COTS Integration (Knowi, MongoDB, Oracle, MySQL RDS, Elastic, Logstash, Kibana, ZooKeeper, Consul, Hadoop/HDFS), Docker and Chef Who we are: Reinventing Geospatial, Inc. (RGi) is a fast-paced small business that has the environment and culture of a start-up, with the stability and benefits of a well-established firm. We solve complex problems within geospatial software More ❯
Annapolis Junction, Maryland, United States Hybrid / WFH Options
Base-2 Solutions, LLC
Eclipse or similar development environment, MAVEN, RESTful web services. Cloud and Distributed Computing Technologies: at least one or a combination of several of the following areas - YARN, J2EE, MapReduce, Zookeeper, HDFS, HBase, JMS, Concurrent Programming, Multi-Node implementation/installation and other applicable technologies. Cloud and Distributed Computing Information Retrieval: at least one or a combination of several of … the following areas - HDFS, HBASE, Apache Lucene, Apache Solr, MongoDB. Ingesting, Parsing and Analysis of Disparate Data-sources and formats: XML, JSON, CSV, Binary Formats, Sequence or Map Files, Avro and related technologies. Aspect Oriented Design and Development. Debugging and Profiling Cloud and Distributed Installations: Java Virtual Machine (JVM) memory management, Profiling Java Applications. UNIX/LINUX, CentOS More ❯
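A common exercise for the "Ingesting, Parsing and Analysis of Disparate Data-sources and formats" bullet above is normalizing several formats into one record shape. Here is a hedged sketch using only the Python standard library; the `to_records` name, the format tags, and the flat-dict target shape are illustrative choices, not a prescribed interface.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def to_records(data, fmt):
    # Normalize XML, JSON, or CSV input into a list of plain dicts.
    if fmt == "json":
        return json.loads(data)
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(data)))
    if fmt == "xml":
        # Assumes a flat <rows><row><field>...</field></row></rows> layout.
        root = ET.fromstring(data)
        return [{child.tag: child.text for child in rec} for rec in root]
    raise ValueError(f"unsupported format: {fmt}")

print(to_records('[{"id": "1"}]', "json"))
print(to_records("id\n1", "csv"))
print(to_records("<rows><row><id>1</id></row></rows>", "xml"))
```

All three calls yield `[{'id': '1'}]`; binary formats such as Avro or sequence files would need their own readers (e.g. the `avro` package) but can feed the same record shape.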
II certification (e.g. CompTIA Security+) Experience with cloud service providers (CSPs) (e.g. AWS, Azure) Git/GitLab experience Experience with programming (scripting) languages (e.g. Bash, Python, .NET, or Java) Apache open-source products (e.g. NiFi, Kafka, ZooKeeper) Experience with Virtual Machines and Type 1 hypervisors, e.g. KVM, ESXi Demonstrated experience with Linux (e.g. Red Hat Enterprise Linux More ❯
Agile team environment with little to no supervision. Successful candidates embody a passion for continuous improvement and innovation. Requirements Active TS/SCI with Full Scope Poly. Experienced with Apache NiFi and IC equivalents. General understanding of and ability to work with ZooKeeper. Expert-level experience developing data ETL processes. Able to deploy and maintain NiFi clusters. 5+ years' experience … Python, and Groovy. Experienced using EC2 instances in AWS. 2-5 years of experience working with Ansible, developing playbooks and roles for software-driven deployment of NiFi and ZooKeeper clusters. Understanding of Git and GitLab. 2-5 years' experience using REST APIs to call RESTful services. Familiar with Linux, particularly the newer Rocky Linux variant. Working-level familiarity with More ❯
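The REST-API requirement above typically means scripting against services such as NiFi's own HTTP API. A minimal sketch with only the standard library is below; the base URL assumes a local, unsecured NiFi instance, and the `controllerStatus` field names are as I recall them from the `flow/status` response, so verify both against your NiFi version's REST API documentation before relying on them.

```python
import json
from urllib.request import Request, urlopen

NIFI = "http://localhost:8080/nifi-api"  # assumed local, unsecured instance

def cluster_summary(payload):
    # Pull a couple of useful fields out of a flow/status-style response.
    status = payload["controllerStatus"]
    return {"queued": status["queued"],
            "active_threads": status["activeThreadCount"]}

def fetch_status():
    # GET the flow status from the NiFi REST API and summarize it.
    with urlopen(Request(f"{NIFI}/flow/status")) as resp:
        return cluster_summary(json.load(resp))

# Offline demonstration with a hand-built sample response:
sample = {"controllerStatus": {"queued": "12 / 4.5 KB", "activeThreadCount": 3}}
print(cluster_summary(sample))
```

Separating the HTTP call (`fetch_status`) from the response parsing (`cluster_summary`) keeps the parsing testable without a running cluster, which matters when the same script is driven from Ansible during deployments.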