management information systems) based upon documented requirements for the Data Transport System (DTS) • DTS products include but are not limited to: Cloud storage areas: Apache Accumulo (Apache ZooKeeper, Apache Hadoop), Oracle DBMS. Real-time streaming: Storm. Distributed in-memory data cache/storage: Redis. Graph compute engine/query interface: Apache TinkerPop/Gremlin. Rules engine: JBoss Drools, Apache Camel. Spring Framework: used extensively to standardize/simplify configuration, logic control, data access, security, the Web tier, etc. Candidates will: o Analyze user requirements to derive software design and performance requirements o Debug existing software and correct … The DTS portfolio encompasses transport streams, messages, and files with content sizes ranging from bytes to terabytes • Candidates should have experience writing analytics using Apache Hadoop, HDFS, and MapReduce • Experience processing large data sets or high-volume data ingest is a plus • Experience monitoring, maintaining, and troubleshooting Apache…
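Apache Accumulo, named throughout these listings, stores data as sorted key-value pairs whose keys combine a row ID, column family, and column qualifier. The sketch below illustrates that sorted-key model using only the JDK — a TreeMap stands in for a table, and the class and method names are invented for the example; this is not the real Accumulo client API.

```java
import java.util.SortedMap;
import java.util.TreeMap;

// Illustrative sketch only: a TreeMap stands in for an Accumulo table,
// whose entries stay sorted by (row, column family, column qualifier).
public class SortedKvSketch {

    // Key format "row\0family\0qualifier" mimics Accumulo's lexicographic
    // key ordering (real keys also carry visibility labels and timestamps).
    static String key(String row, String family, String qualifier) {
        return row + '\u0000' + family + '\u0000' + qualifier;
    }

    // Half-open range scan over one row, analogous to an Accumulo Scanner
    // restricted to a single-row Range.
    static SortedMap<String, String> rowScan(TreeMap<String, String> table, String row) {
        return table.subMap(row + '\u0000', row + '\u0001');
    }

    public static void main(String[] args) {
        TreeMap<String, String> table = new TreeMap<>();
        table.put(key("user2", "info", "name"), "Bob");
        table.put(key("user1", "info", "name"), "Alice");
        table.put(key("user1", "info", "email"), "alice@example.com");
        // Scanning row "user1" returns only its two entries, in sorted order.
        rowScan(table, "user1").forEach((k, v) ->
                System.out.println(k.replace('\u0000', '/') + " -> " + v));
    }
}
```

The point of the model is that single-row and range reads become cheap ordered-map operations — the same property that lets Accumulo serve range scans efficiently from sorted tablets.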
Qualifications: Willingness to be a committer/contributor to open-source applications. Java programming for distributed systems, with experience in networking and multi-threading. Apache Hadoop. Apache Accumulo. Apache NiFi. Agile development experience. Well-grounded in Linux fundamentals, with knowledge of at least one scripting language…
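Several of these postings pair "Java for distributed systems" with multi-threading. As a minimal, self-contained illustration of the kind of concurrency idiom involved (the class and method names are invented for the example), here is a fan-out/fan-in pattern on a fixed thread pool:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Minimal fan-out/fan-in sketch: a fixed thread pool runs independent
// tasks concurrently, and the caller aggregates their results.
public class FanOutSketch {

    static int parallelSumOfSquares(int n) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int i = 1; i <= n; i++) {
                final int x = i;                       // effectively final for the lambda
                futures.add(pool.submit(() -> x * x)); // each task squares its input
            }
            int sum = 0;
            for (Future<Integer> f : futures) {
                sum += f.get();                        // blocks until that task finishes
            }
            return sum;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parallelSumOfSquares(10)); // prints 385
    }
}
```

The same submit/collect shape generalizes to network calls: swap the squaring lambda for an RPC and the `Future.get` loop for completion-ordered collection.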
speed disk, high-density multi-CPU architectures, and extreme memory footprints. You will be working with the latest technologies, including Java, Kafka, Kubernetes, Docker, Apache Accumulo, Apache Spark, Spring, Apache NiFi, and more! We have multiple opportunities to build systems with capabilities that include machine learning…
last 5 years, a minimum of 3 years' experience with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, Apache Accumulo, and/or Big Table. Within the last 3 years, a minimum of 1 year's experience with requirements analysis and design for … Demonstrated experience designing and developing automated analytic software, techniques, and algorithms. Demonstrated experience with compute-cluster monitoring tools, e.g., Nagios, Ganglia. Demonstrated experience with Apache Accumulo internals, including configuring, tuning, and testing the same in many configurations (>200-node clusters). Demonstrated experience with big-data cloud scalability…
distributed systems, with proficiency in networking, multi-threading, and implementation of REST APIs. Experience with the Spring Framework, messaging frameworks (Kafka, RabbitMQ), streaming analytics (Apache Flink, Spark), and management of containerized applications (Kubernetes). Experience with enabling tools (Git, Maven, Jira), DevOps (Bamboo, Jenkins, GitLab CI/Pipelines), and continuous monitoring … (ELK Stack (Elasticsearch, Logstash, and Kibana), Nagios). Experience with Apache Hadoop, Apache Accumulo, and Apache NiFi. Well-grounded in Linux fundamentals, with familiarity with scripting languages (e.g., Python, Ruby, Perl, Bash). Experience with AWS Cloud architecture, Infrastructure as Code (IaC), cloud security, and automation (AWS…
amongst the Compute Team, within a massively parallel enterprise platform, built with Java on Free and Open-Source Software products including Kubernetes, Hadoop, and Accumulo, to enable execution of data-intensive analytics on a managed infrastructure. The selected candidate will be a self-motivated Java developer who proactively completes … a variety of technologies depending on customer requirements. Required skills: • Java programming for distributed systems, with experience in networking and multi-threading • Apache Hadoop • Apache Accumulo • Apache NiFi • Agile development experience • Well-grounded in Linux fundamentals and knowledge in at…
Singer Island, Florida; Paradise Island, Bahamas; or the Cambridge Hyatt Resort. Desired skills: • Proficient in Java • Comfortable working in a Linux environment • Experience with open-source Apache Hadoop, Apache Accumulo, and Apache NiFi • Familiarity with context chaining and graph theory • Experience with containerization…
GitLab, or BitBucket. Experience with drafting and maintenance of technical documentation. Experience with RDBMS (e.g., Oracle, Postgres, MySQL) and non-SQL DB (e.g., HBase, Accumulo, MongoDB, Neo4j) methodologies. Familiarity with cloud technologies such as Azure, AWS, or GCP. Experience working in a classified environment on operational networks such as…
with data science • Experience with graph algorithms • Experience with machine learning • Experience with AWS • Cloud development experience such as Hadoop, Big Data (CloudBase/Accumulo and Big Table) as well as JSON/BSON • Experience with analytic development • Experience with Python and streaming capabilities • Experience with software configuration management…
a minimum of three (3) years' experience with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, CloudBase/Accumulo, and/or Big Table. Demonstrated experience developing RESTful services. What we'd like you to have: Demonstrated experience with the Ruby on Rails framework…
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
deploying web services; working with open-source resources in a government computing environment; maintaining backend GIS technologies; ICD 503; big-data technologies such as Accumulo, Spark, Hive, Hadoop, or Elasticsearch. Familiarity with: hybrid cloud/on-prem architecture, AWS, C2S, and OpenStack; concepts such as data visualization…
cycle. 1 year minimum working with back-end software development, including hands-on experience in one or more of the following: Java, PostgreSQL, MongoDB, Accumulo, Go, Python, Elasticsearch. 1 year minimum working with front-end software development, including hands-on experience in one or more of the following…
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
of performance: either Columbia, MD, or San Antonio, TX; both positions involve hybrid work. Desired skills (optional): Experience with NoSQL databases such as Accumulo desired. CI poly preferred. Experience developing with Kubernetes environments. Prior experience supporting cyber and/or network security operations within a large enterprise, as…
deploy, and document big-data cloud computing workflows. TS/SCI clearance with polygraph. Experience with Java and Pig. Experience with Hadoop (MapReduce, Accumulo, and the Hadoop Distributed File System (HDFS)). Experience with Linux. What you need to have: Bachelor's degree and 5 to 8 years of experience; Master's…
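The Hadoop requirement above (MapReduce, HDFS) can be illustrated without a cluster by the map/shuffle/reduce data flow itself. The sketch below models that flow with JDK streams only — it demonstrates the programming pattern, not Hadoop's actual `Mapper`/`Reducer` API, and the class name is invented for the example.

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Word count expressed as the MapReduce pattern: map each input line to
// words (the per-record map phase), group by word (the "shuffle"), then
// reduce each group by counting.
public class WordCountSketch {

    public static Map<String, Long> wordCount(String... lines) {
        return Arrays.stream(lines)
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+"))) // map phase
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));   // shuffle + reduce
    }

    public static void main(String[] args) {
        // Each word maps to its occurrence count across all input lines.
        System.out.println(wordCount("to be or not to be"));
    }
}
```

In real Hadoop the same three phases run distributed: mappers emit (word, 1) pairs from HDFS splits, the framework shuffles pairs by key to reducers, and each reducer sums its group.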
Hanover, Maryland, United States Hybrid / WFH Options
Metronome LLC
office, along with some work on customer site at Aberdeen Proving Ground, MD; flexibility is key. Desired skills: Experience with NoSQL databases such as Accumulo desired, OR experience querying and optimizing SQL databases. Prior experience supporting cyber and/or network security operations within a large enterprise, as either…
extraction, transformation, and reporting. Experience with data-warehousing concepts and platforms, such as Snowflake and Amazon Redshift, and with databases such as Postgres, Solr, Accumulo, or Iceberg. Experience integrating structured and unstructured data from various sources such as APIs, databases, or flat files, and with web services and communication…
building distributed systems. Experience performing application, network, and infrastructure monitoring and analysis. Familiarity with open-source tools such as Istio, Keycloak, Nginx, Prometheus, Grafana, Accumulo, and Elasticsearch. Experience administering Kubernetes clusters, including deploying and configuring operators and Helm charts. Experience with one or more of the following programming…
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
insightful dashboards for monitoring ML models. Place of performance: hybrid work in San Antonio, TX. Desired skills (optional): Experience with NoSQL databases such as Accumulo is a plus. Familiarity with deploying models as APIs or within containerized environments (e.g., Docker, Kubernetes) to serve and monitor models in production…
platforms and services (e.g., AWS, Azure, GCP). Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes). Experience with Python, Java, Kafka topics, Accumulo, JavaScript. Experience with ArcGIS services a plus. Experience with presentation-layer optimization. Experience with map servers, base maps, and mapping tools. Experience with product information management (PIM…
platforms and services (e.g., AWS, Azure, GCP). Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes). Experience with Python, Java, Kafka topics, Accumulo, JavaScript. Experience with ArcGIS services a plus. Experience with presentation-layer optimization. Experience with map servers, base maps, and mapping tools. Product development lifecycle and track management…
TS/SCI security clearance with a current polygraph is required. Preferred qualifications: Demonstrated work experience with open-source (NoSQL) products such as HBase/Accumulo, Big Table, etc. A minimum of six (6) years' demonstrated experience out of the most recent eight (8) years developing production software for Solaris…