real-time systems, and business management information systems) based upon documented requirements for the Data Transport System (DTS) • DTS products include but are not limited to: cloud storage areas: Apache Accumulo (Apache ZooKeeper, Apache Hadoop), Oracle DBMS; real-time streaming: Storm; distributed in-memory data cache/storage: Redis; graph compute engine/query interface: Apache TinkerPop …/Gremlin; rules engine: JBoss Drools, Apache Camel; Spring Framework: used extensively to standardize/simplify configuration, logic control, data access, security, the Web tier, etc. Candidates will: o Analyze user requirements to derive software design and performance requirements o Debug existing software and correct defects o Design and code new software or modify existing software to add new features … DoD) and other federal partners • The DTS portfolio encompasses transport streams, messages, and files with content sizes ranging from bytes to terabytes • Candidates should have experience writing analytics using Apache Hadoop, HDFS, and MapReduce • Experience processing large data sets or high-volume data ingest is a plus • Experience monitoring, maintaining, and troubleshooting Apache Accumulo and Apache Hadoop …
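The listing above asks for experience writing analytics with Hadoop, HDFS, and MapReduce. As a rough sketch of the MapReduce pattern itself — plain Python standing in for an actual Hadoop job; the function names and sample records are illustrative, not from the posting:

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) pairs, as a word-count mapper would
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Group values by key; Hadoop performs this between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the grouped counts, as a reducer would
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "big ingest"])))
# counts == {"big": 2, "data": 1, "ingest": 1}
```

In a real Hadoop deployment the map and reduce phases run as distributed tasks over HDFS blocks; the three-phase structure is the same.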
languages, frameworks, and tools to support a broad range of applications Work with database technologies such as PostgreSQL, Redis, MySQL, and others Aid algorithm and data pipeline development in Apache NiFi Work Environment: Location: Colorado Springs, CO - 100% Onsite Travel Requirements: Minimal Working Hours: Standard Qualifications: Required: Security Clearance: Must have an active U.S. government Top Secret/SCI … which is something only a U.S. citizen can obtain Education: Bachelor's Degree in computer science, information systems, or a related field Advanced proficiency in SQL and NoSQL databases Experience with Apache Accumulo and Apache Hadoop Experience with Python Experience with Docker, AWS, and/or Azure Hands-on experience with Apache Kafka and Apache NiFi Experience developing data … as XML and Protocol Buffers (protobuf) Ability to work independently to research and solve customer pain points Desired: Master's Degree in Computer Science or related field Experience with Apache Artemis, ActiveMQ, or other IoT message brokers Working knowledge of MIL-STD-6016 and MIL-STD-3011 High level of curiosity and investigative mindset with an attention to detail
as Java, C, C++ for distributed systems, with proficiency in networking, multi-threading, and implementation of REST APIs. Experience with the Spring framework, messaging frameworks (Kafka, RabbitMQ), streaming analytics (Apache Flink, Spark), and management of containerized applications (Kubernetes). Experience with enabling tools (Git, Maven, Jira), DevOps (Bamboo, Jenkins, GitLab CI/Pipelines), and Continuous Monitoring (ELK Stack (Elasticsearch, Logstash, and Kibana), Nagios). Experience with Apache Hadoop, Apache Accumulo, and Apache NiFi. Well-grounded in Linux fundamentals and familiarity with scripting languages (e.g., Python, Ruby, Perl, Bash, etc.). Experience with AWS Cloud architecture, Infrastructure as Code (IaC), Cloud security, and Automation (AWS Lambda, CloudFormation). Benefits: Peraton offers enhanced benefits to employees working on this critical …
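The streaming-analytics experience named above (Flink, Spark) centers on windowed aggregation over event streams. A minimal sketch of a tumbling-window count in plain Python — the function name, event shape, and window size are hypothetical, used only to illustrate the technique, not any particular framework API:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    # Assign each (timestamp, key) event to a fixed-size, non-overlapping
    # window and count occurrences per (window_start, key), mimicking a
    # tumbling-window aggregation in a stream processor.
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "a"), (3, "a"), (7, "b"), (12, "a")]
result = tumbling_window_counts(events, 5)
# windows of 5s: [0,5) -> a:2 ; [5,10) -> b:1 ; [10,15) -> a:1
```

Real stream processors add what this sketch omits: incremental state, watermarks for late data, and parallel execution across partitions.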
Software Engineer 3 to work, as part of the Compute Team, within a massively parallel enterprise platform, built with Java on Free and Open-Source Software products including Kubernetes, Hadoop, and Accumulo, to enable execution of data-intensive analytics on a managed infrastructure. The selected candidate will be a self-motivated Java developer who proactively completes tasks with a strong attention … and will be exposed to a variety of technologies depending on customer requirements. Required Skills: • Java programming for distributed systems, with experience in networking and multi-threading • Apache Hadoop • Apache Accumulo • Apache NiFi • Agile development experience • Well-grounded in Linux fundamentals and knowledge in at least one scripting language (e.g. …
Trip to either Nassau Bahamas, Singer Island Florida, Paradise Island Bahamas, or the Cambridge Hyatt Resort Desired Skills: • Proficient in Java • Comfortable working in a Linux environment • Experience with open-source Apache Hadoop, Apache Accumulo, and Apache NiFi • Familiarity with context chaining and graph theory • Experience with containerization - Docker, Kubernetes • Experience with enabling …
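The "context chaining and graph theory" skill above usually means finding the chain of relationships linking two entities in a graph. A minimal sketch using breadth-first search over an adjacency list — the graph contents and function name are hypothetical examples, not part of the posting:

```python
from collections import deque

def chain(graph, start, target):
    # Breadth-first search over an adjacency list; returns the shortest
    # chain of nodes linking start to target, or None if unreachable.
    seen = {start: None}          # node -> predecessor on shortest path
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == target:
            path = []
            while node is not None:  # walk predecessors back to start
                path.append(node)
                node = seen[node]
            return path[::-1]
        for nbr in graph.get(node, []):
            if nbr not in seen:
                seen[nbr] = node
                queue.append(nbr)
    return None

g = {"alice": ["bob"], "bob": ["carol"], "carol": []}
# chain(g, "alice", "carol") -> ["alice", "bob", "carol"]
```

Graph databases and query layers such as TinkerPop/Gremlin express the same traversal declaratively; BFS is the underlying idea.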
We'd Like to See: Languages/Scripting: JavaScript Application deployment: Ansible OGC Web Services: WMS, WMTS, WCS, WFS Frameworks: AI/ML Data: PySpark, Elasticsearch, Kibana Other Skills: Apache NiFi, Apache Accumulo, high-performance computing clusters Certifications: Sec+ Who we are: Reinventing Geospatial, Inc. (RGi) is a fast-paced small business that has the environment and …
Boot Experience with processing Big Data Demonstrated experience with system design and architecture Experience with Web development, HTTP, and REST services Experience with NoSQL technologies such as Elasticsearch and Accumulo Experience with CI/CD principles, concepts, best practices, and tools such as Jenkins and GitLab CI Position Desired Skills: Experience with the Atlassian tool suite (Jira, Confluence) Experience …
Pig is highly desired • Experience with Data Science • Experience with graph algorithms • Experience with Machine Learning • Experience with AWS • Cloud development experience such as Hadoop, Big Data (CloudBase/Accumulo and Big Table) as well as JSON/BSON • Experience with analytic development • Experience with Python and streaming capabilities • Experience with Software Configuration Management tools such as JIRA, Git …
US citizenship and an active Secret clearance; will also accept TS/SCI or TS/SCI with CI Polygraph Desired Experience: Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Work could possibly require some on-call work. The Swift Group and Subsidiaries are an Equal Opportunity …
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
during the interview process. Place of Performance: either Columbia, MD or San Antonio, TX. Both positions would involve hybrid work. Desired Skills (Optional): Experience with NoSQL databases such as Accumulo desired CI Poly preferred Experience developing with Kubernetes environments Prior experience supporting cyber and/or network security operations within a large enterprise, as either an analyst, engineer, architect …
the position Computer Science (CS) degree or related field Experience with Java, Python, C, and Query Time Analytics (QTA) Customer GHOSTMACHINE analytic development Experience with Hadoop (MapReduce and Accumulo) Experience with Linux Experience with GEOINT Desired Skills: Familiarity with JIRA and Confluence. Understanding of customer analytical tools. Compensation Range: $198,094.13 - $223,094.13. Compensation ranges encompass a total …
Role Looking for a Cloud System Administrator to work in a cloud platform environment, built with Java on Free and Open-Source Software products including Kubernetes, Hadoop, and Accumulo, to enable the execution of data-intensive analytics on a managed infrastructure. This position is on the Operations Team that ensures day-to-day operations stability and provides customer support …
The Role Looking for a Reliability Engineer to work in a cloud platform environment, built with Java on Free and Open-Source Software products including Kubernetes, Hadoop, and Accumulo, to enable the execution of data-intensive analytics on a managed infrastructure. This position is on the Operations Team that ensures day-to-day operations stability and provides customer support …
and DevSecOps practices. Proficient in Postgres design/optimization, distributed processing (REST APIs, microservices, IaaS/PaaS), and developing/deploying web services. Experience with big data technologies, including Accumulo, Spark, Hive, Hadoop, and Elasticsearch. Experience working with open-source resources in government computing environments; military/intelligence analyst experience or familiarity with IC PED systems is a plus.
San Antonio, Texas, United States Hybrid / WFH Options
Wyetech, LLC
Understanding of Agile software development methodologies and use of standard software development tool suites Desired Technical Skills: Security+ certification is highly desired. Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Work could possibly require …
Integration, Deployment, Testing, and Monitoring practices. Java Skills: Expertise in Java for distributed systems, with a solid understanding of networking and multi-threading. Big Data Technologies: Experience working with Apache Hadoop, Accumulo, and NiFi. Agile Mindset: A strong believer in agile development methodologies. Linux & Scripting: Solid Linux fundamentals with experience in scripting languages like Python, Ruby, or Perl.
Polygraph Clearance Required Qualifications Experience building distributed systems. Experience performing application, network, and infrastructure monitoring and analysis. Familiarity with open-source tools such as Istio, Keycloak, Nginx, Prometheus, Grafana, Accumulo, and Elasticsearch. Experience with administering Kubernetes clusters, including deploying and configuring operators and Helm charts. Experience with one or more of the following programming languages: Go, Java, JavaScript, Kotlin …
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
and ability to build interactive, insightful dashboards for monitoring ML models. Place of Performance: hybrid work in San Antonio, TX. Desired Skills (Optional): Experience with NoSQL databases such as Accumulo is a plus Familiarity with deploying models as APIs or within containerized environments (e.g., Docker, Kubernetes) to serve and monitor models in production Experience with Large Language Models …
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
Flexibility is key to accommodate any schedule changes per the customer and team in place. Preferred Requirements: Security+ certification is highly desired. Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Work could possibly require …
the schedule. Must be willing/able to help open/close the workspace during regular business hours as needed. Preferred Requirements: Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Compensation: At IAMUS Consulting …
software to correct defects, adapt to new hardware, or improve performance, ensuring integration with Hadoop Distributed File System (HDFS) environments and distributed Big Data stores (e.g., HBase, CloudBase/Accumulo, Big Table). Must have the ability to develop simple data queries and complex database or data repository interfaces for existing and proposed databases, utilizing serialization formats such as …
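A data-repository interface of the kind described above typically serializes records on the way into storage and deserializes them on the way out. A minimal round-trip sketch using the Python standard library's `json` module — the record shape and function names are hypothetical, chosen only to illustrate the pattern:

```python
import json

def serialize(record):
    # Encode a record dict into a compact JSON byte string for storage
    return json.dumps(record, separators=(",", ":")).encode("utf-8")

def deserialize(blob):
    # Decode stored bytes back into a record dict
    return json.loads(blob.decode("utf-8"))

record = {"id": 42, "source": "hdfs", "tags": ["ingest"]}
assert deserialize(serialize(record)) == record  # lossless round trip
```

Binary formats such as Protocol Buffers follow the same encode/decode contract but add a schema, smaller payloads, and cross-language typed access.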