Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
processing large data sets, with experience in the following required task areas: machine learning, statistical modeling, time-series forecasting, and/or geospatial analytics; experience with Hadoop, Spark, or other parallel storage/computing frameworks is a plus. Experience supporting cyber mission … Overview: Noblis and our wholly owned subsidiaries, Noblis ESI and Noblis MSD, tackle the …
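The time-series forecasting skill named above is easy to make concrete. Below is a minimal Python sketch using statsmodels' ARIMA on a synthetic series; the data, the (1, 1, 1) order, and the 7-step horizon are arbitrary choices for illustration, not anything from the posting.

```python
# Minimal time-series forecasting sketch with statsmodels.
# The series is synthetic and the ARIMA order is an assumption.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

idx = pd.date_range("2024-01-01", periods=120, freq="D")
# Synthetic daily series: linear trend plus Gaussian noise.
y = pd.Series(0.5 * np.arange(120) + np.random.normal(0, 2, 120), index=idx)

model = ARIMA(y, order=(1, 1, 1))  # (p, d, q) picked arbitrarily
result = model.fit()
print(result.forecast(steps=7))    # forecast the next 7 days
```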
Statistics, Mathematics, Engineering, or CS field; CPA, CGFM, or CDFM certification. Nice If You Have: Experience with modern relational databases, including MySQL or PostgreSQL, and Big Data systems, including Hadoop, HDFS, Hive, or Cloudera. Experience providing recommendations via dashboards, visualizations, or reports using BI platforms such as QlikSense or Tableau. Ability to manipulate and integrate databases with languages such as …
Bethesda, Maryland, United States Hybrid / WFH Options
Base-2 Solutions, LLC
Experience blending analytical methodologies and leveraging existing COTS/GOTS/OS tools in unconventional ways. Familiarity with virtualization and distributed file systems, such as virtual environments and Hadoop (or similar distributed file systems), in development and deployment environments. Experience with Amazon Web Services (AWS/C2S). Familiarity with hardware platforms, e.g., CPUs, GPUs, and FPGAs. Familiarity …
position will work alongside Black Lotus Labs' advanced security researchers, data engineers, malware reverse engineers, data scientists, and our customers to tackle evolving threats, accelerated by technologies like our Hadoop ecosystem (HBase, HDFS, Spark, Kafka, Airflow), Elasticsearch and Redis clusters, Docker using Docker Swarm, a malware environment, and a network of honeypots. This is a close-knit, experienced, amazingly smart …
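As a small, hedged illustration of the Redis clusters this posting mentions, here is a Python sketch using the redis-py client to keep a bounded list of recent indicators; the host, key name, and values are hypothetical.

```python
# Minimal Redis sketch with redis-py: keep the 100 most recent
# indicators in a capped list. Host, key, and values are made up.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
r.lpush("recent_iocs", "198.51.100.7")  # hypothetical indicator
r.ltrim("recent_iocs", 0, 99)           # cap the list at 100 entries
print(r.lrange("recent_iocs", 0, 9))    # peek at the 10 newest
```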
Bethesda, Maryland, United States Hybrid / WFH Options
Absolute Business Solutions Corp
or image processing; Experience blending analytical methodologies and leveraging existing COTS/GOTS/OS tools in unconventional ways; Familiarity with virtualization and distributed file systems, such as Hadoop (or similar distributed file systems), in development and deployment environments; Familiarity with git, svn, JIRA, or other version control and issue-tracking technologies; Experience with Amazon Web Services (AWS/C2S); Familiarity …
/Kubernetes. Familiarity with Git for software version control. Experience with Atlassian tools (Jira, Confluence). What we'd like you to have: Knowledge of map-reduce analytic environments (e.g., Hadoop). Experience with cloud-based deployment environments (e.g., AWS). Experience prototyping web applications (JavaScript). Knowledge of end-to-end SIGINT collection and analysis systems. Experience with production …
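For readers new to the map-reduce model that Hadoop-style environments implement, here is a toy Python sketch of the idea: a mapper emits (key, value) pairs and a reducer aggregates them per key. No cluster is involved; this only illustrates the programming model, with invented documents.

```python
# Toy map-reduce word count in pure Python, illustrating the model
# that Hadoop implements at scale. Documents are invented examples.
from collections import defaultdict
from itertools import chain

docs = ["signal data feed", "data feed archive", "data data data"]

def mapper(doc):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    return [(word, 1) for word in doc.split()]

def reducer(pairs):
    # Sum counts per key, as a Hadoop reducer would.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

print(reducer(chain.from_iterable(mapper(d) for d in docs)))
```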
need for data governance and metadata management skills. Ability to communicate effectively with diverse stakeholders, both technical and non-technical, across different seniority levels. Desired Skills & Certifications: Experience with Hadoop technologies and integrating them into ETL (Extract, Transform, Load) data pipelines. Familiarity with Apache Tika for metadata and text extraction from various document types. Experience as a Data Layer …
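The Apache Tika skill above has a convenient Python entry point in the tika-python bindings. Below is a minimal sketch; the file path is hypothetical, and the bindings launch a bundled Tika server, so a Java runtime must be available.

```python
# Minimal metadata/text extraction sketch via tika-python.
# "example.pdf" is a hypothetical file; Java must be installed.
from tika import parser

parsed = parser.from_file("example.pdf")
print(parsed["metadata"])                 # e.g. Content-Type, dates
print((parsed["content"] or "")[:500])    # first 500 chars of text
```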
page design in JavaScript, Angular, or Python Shiny. Experience using Apache NiFi to perform data flow processing. Experience using Pig scripts to develop analytics against cloud data stored in the Hadoop Distributed File System (HDFS). Knowledge of network protocols such as SNMP or BGP. Ability to work in an ambiguous, fast-paced, highly collaborative, team-oriented environment and balance …
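The posting names Pig scripts for analytics over HDFS; as a rough present-day analogue, here is a PySpark sketch of the same kind of grouped count against an HDFS path. The path and column name are hypothetical.

```python
# PySpark sketch of a simple analytic over data in HDFS, analogous
# to a Pig GROUP/COUNT. Path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hdfs-analytic").getOrCreate()

df = spark.read.csv("hdfs:///data/events.csv", header=True, inferSchema=True)
(df.groupBy("event_type")              # hypothetical column
   .agg(F.count("*").alias("n"))
   .orderBy(F.desc("n"))
   .show())
```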
Columbia, Maryland, United States Hybrid / WFH Options
Codescratch LLC
tools, including deployment pipelines. Understanding of Agile software development methodologies and use of standard software development tool suites. Preferred Skills and Experience: Experience with Docker and Kubernetes. Experience with Hadoop. Experience with Spark. Experience with Accumulo. Experience monitoring application performance with metrics (Prometheus, InfluxDB, Grafana) and logs with the ELK Stack (Elasticsearch, Logstash, Kibana). Experience with asynchronous messaging systems (RabbitMQ, …
Deployment, Testing, and Monitoring practices. Java Skills: Expertise in Java for distributed systems, with a solid understanding of networking and multi-threading. Big Data Technologies: Experience working with Apache Hadoop, Accumulo, and NiFi. Agile Mindset: A strong believer in agile development methodologies. Linux & Scripting: Solid Linux fundamentals with experience in scripting languages like Python, Ruby, or Perl. Source Control …
Stevenage, Hertfordshire, South East, United Kingdom
Anson Mccade
Neo4j, InfluxDB). Hands-on experience with data processing and integration tools, including ETL, ESB, and APIs. Proficiency in Python or similar programming languages. Exposure to big data technologies (Hadoop or similar frameworks). Familiarity with Generative AI, NLP, or OCR applications is highly desirable. Previous experience in the industrial or defence sector is advantageous. Salary & Working Model …
is familiar with one or more of the following software/tools: Experience with monitoring technologies like ELK, Prometheus, and Grafana. Experience building APIs and services using REST, Hadoop, MapReduce, Spark, etc. Experience with build automation technologies like Maven and Jenkins. Experience with Linux (preferred) or Windows operating systems. DESIRED QUALIFICATIONS: Experience with the design and development …
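Since "building APIs and services using REST" appears above, here is a minimal Flask sketch of such a service; the route, resource name, and in-memory store are invented for illustration.

```python
# Minimal REST service sketch in Flask. The /metrics resource and
# its payload shape are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
METRICS = {}  # in-memory stand-in for a real backend

@app.route("/metrics/<name>", methods=["GET"])
def get_metric(name):
    return jsonify({"name": name, "value": METRICS.get(name)})

@app.route("/metrics/<name>", methods=["PUT"])
def put_metric(name):
    METRICS[name] = request.get_json().get("value")
    return jsonify({"name": name, "value": METRICS[name]}), 201

if __name__ == "__main__":
    app.run(port=8080)
```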
to adapt to schedule changes as needed. Preferred Requirements: Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g., Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g., GitLab CI, Travis CI, Jenkins, etc.). Understanding of agile software development methodologies and use …
four-year university. An active TS/SCI with polygraph. You could also have this: Experience using the Atlassian tool suite. Experience developing with any of the following: Hadoop, Pig, MapReduce, or HDFS. Working knowledge of other object-oriented programming languages such as Java or C++. Working knowledge of front-end data visualization libraries (e.g., D3.js, Raphael.js, etc.) …
/8570 compliance certifications (e.g., Security+). Preferred Requirements: Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g., Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g., GitLab CI, Travis CI, Jenkins, etc.). Understanding of agile software development methodologies and use of …
required in this position as directed by the customer. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g., Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or a vendor flavor of Kubernetes). Experience with CI/CD pipelines (e.g., GitLab CI, Travis CI, etc.). Understanding …
Python (or equivalent). Experience using ML libraries, such as scikit-learn. Experience using data visualization tools. Preferred Skills: Experience working with GPUs to develop models. Experience with MapReduce programming (Hadoop). Skills with programming languages such as Java or C/C++. Demonstrated ability to develop experimental and analytic plans for data modeling processes, use of strong baselines, ability to …
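The "strong baselines" language above is worth making concrete: a model should beat a trivial predictor before it is taken seriously. Here is a minimal scikit-learn sketch comparing a logistic regression against a majority-class baseline on a bundled toy dataset, not real program data.

```python
# Minimal scikit-learn sketch: compare a model against a trivial
# baseline. The iris dataset is a stand-in, not real mission data.
from sklearn.datasets import load_iris
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("baseline accuracy:", baseline.score(X_te, y_te))
print("model accuracy:   ", model.score(X_te, y_te))
```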
et cetera. One Cloud Developer certification is required: AWS Certified Developer-Associate, AWS DevOps Engineer Professional, Certified Kubernetes Application Developer (CKAD), or Elastic Certified Engineer. Desired Qualifications: Hadoop/Cloud Developer certification. Familiarity with AWS Cloud technologies, Quarkus, PrimeFaces, GitLab CI/CD, and Agile methods and tools. Ability to work in a team environment; good communication and …
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
to accommodate any changes in the schedule. Preferred Requirements: Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g., Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g., GitLab CI, Travis CI, Jenkins, etc.). Understanding of agile software development methodologies and use …
Columbia, Maryland, United States Hybrid / WFH Options
Wyetech, LLC
Certifications: A TS/SCI-level security clearance preferred. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g., Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or a vendor flavor of Kubernetes). Experience with CI/CD pipelines (e.g., GitLab CI, Travis CI, etc.). Understanding …
attention to detail, ensuring accuracy in documentation and data • Can handle working on multiple projects at once with minimal guidance Desired Requirements: • Familiarity administering and/or using Apache, Spark, Hadoop, Solr, Elastic, or Cloudera software • Experience giving briefings on their work and writing technical reports that involve a formal review process • Self-motivated, creative problem solver. Hoplite Solutions …
in multiple programming languages such as Bash, Python, or Go. Must have a DoD 8140/8570 compliance certification (e.g., Security+). Preferred: Experience with big data technologies like Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers and Kubernetes …
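Of the stores listed above, MongoDB is the quickest to demonstrate. Here is a minimal pymongo sketch; the connection string, database, collection, and document fields are all hypothetical.

```python
# Minimal MongoDB sketch with pymongo. Host, database, collection,
# and document shape are invented for illustration.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
events = client["demo"]["events"]

events.insert_one({"src": "10.0.0.1", "kind": "scan"})
for doc in events.find({"kind": "scan"}).limit(5):
    print(doc)
```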
San Antonio, Texas, United States Hybrid / WFH Options
SRC
Python, Bash, or Go. Must have a DoD 8140/8570 compliance certification (e.g., Security+). You will wow us even more if you have: Experience with big data technologies such as Hadoop and Spark. Experience with databases such as MongoDB. Experience with containers and Kubernetes. US citizenship is required, and candidates must have an active TS/SCI or higher-level government …