Fairfax, Virginia, United States Hybrid / WFH Options
CGI
parental leave, learning opportunities and tuition assistance, and wellness and well-being programs. Due to the nature of this government contract, US Citizenship is required. Skills: Adobe Spark, Hadoop Ecosystem (HDFS). What you can expect from us: Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your …
years of experience applying machine learning techniques and the key parameters that affect their performance; 2+ years of experience with Big Data programming technologies, including the Hadoop Distributed File System (HDFS), Apache Spark, or Apache Flink; 2+ years of experience working with a wide range of predictive and decision models and tools for developing models; experience in natural language processing topics …
experience. Preferred Qualifications: Bachelor's degree in related field preferred; Windows 7/10, MS Project, Apache Airflow, Python, Java, JavaScript, React, Flask, HTML, CSS, SQL, R, Docker, Kubernetes, HDFS, Postgres, Linux, AutoCAD, JIRA, GitLab, Confluence. About Us: IntelliBridge delivers IT strategy, cloud, cybersecurity, application, data and analytics, enterprise IT, intelligence analysis, and mission operation support services to accelerate technical …
related field preferred; Active TS/SCI required. Preferred Qualifications: Windows 7/10, MS Project, Apache Airflow, Python, Java, JavaScript, React, Flask, HTML, CSS, SQL, R, Docker, Kubernetes, HDFS, Postgres, Linux, AutoCAD, JIRA, GitLab, Confluence. Also looking for a Senior Developer at a higher compensation …
have demonstrated work experience with the Map Reduce programming model and technologies such as Hadoop, Hive, Pig, etc.; Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
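Several of these postings ask for experience with the MapReduce programming model. As a rough illustration only (plain Python, no Hadoop cluster involved), the map, shuffle, and reduce phases of the classic word-count example can be sketched like this:

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit (key, value) pairs -- here, (word, 1) for each word.
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine each key's values into a final result per key.
    return {key: sum(values) for key, values in groups.items()}

docs = ["the quick brown fox", "the lazy dog"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
```

In Hadoop the same three phases run distributed across HDFS blocks; this single-machine sketch only mirrors the shape of the model.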
Spring Boot, ExtJS, AngularJS, Ansible, Swagger, Git, Subversion, Maven, Jenkins, Gradle, Nexus, Eclipse, IntelliJ, jQuery, and D3. Cloud technologies: Pig, Hive, Apache Spark, Azure Databricks, Storm, HBase, Hadoop Distributed File System, MapReduce, open-source virtual machines, and cloud-based … This position is contingent on funding and may not be filled immediately; however, this position is representative of positions …
Description: Spark (must have), Scala (must have), Hive/SQL (must have). Job Description: Scala/Spark • Good Big Data resource with the below skill set: § Spark § Scala § Hive/HDFS/HQL • Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.) • Experience in Big Data technologies; real-time data processing platform (Spark Streaming) experience would be an advantage. • Consistently demonstrates …
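The Spark/Scala skill set above centers on chaining lazy transformations over partitioned data. A rough single-machine analogue using plain Python generators (no Spark involved; the data and names are purely illustrative) looks like:

```python
# Single-machine sketch of a Spark-style pipeline using lazy generators.
# Each step mirrors a typical RDD/Dataset transformation; nothing runs
# until the final aggregation forces evaluation, as in Spark.
events = ["INFO start", "ERROR disk full", "INFO done", "ERROR timeout"]

errors = (line for line in events if line.startswith("ERROR"))  # like .filter
messages = (line.split(" ", 1)[1] for line in errors)           # like .map
error_count = sum(1 for _ in messages)                          # like .count
```

In real Spark the same filter/map/count chain is distributed across executors and the source would be HDFS or a stream rather than an in-memory list.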
active TS/SCI with polygraph. You could also have this: Experience using the Atlassian Tool Suite. Experience with development of any of the following: Hadoop, Pig, MapReduce, or HDFS. Working knowledge of other object-oriented programming languages such as Java or C++. Working knowledge of front-end data visualization libraries (e.g., D3.js, Raphael.js). Salary Range …
features. Building the required features as a member of an agile feature team. Helping maintain code quality via code reviews. Skill Requirements: Proficiency in administering Big Data technologies (Hadoop, HDFS, Spark, Hive, YARN, Oozie, Kafka, HBase, Apache stack). Proficiency in defining highly scalable platform architecture. Knowledge of architectural design patterns and highly optimized, low-latency, massively scalable platforms. Hybrid …
Table. Shall have demonstrated work experience with the Map Reduce programming model and technologies such as Hadoop, Hive, Pig. Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS). Shall have demonstrated work experience with serialization such as JSON and/or BSON. Shall have demonstrated work experience developing RESTful services. Shall have at least three (3) years' experience …
evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution. You will work on a software development program providing software development engineering strategies for environments using the Hadoop Distributed File System (HDFS), Map Reduce, and other related cloud technologies. You will provide set-up, configuration, and software installation for development, test, and production systems. Interface directly with development team as well as
… Big Table; Convert existing algorithms or develop new algorithms to utilize the Map Reduce programming model and technologies such as Hadoop, Hive, and Pig; Support operational systems utilizing the HDFS; Support the deployment of operational systems and applications in a cloud environment; Conduct scalability assessments of Cloud-related algorithms, applications, and systems to identify performance bottlenecks and areas for improvement …
other federal partners • The DTS portfolio encompasses transport streams, messages, and files with content sizes ranging from bytes to terabytes • Candidates should have experience writing analytics using Apache Hadoop, HDFS, and MapReduce • Experience processing large data sets or high-volume data ingest is a plus • Experience monitoring, maintaining, and troubleshooting Apache Accumulo, Apache Hadoop, and Apache Zookeeper deployments is required.
Experience supporting IC or DoD in the Cyber Security Domain. Familiarity with the RMF process. Experience with a Relational Database Management System (RDBMS). Experience with Apache Hadoop and the Hadoop Distributed File System. Experience with Amazon Elastic MapReduce (EMR) and SageMaker. Experience with Machine Learning or Artificial Intelligence. Travel. Security Clearance: Top Secret/SCI/CI Poly.
with Oracle Data Guard. Familiar with Ansible. Familiar with Oracle Label Security and Oracle Identity Management. Basic understanding of UNIX system administration. Familiar with other RDBMS and NoSQL platforms (e.g., HDFS, Postgres, MongoDB, AllegroGraph, RDF, and/or SPARQL). Familiar with Jira and GitLab. Full-Time Employee Compensation: M9 Solutions' pay range for this position is a general guideline …
operating systems. 3+ years demonstrated experience with administration of Linux-based environments. Full-stack development experience in Python. Kubernetes or Nomad orchestration. Experience with data-driven tools like Hadoop, HDFS, ElasticSearch, YARN. Experience with the cyber security accreditation process (RMF, ATO process, NIST 800-53 cyber security controls). Experience with CI/CD tools like SonarQube, Fortify, or similar. Experience …
Store (NoSQL) such as HBase, CloudBase/Accumulo, Big Table, etc. o Map Reduce programming model and technologies such as Hadoop, Hive, Pig, etc. o Hadoop Distributed File System (HDFS) o Serialization such as JSON and/or BSON • 4 years of SWE experience may be substituted for a bachelor's degree. • TS/SCI Clearance Required. Salary between …
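Several listings pair the key-value-store stack with JSON and/or BSON serialization. A minimal JSON round-trip with the Python standard library is shown below (BSON needs a third-party package such as the `bson` module shipped with pymongo, so only the JSON side is sketched; the record fields are made up for illustration):

```python
import json

# Hypothetical record of the kind stored in a key-value store like HBase.
record = {"row_key": "user#42", "visits": 17, "tags": ["hdfs", "hbase"]}

encoded = json.dumps(record)   # serialize to a JSON string for transport/storage
decoded = json.loads(encoded)  # parse the string back into Python objects

assert decoded == record       # round-trip preserves the structure
```

BSON follows the same encode/decode pattern but produces a binary, type-richer payload, which is why the two formats are often listed together.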
API, REST. Demonstrated experience working with an Agile Scrum-based development team. DESIRED SKILLS: Familiarity with the Assessment & Authorization process and associated artifact collection process. Familiarity with the Hadoop Distributed File System (HDFS). Background utilizing Jira for documenting and tracking software development tasks. WHAT YOU'LL NEED TO SUCCEED: Education: Bachelor's degree in Computer Science, Engineering, or a related technical discipline, or …
storage, retrieval, backup, and retention strategies. Experience with clustered or cloud storage deployments. Strong cross-team collaboration and user interaction skills. Expertise in secure, compliant solution planning. Knowledge of HDFS, Hadoop, HBase/Accumulo, and Big Table internals. Join Peraton and play a key role in securing mission-critical systems with cutting-edge solutions. Are you ready to make an impact …
compliance • Collaborate within cross-functional Integrated Product Teams (IPTs) to drive system integration and ensure mission success • Research and implement distributed storage, routing, and querying algorithms, leveraging technologies like HDFS, Hadoop, HBase, and Accumulo (BigTable) Desired Qualifications: • Strong background in network systems engineering, with a clear understanding of data routing, security, and optimization • Experience in the integration of COTS and …