Security Control Assessor - Defense Research Analyst - Cyber Planner - Software Development & System Engineering: using programming languages Python/Java and open-source and cloud-based software including the ELK Stack, Hadoop, AWS, and Azure. - Enterprise Architect - Software Developer - Video Game Design and Development - Open Source Developer - Mobile Application Developer - Programming & Coding - Cloud Architect - Security and Network Architect - Data Analytics & Big …
DBMS, ORM (Hibernate), and APIs. Active TS/SCI Clearance with Full Scope Polygraph. Preferred Skills: Microservices, REST, JSON, XML. CI/CD, Docker, Kubernetes, Jenkins. Big Data technologies (Hadoop, Kafka, Spark). CISSP, Java, AWS certifications a plus. Ready to Transform Your Career? Join Trinity Enterprise Services, where professional growth meets personal fulfillment. Apply today to become a …
Columbia, Maryland, United States Hybrid / WFH Options
SRC
of AWS services (EC2, EBS, S3, Lambda) Ability to develop in multiple programming languages such as Python, Bash, or Go Familiarity with Git Experience with big data technologies like Hadoop, Spark, PostgreSQL, or Elasticsearch Experience with containers (EKS), Diode, and CI/CD An active DoD 8140/8570 certification (Security+, etc.) is required for consideration. Clearance Requirements: An active …
developing and performing ETL tasks in Linux and/or Cloud environments. Preferred Qualifications Demonstrated experience delivering solutions using Cloud technologies, such as AWS, Microsoft Azure, etc. Experience with Hadoop, HBase, MapReduce. Experience with Elasticsearch. Experience working in a mission environment and/or with many different types of data. Company EEO Statement Accessibility/Accommodation: If because of …
Herndon, Virginia, United States Hybrid / WFH Options
Vantor
IaaS/PaaS services. Developing and deploying web services. Working with open-source resources in a government computing environment. Big data technologies such as Accumulo, Spark, Hive, Hadoop, Elasticsearch. Strong Linux skills and familiarity with hybrid cloud/on-prem architecture. CompTIA Security+ or comparable certification for privileged user access. Experience as a military/intelligence …
Bachelor's, 6+ years for Master's, degree in Software Engineering, Computer Science, or related field. Proficiency in Python and Scala (or Java). Expertise in Big Data technologies such as Hadoop, Hive, or Spark. Proven ability to productionize Machine Learning models, including feature engineering, scalability, validation, and deployment. Preferred Qualifications: Strong experience with cloud platforms such as AWS, EMR, Kubernetes …
Columbia, Maryland, United States Hybrid / WFH Options
Wyetech, LLC
Excellent oral and written communication skills. Desired Technical Skills Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, Jenkins, etc.) Understanding of agile software development methodologies and use …
Columbia, Maryland, United States Hybrid / WFH Options
Wyetech, LLC
any changes in the schedule. Desired Technical Skills & Certifications Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, Jenkins, etc.) Understanding of agile software development methodologies and use …
San Antonio, Texas, United States Hybrid / WFH Options
Wyetech, LLC
Excellent oral and written communication skills. Desired Technical Skills Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, Jenkins, etc.) Understanding of agile software development methodologies and use …
Kubernetes, Podman, and Helm - Familiar with developing and delivering in a Linux environment. - Experience with relational databases (PostgreSQL, MySQL), document databases (Elasticsearch, Solr), or big data technologies such as Hadoop, Hive, or Spark. - Experience with Git and Git-based workflows Clearance requirement: TS/SCI w/FS Poly REQUIRED for consideration. Location: Chantilly, VA. 100% onsite. Salaries are …
understanding of web services, API, REST. Demonstrated experience working with an Agile Scrum based development team DESIRED SKILLS: Familiarity with the Assessment & Authorization process and associated artifact collection process Familiarity with the Hadoop Distributed File System (HDFS) Background utilizing Jira for documenting and tracking software development tasks WHAT YOU'LL NEED TO SUCCEED: Education: Bachelor's degree in Computer Science, Engineering, or …
Experience with managing structured and unstructured data, and knowledge of data formats like Parquet, Hudi, Iceberg, and Delta Lake Experience integrating Oracle databases with big data platforms such as Hadoop, Spark, or cloud-based data lakes. Familiarity with Oracle SQL, MySQL, PostgreSQL, MongoDB, or Oracle Database Engineering Senior Oracle DBA with experience in Oracle Database Installations and Upgrades (version …
experience integrating DBConnect Experience with Splunk Machine Learning Toolkit (MLTK); solid knowledge of RMF, Trellix ePO, Nessus, SCAP, and vulnerability scanning is highly preferred Expert understanding of data analytics, Hadoop, MapReduce; visualization is a plus; programming experience in PowerShell or Python Experience using the ServiceNow ticketing system, broad operations or development experience. Strong organization, communication, and collaboration skills and be customer …
agents or LLMs PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Our inclusive culture empowers …
and problem diagnosis/resolution One of the following Cloud Developer Certifications are required: AWS Certified Developer - Associate AWS DevOps Engineer Professional Certified Kubernetes Application Developer (CKAD) Desired Qualifications: Hadoop/Cloud Developer Certification Familiarity with: AWS Cloud technologies Quarkus PrimeFaces GitLab CI/CD Agile methods and tools Ability to work in a team environment; good communication and …
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
mindset is essential to adapt to schedule changes. Preferred Requirements Configuration management tools experience a plus (Puppet, Ansible, Chef, etc.) Experience with Big Data technologies is a huge plus (Hadoop, Kafka, Accumulo, Storm, Hortonworks, Cloudera). Experience with Cloud technologies a plus (AWS, Azure) Higher level clearance is desired, but not required. Can consider up to a TS/…
experience - Experience programming in Java, C++, Python, or a related language - Experience with neural deep learning methods and machine learning PREFERRED QUALIFICATIONS - Experience with large scale distributed systems such as Hadoop, Spark, etc. - Experience in building machine learning models for business applications
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large scale distributed systems such as Hadoop, Spark, etc. - Experience in foundation model training and fine-tuning, in-context learning, and AI agents
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large scale distributed systems such as Hadoop, Spark, etc.
technical expertise in a specific engineering area, seeking knowledge from domain specialists when needed Technologies we use: Java, Kotlin, Scala, Dropwizard, Spring, Node.js, React, GraphQL, Docker, Kafka, Cassandra, MongoDB, Hadoop, Qubole, Spark, Datadog, Splunk, AWS cloud Experience and qualifications: Bachelor's or Master's degree in Computer Science, Software Engineering, or related field; or equivalent related professional experience 5+ …
technologies into production systems. • Design and implement scalable cloud-based and on-premises data architectures using platforms like Azure, AWS, or Google Cloud. • Work with big data technologies (e.g., Hadoop, Spark) and data lake architectures to ensure the organization's data can be ingested, processed, and analyzed at scale. • Manage the integration of AI models and algorithms into big … AI/ML technologies (e.g., TensorFlow, PyTorch, scikit-learn, Keras) and understanding of deep learning and natural language processing (NLP). • Strong understanding of big data platforms such as Hadoop, Apache Spark, and data lakes. • Hands-on experience with cloud platforms (Azure, AWS, or Google Cloud) for building scalable data solutions. • Proficiency in data modeling, data warehousing, and ETL …
degree (Master's or Ph.D.) in a relevant technical or analytical field. Experience with FAA systems, aviation industry operations, or NextGen projects. Familiarity with big data technologies such as Hadoop or Spark. Expertise in simulation modeling platforms or advanced analytical tools. Familiarity with Simio, Arena, AnyLogic, or similar simulation platforms for modeling operational impacts and scenario planning. Knowledge of … Hadoop, Spark, or AWS/GCP big data ecosystems to handle and analyze large-scale aviation datasets. Experience applying machine learning algorithms, predictive analytics, or neural networks to aviation or transportation problems. Proficiency with AWS, Azure, or GCP for deploying analytical tools and managing data pipelines. …
into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems. Design and architect solutions with Big Data technologies (e.g. Hadoop, Hive, Spark, Kafka) Design and implement systems that run at scale leveraging containerized deployments Design, build, and scale data pipelines across a variety of source systems and streams (internal … in Computer Science, Computer Engineering, Informatics, Information Systems, or another quantitative field Minimum 5 years of experience in a Data Engineer role Required Skills: Experience with big data tools: Hadoop, Spark, etc. Experience with relational SQL and NoSQL databases, including Postgres Experience with AWS cloud or remote services: EC2, EMR, RDS, Redshift Experience with stream-processing systems: Kafka, Storm …