Excellent oral and written communication skills. Understanding of Agile software development methodologies and use of standard software development tool suites. Desired Technical Skills Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. The Benefits Package More ❯
hands-on experience in programming and software development using Java, JavaScript, or Python. Demonstrated hands-on experience working with PostgreSQL and Apache Cassandra. Demonstrated hands-on experience working with Hadoop, Apache Spark, and their related ecosystems. Salary Range: $175,000-$200,000 Equal Opportunity Employer/Individuals with Disabilities/Protected Veterans More ❯
What You'll Bring • 6 to 10 years' IT Architecture experience working in a software development, technical project management, digital delivery, or technology consulting environment • Platform implementation experience (Apache Hadoop, Kafka, Storm and Spark, Elasticsearch, and others) • Experience around data integration & migration, data governance, data mining, data visualisation, database modelling in an agile delivery-based environment • Experience with at More ❯
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Python (pandas, NumPy, SciPy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in various field offices throughout More ❯
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Python (pandas, NumPy, SciPy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in various field offices throughout More ❯
AWS data platforms and their respective data services. Solid understanding of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies (e.g., Spark, Hadoop) and distributed computing. Proficiency in SQL and at least one programming language (e.g., Python, Java) 6 Month Contract Inside IR35 Immediately available London up to 2 times a month More ❯
City of London, London, South Bank, United Kingdom
Experis
Familiarity with and experience of using UNIX Knowledge of CI toolsets Good client-facing skills and problem-solving aptitude DevOps knowledge of SQL, Oracle DB, Postgres, ActiveMQ, Zabbix, Ambari, Hadoop, Jira, Confluence, BitBucket, ActiviBPM, Oracle SOA, Azure, SQLServer, IIS, AWS, Grafana, Oracle BPM, Jenkins, Puppet, CI, and other cloud technologies. Your Skills: Excellent attention to detail and ability to More ❯
PyTorch, scikit-learn). Experience with cloud-based AI platforms (e.g., AWS SageMaker, Google Cloud AI Platform, Azure Machine Learning). Experience with data management and processing tools (e.g., Hadoop, Spark, SQL). Proficiency in programming languages such as Python and Java. Experience with DevOps practices and tools (e.g., CI/CD, containerization). Desired Qualifications: Experience with machine More ❯
indexing, search platforms, GPU workloads, and distributed storage (e.g., Cloudera). Experience developing algorithms with R, Python, SQL, or NoSQL. Knowledge of distributed data and computing tools such as Hadoop, Hive, Spark, MapReduce, or EMR. Hands-on experience with visualization tools like Plotly, Seaborn, or ggplot2. Security+ certification. More ❯
Columbia, Maryland, United States Hybrid / WFH Options
SilverEdge
have a DoD 8140/8570 compliance certification (i.e. Security+ certification) Must have a Computing Environment certification (AWS, Linux, Kubernetes, etc.) Desired Qualifications Experience with big data technologies such as Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers and Kubernetes is a plus About SilverEdge SilverEdge Government Solutions was founded on the belief that nurturing More ❯
Columbia, Maryland, United States Hybrid / WFH Options
Quantech Services, Inc
have some ability to work from home from time to time. Flexibility is essential to accommodate any changes in the schedule. Preferred Requirements: Experience with big data technologies such as Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus More ❯
and tools: RDF, OWL, SHACL, Protégé. Experience with graph databases and query languages: Neo4j, JanusGraph, SPARQL, Cypher. Hands-on experience with ETL processes and distributed data/computing tools (Hadoop, Spark, Hive, NiFi). Strong skills in data visualization (Tableau, Power BI, Matplotlib, Seaborn, ggplot). Excellent written and verbal communication for coordination across Army, joint, and interagency teams. More ❯
deploying Qlik applications on the Advana platform Experience in the development of algorithms leveraging R, Python, SQL, or NoSQL Experience with distributed data and computing tools, including MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL Experience with cloud applications such as AWS or Databricks Experience with visualization packages, including Plotly, Seaborn, or ggplot2 Clearance: Applicants selected will More ❯
San Antonio, Texas, United States Hybrid / WFH Options
Polaris Consulting Group, Inc
with occasional opportunities to work from home. Flexibility is essential to accommodate schedule changes based on customer and team needs. Preferred Requirements Experience with big data technologies such as Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers and Kubernetes is a plus The role may require some on-call work. Why Join Polaris? Mission More ❯
customer site in Aberdeen Proving Ground, MD 5 days a week. Flexibility is essential to adapt to schedule changes if needed. Preferred Requirements Experience with big data technologies such as Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus Work could possibly require some on-call More ❯
CD, Ansible, Terraform, CloudFormation Experience with AWS cloud services: e.g. EC2, RDS, Redshift Even Better: Experience working with large datasets and familiarity with big data infrastructure, such as AWS, Hadoop, Spark, Dask, or MapReduce Experience with data pipelines, data warehouses, data lakes, and relational databases Experience working in a large cross-functional team Working knowledge of DISA STIGs, vulnerability More ❯
Aberdeen Proving Ground, Maryland, United States Hybrid / WFH Options
Polaris Consulting Group, Inc
average 2-3 days per week in Aberdeen Proving Ground, MD. Flexibility is essential to adapt to schedule changes if needed. Preferred Requirements Experience with big data technologies such as Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus Work could possibly require some on-call More ❯
On average 1-2 days per week in Columbia, MD. Flexibility is key to accommodate any schedule changes per the customer. Preferred Requirements Experience with big data technologies such as Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers and Kubernetes is a plus The role may require some on-call work. Why Join Polaris? Mission More ❯
database technologies (PostgreSQL, MySQL, RDS) US citizenship and an active TS/SCI with Full Scope Polygraph security clearance required Desired Experience: Experience with distributed databases and streaming tools (Hadoop, Spark, Yarn, Hive, Trino) Experience with Remote Desktop Protocol (RDP) technologies Experience with data access control, specifically Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) Familiarity More ❯
Arlington, Virginia, United States Hybrid / WFH Options
Elder Research, Inc
skills, both verbal and written. Preferred Experience and Skills: Expertise with Qlik. Experience with machine learning, statistical modeling, time-series forecasting, and/or geospatial analytics. Experience with Hadoop, Spark, or other parallel storage/computing processes. Why apply to this position at Elder Research? Competitive Salary and Benefits Important Work/Make a Difference: supporting Customs and More ❯
search, GPU workloads, and distributed storage, including Cloudera Experience in the development of algorithms leveraging R, Python, SQL, or NoSQL Experience with distributed data or computing tools, including MapReduce, Hadoop, Hive, EMR, Spark, Gurobi, or MySQL Experience with visualization packages, including Plotly, Seaborn, or ggplot2 Security+ Certification Clearance: Applicants selected will be subject to a security investigation and may More ❯
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
contributing within the Risk Management Framework (RMF) process Proficiency in system design and meticulous documentation Experience in streaming and/or batch analytics (e.g. Kafka, Spark, Flink, Storm, MapReduce, Hadoop) Experience in distributed databases, NoSQL databases, full text-search engines (e.g. Elasticsearch, MongoDB, Solr) Experience in designing enterprise APIs. Experience in RESTful web services Experience in Microservices architecture Experience More ❯