or GitHub. Experience scaling data engineering across distributed computing clusters, including Apache Spark, NiFi, Dask, Airflow, or Luigi. Experience with SQL and NoSQL database technologies such as Elasticsearch, Solr, HBase, Accumulo, Cassandra, Weaviate, ChromaDB, Pinecone, DuckDB, Neo4j, AWS DynamoDB, Redshift, Aurora, Oracle, PostgreSQL, MSSQL, MySQL, or MongoDB. AWS or Google certification such as Data Analytics, Machine Learning Engineer, or More ❯
using parallel computing frameworks (e.g. Deeplearning4j, Torch, TensorFlow, Caffe, Neon, NVIDIA CUDA Deep Neural Network library (cuDNN), and OpenCV) and distributed data processing frameworks (e.g. Hadoop (including HDFS, HBase, Hive, Impala, Giraph, Sqoop) and Spark (including MLlib, GraphX, SQL, and DataFrames)). Execute data science methods using common programming/scripting languages: Python, Java, Scala, R (statistics). Prior More ❯
learning libraries/packages (sklearn, TensorFlow, PyTorch, statsmodels, etc.). Experience in multiple tools/languages/frameworks within the Big Data & cloud ecosystem (Hadoop, MongoDB, Neo4j, Spark, Hive, HBase, Cassandra, etc.). Demonstrated experience managing and mentoring teams of data scientists, ML engineers, and data engineers on the execution of specific business use cases for AI/ More ❯
Preferred Qualifications: Experience with visualization tools and techniques (e.g., Periscope, Business Objects, D3, ggplot, Tableau, SAS Visual Analytics, Power BI). Experience with big data technologies (e.g., Hadoop, Hive, HDFS, HBase, MapReduce, Spark, Kafka, Sqoop). Master's degree in mathematics, statistics, computer science/engineering, or other related technical fields with equivalent practical experience. Experience constructing and executing queries More ❯
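Several listings above ask for experience constructing and executing queries against large datasets. As a minimal, self-contained illustration of that skill, the sketch below builds and runs a parameterized aggregate query using Python's built-in sqlite3 module; the `events` table and its columns are invented for the example, and real work would target the warehouses these listings name (Hive, Redshift, PostgreSQL, etc.).

```python
import sqlite3

# In-memory database as a stand-in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, action TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "purchase", 9.99), (1, "refund", -9.99), (2, "purchase", 25.00)],
)

# Parameterized aggregate query: total spend per user for one action type.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events"
    " WHERE action = ? GROUP BY user_id ORDER BY user_id",
    ("purchase",),
).fetchall()
print(rows)  # [(1, 9.99), (2, 25.0)]
```

The parameter placeholder (`?`) rather than string concatenation is the part that generalizes to any SQL engine: it avoids injection and lets the engine cache the query plan.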
benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.) Proficiency in software development More ❯
C/C++, Scala, Groovy, Python, and/or shell scripting JavaScript development experience with Angular, React, ExtJS and/or Node.js Experience with distributed computing technologies including Hadoop, HBase, Cassandra, Elasticsearch and Apache Spark a plus Hands-on experience working with Elasticsearch, MongoDB, Node.js, Hadoop, MapReduce, Spark, RabbitMQ, and NiFi. DevOps experience building and More ❯
Reduce programming model and technologies such as Hadoop, Hive, Pig, Hadoop Distributed File System (HDFS), etc. Proficiency and work experience with distributed scalable Big Data Store (NoSQL) such as HBase, Accumulo, BigTable, etc. Proficiency and work experience with serialization data interchange formats such as JSON and/or BSON. Proficiency and work experience in the design and development More ❯
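Several of these listings reference the MapReduce programming model (Hadoop, Hive, Pig) and JSON interchange. As an illustration of what that model amounts to, here is a minimal pure-Python word-count sketch; a production Hadoop job would implement the same two functions in Java (or via Hadoop Streaming) and the framework, not user code, would perform the shuffle. All names below are illustrative.

```python
import json
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Emit (key, value) pairs for one input record."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Group all values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Combine all values for one key into a single result."""
    return key, sum(values)

lines = ["the quick brown fox", "the lazy dog"]
pairs = chain.from_iterable(map_phase(line) for line in lines)
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(json.dumps(counts, sort_keys=True))
```

Because `map_phase` and `reduce_phase` are pure functions over independent keys, the framework can run them on thousands of nodes without coordination beyond the shuffle, which is the point of the model.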
work alongside Black Lotus Labs' advanced security researchers, data engineers, malware reverse engineers, data scientists, and our customers to tackle evolving threats, accelerated by technologies like our Hadoop ecosystem (HBase, HDFS, Spark, Kafka, Airflow), Elasticsearch and Redis clusters, Docker using Docker Swarm, a malware environment, and a network of honeypots. This is a close-knit, experienced, amazingly smart team that More ❯
on NGINX, Apache, or similar Familiar with a JavaScript framework like Backbone, Angular, Ember or Vue.js Familiar with Bootstrap and Less Experience with NoSQL storage like Elasticsearch, Accumulo, MongoDB, or HBase is a plus. CONTACT: to learn MORE! Great Medical, Dental & Vision Benefits 401K Matching Competitive Salary Generous Time Off Short Term and Long Term Disability Life Insurance More ❯
Linux environments, including process and network diagnostics. Exposure to messaging, monitoring, and container orchestration (Kafka, RabbitMQ, Kubernetes, Docker, etc.). Solid knowledge of SQL/NoSQL databases (MongoDB, Cassandra, HBase, etc.). Desirable Skills and Experience Scala functional or concurrency libraries (ZIO, Akka Streams, Monix). Knowledge of C++ or Rust for performance-critical systems. Experience with CQRS architecture More ❯
and performing ETL tasks in Linux and/or Cloud environments. Preferred Qualifications Demonstrated experience delivering solutions using Cloud technologies, such as AWS, Microsoft Azure, etc. Experience with Hadoop, HBase, MapReduce. Experience with Elasticsearch. Experience working in a mission environment and/or with many different types of data. Company EEO Statement Accessibility/Accommodation: If because of a More ❯
may be substituted for a bachelor's degree. Discretionary Requirements - Cloud Experience: Shall have three (3) years demonstrated work experience with distributed scalable Big Data Store (NoSQL) such as HBase, CloudBase/Accumulo, BigTable, etc.; Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; Shall have demonstrated work More ❯
Java and other high-level languages such as C, C++ Shall have demonstrated ability to work with OpenSource (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, CloudBase/Accumulo, BigTable. Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig. Shall have demonstrated work experience with the More ❯
as JDK, J2EE, EJB, JDBC, and/or Spring, and experience with RESTful APIs Experience developing and performing ETL tasks in a Linux environment Preferred Qualifications: Experience with Hadoop, HBase, MapReduce Experience with Elasticsearch Experience with NiFi, Kafka, and ZooKeeper Clearance Requirements: An active TS/SCI with Polygraph Physical Requirements: Use hands to operate a computer and other More ❯
complex problems. Nice to Haves: AWS certification or Security+ certification. Relevant IT discipline certifications (e.g., Java, .NET, Hadoop, Spring). Cloud Experience: Familiarity with cloud technologies such as Hadoop, HBase, or MongoDB. Independent and Collaborative Worker: Ability to function effectively both independently and in team settings. About us: We are an experienced advanced analytic development company providing Cyber solutions More ❯
Strong communication skills to translate stakeholder requirements into system use-cases. Experience with visualization tools (e.g., Tableau, D3, ggplot). Experience utilizing multiple big data technologies: Hadoop, Hive, HDFS, HBase, MapReduce, Spark, Kafka, Sqoop. Experience with SQL, Spark, ETL. Experience with extracting, cleaning, and transforming large transactional datasets to build predictive models and generate supporting documentation. TS/SCI More ❯
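The listing above mentions extracting, cleaning, and transforming transactional datasets. A minimal stdlib sketch of that extract-clean-transform pass, assuming an invented record layout (`txn_id`, `amount`, `ts`); at the scale these listings describe, the same three steps would run on Spark or a Hadoop cluster rather than in-process.

```python
import csv
import io
from datetime import datetime

# Extract: invented CSV input standing in for a transactional feed.
raw = io.StringIO(
    "txn_id,amount,ts\n"
    "1, 10.50 ,2024-01-05\n"
    "2,,2024-01-06\n"      # missing amount: dropped during cleaning
    "3,7.25,2024-01-07\n"
)

def extract(fh):
    return list(csv.DictReader(fh))

def clean(rows):
    out = []
    for r in rows:
        amt = r["amount"].strip()
        if not amt:
            continue  # drop incomplete records
        out.append({"txn_id": int(r["txn_id"]),
                    "amount": float(amt),
                    "ts": datetime.strptime(r["ts"], "%Y-%m-%d").date()})
    return out

def transform(rows):
    # Aggregate: total amount across the cleaned records.
    return round(sum(r["amount"] for r in rows), 2)

cleaned = clean(extract(raw))
print(len(cleaned), transform(cleaned))  # 2 17.75
```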
Ashburn, Virginia, United States Hybrid / WFH Options
A1C Partners
Load Balancing and Enterprise Service Bus frameworks (preferably under a Cloud environment) • Agile Scrum, and possibly experience leading a Scrum team as a Scrum Master or equivalent • PostgreSQL, DynamoDB, HBase, MongoDB, Cassandra • Master's in Computer Science or related field Customer Requirements • Clearance - Ability to obtain and hold a public trust position and favorable suitability based on a CBP Background More ❯
Ashburn, Virginia, United States Hybrid / WFH Options
TAIG (Tactical Analytic & Intelligence Group)
Load Balancing and Enterprise Service Bus frameworks (preferably under a Cloud environment) • Agile Scrum, and possibly experience leading a Scrum team as a Scrum Master or equivalent • PostgreSQL, DynamoDB, HBase, MongoDB, Cassandra • Good understanding of DevSecOps practices and tools • Master's in Computer Science or related field Customer Requirements • Clearance - Ability to obtain and hold a public trust position and More ❯
cloud hosting environments; Support the transition of operational systems and applications from traditional platforms to cloud environments, support cloud-related data management tools and OpenSource (NoSQL) products such as HBase, CloudBase/Accumulo, and BigTable; Convert existing algorithms or develop new algorithms to utilize the MapReduce programming model and technologies such as Hadoop, Hive, and Pig; Support More ❯
and modifying existing software to correct defects, adapt to new hardware, or improve performance, ensuring integration with Hadoop Distributed File System (HDFS) environments and distributed Big Data stores (e.g., HBase, CloudBase/Accumulo, BigTable). Must have the ability to develop simple data queries and complex database or data repository interfaces for existing and proposed databases, utilizing serialization More ❯
be substituted for a bachelor's degree. Active TS/SCI security clearance with a current polygraph is required.Preferred Qualifications: Demonstrated experience with Open Source (NoSQL) products such as HBase, Accumulo, BigTable, etc. At least six (6) years of experience within the most recent eight (8) years developing production software in Solaris or Linux environments. Six (6) years of More ❯
similar degree will be considered as a technical field. Shall have demonstrated ability to work with OpenSource (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, Accumulo, BigTable, etc. One Cloud Developer Certification is required: o AWS Certified Developer - Associate o AWS Certified DevOps Engineer - Professional o Certified Kubernetes Application Developer (CKAD) o Elastic More ❯