bachelor's degree. Discretionary Requirements - Cloud Experience: Shall have three (3) years demonstrated work experience with distributed scalable Big Data Stores (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc.; Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig …
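Several of these postings cite the MapReduce programming model. As a minimal, framework-free sketch (the corpus and function names are hypothetical; Hadoop/Hive/Pig distribute these same phases across a cluster), the model can be illustrated in plain Python:

```python
from itertools import groupby
from operator import itemgetter

# Word count, the canonical MapReduce example, run in-process:
# map emits (key, value) pairs, shuffle groups them by key,
# reduce aggregates each key's values.

def map_phase(lines):
    # Emit one ("word", 1) pair per token.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Group pairs by key, as the framework's shuffle/sort step would.
    for key, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield key, [v for _, v in group]

def reduce_phase(grouped):
    # Sum the values for each key to produce the final counts.
    for key, values in grouped:
        yield key, sum(values)

corpus = ["big data store", "big table", "data store"]  # hypothetical input
counts = dict(reduce_phase(shuffle(map_phase(corpus))))
# counts == {"big": 2, "data": 2, "store": 2, "table": 1}
```

In a real Hadoop job the map and reduce functions run as separate tasks and the shuffle is handled by the framework; the in-process pipeline above only shows the data flow.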
Ansible, Terraform, Docker, Kafka, Nexus. Experience with observability platforms: InfluxDB, Prometheus, ELK, Jaeger, Grafana, Nagios, Zabbix. Familiarity with Big Data tools: Hadoop, HDFS, Spark, HBase. Ability to write code in Go, Python, Bash, or Perl for automation. Work Experience: 5-7+ years of proven experience in previous roles …
in either the Sheffield or Birmingham Office. Must Haves: Experience with cloud-based applications, preferably Azure. Proficiency in RDBMS (PostgreSQL, MySQL) and NoSQL (Cassandra, HBase, Elasticsearch). Build, operate, maintain, and support cloud infrastructure and data services. Automate and optimize data engineering pipelines. Utilize big data technologies (Databricks, Spark …
and assessments. Education, Experience and Qualifications: • Demonstrated experience with Hadoop. • Demonstrated experience understanding large distributed data systems, cloud infrastructure, and network architecture (Hadoop, Kafka, HBase, or Presto). • Demonstrated experience with cloud, data management, and development environments: Specifically, AWS, HDFS, Cloudera, and SQL (as well as Spark or Presto …
and foreign stakeholders. Qualifications Required Qualifications: Demonstrated experience with Hadoop. Demonstrated experience understanding large distributed data systems, cloud infrastructure, and network architecture (Hadoop, Kafka, HBase, or Presto). Demonstrated experience with cloud, data management, and development environments: Specifically, AWS, HDFS, Cloudera, and SQL (as well as Spark or Presto …
administering, and operating two or more of the following technologies: SUSE Rancher and Kubernetes clusters, Elasticsearch, Cloudera Private Cloud Platform, Hadoop components (Hadoop, YARN, HBase, Impala, etc.), VMware vSphere virtualization platforms. Knowledge or experience in high-availability solutions, load balancers, relational databases (PostgreSQL), monitoring systems (Nagios), automation, and deployment …
transition of operational systems and applications from traditional platforms to cloud environments, support cloud-related data management tools and open-source (NoSQL) products such as HBase, CloudBase/Accumulo, and Bigtable; convert existing algorithms or develop new algorithms to utilize the MapReduce programming model and technologies such as …
AWS certification or Security+ certification. Relevant IT discipline certifications (e.g., Java, .NET, Hadoop, Spring). Cloud Experience: Familiarity with cloud technologies such as Hadoop, HBase, or MongoDB. Independent and Collaborative Worker: Ability to function effectively both independently and in team settings. About us: We are an experienced advanced analytic …
the last 5 years, a minimum of 3 years' experience with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, Apache Accumulo, and/or Bigtable. Within the last 3 years, a minimum of 1 year's experience with requirements analysis and design for …
Experience maintaining Jenkins CI/CD pipelines with automated testing and deployment. (Preferred) • Knowledge of big data processing and NoSQL databases (MongoDB, Elasticsearch, MapReduce, HBase). (Preferred) • Ability to manage, troubleshoot, and upgrade software, hardware, and networks. (Preferred) • Experience working with Apache NiFi for data flow automation. (Preferred) • Knowledge …
Familiar with a JavaScript framework like Backbone, Angular, Ember, or Vue.js. Familiar with Bootstrap and Less. Experience with NoSQL storage like Elasticsearch, Accumulo, MongoDB, HBase is a plus. Contact us to learn more! Great Medical, Dental & Vision Benefits. 401K Matching. Competitive Salary. Generous Time Off. Short Term and Long Term …
Linux and/or Cloud environments. Preferred Qualifications: Demonstrated experience delivering solutions using Cloud technologies, such as AWS, Microsoft Azure, etc. Experience with Hadoop, HBase, MapReduce. Experience with Elasticsearch. Experience working in a mission environment and/or with many different types of data. Company EEO Statement Accessibility/ …
We use Terraform, Ansible, k8s, and Datadog to manage a range of RHEL/Rocky 9 hosts. Our analytics clusters make use of Spark, HBase, and HDFS. Experience in designing and implementing scalable infrastructure solutions, ideally with some exposure to parallel processing environments used for large-scale analytics. An …
SCI eligibility are strongly preferred. Comfortable working with Linux systems on a daily basis. Experience maintaining data pipelines. Cloud technologies such as: Hadoop, Kafka, HBase, Accumulo. Interest in data mining, analytics, and/or machine learning. Familiarity with Intelligence Community and DoD mission sets. CompTIA Security+ certification or willingness …
Bachelor's degree in Computer Science or related discipline. • Cloud Experience: demonstrated work experience with: o Distributed scalable Big Data Store (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc. o MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. o Hadoop Distributed File System …
frameworks (preferably under a Cloud environment) • Agile Scrum and possibly experience with leading a Scrum team as a Scrum Master or equivalent • Elasticsearch • Postgres, DynamoDB, HBase, MongoDB, Cassandra • jQuery, AngularJS, NodeJS • Master's in Computer Science or related field. Customer Requirements: • Clearance - Ability to obtain and hold a public trust position …
Ashburn, Virginia, United States Hybrid / WFH Options
A1C Partners
Bus frameworks (preferably under a Cloud environment) • Agile Scrum and possibly experience with leading a Scrum team as a Scrum Master or equivalent • PostgreSQL, DynamoDB, HBase, MongoDB, Cassandra • Master's in Computer Science or related field. Customer Requirements: • Clearance - Ability to obtain and hold a public trust position and favorable suitability …
languages such as C, C++. Shall have demonstrated ability to work with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, CloudBase/Accumulo, Bigtable. Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig. Shall have …
and/or Spring, and experience with RESTful APIs. Experience developing and performing ETL tasks in a Linux environment. Preferred Qualifications: Experience with Hadoop, HBase, MapReduce. Experience with Elasticsearch. Experience with NiFi, Kafka, and Zookeeper. Clearance Requirements: An active TS/SCI with Polygraph. Physical Requirements: Use hands to …
development. Active TS/SCI security clearance with a current polygraph is required. Preferred Qualifications: Demonstrated work experience with open-source (NoSQL) products such as HBase/Accumulo, Bigtable, etc. A minimum of six (6) years' demonstrated experience out of the most recent eight (8) years developing production software …
orchestration tools. • Messaging & Streaming: Exposure to RabbitMQ or similar message/streaming broker technologies. • Advanced Technologies: Interest or experience in big data technologies (e.g., HBase, Hadoop), machine learning frameworks (e.g., Spark), and orbit dynamics. Why Join Us? • Innovative Environment: Be part of projects at the cutting edge of space …
experience in system engineering/architecture. Ten (10) years of experience working with products that support highly distributed, massively parallel computation needs such as HBase, Hadoop, Accumulo, Bigtable, Cassandra, Scality, et cetera. At least seven (7) years of experience writing software scripts using scripting languages such as Perl …