as Python, Java, or Scala for data processing. Strong knowledge of SQL and relational databases (e.g., MySQL, PostgreSQL, MS SQL Server). Experience with NoSQL databases (e.g., MongoDB, Cassandra, HBase). Familiarity with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Hands-on experience with ETL frameworks and tools (e.g., Apache NiFi, Talend, Informatica, Airflow). Knowledge …
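Several of the listings on this page name Apache Airflow for ETL orchestration. As a rough illustration only (the DAG ID, schedule, and task functions below are invented, not taken from any posting), a minimal Airflow 2.x pipeline looks like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Hypothetical extract step: a real pipeline would pull rows
    # from a source system such as MySQL or an external API.
    return [{"id": 1, "value": 42}]


def load(**context):
    # Fetch the extracted rows from XCom and hand them to a sink.
    rows = context["ti"].xcom_pull(task_ids="extract")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="example_etl",              # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```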
Head of Data & Analytics Architecture and AI — Chiswick Park, full time, posted 30+ days ago, job requisition ID JR19765. Want to help us bring …
architectures, Lambda-type architectures - Proficiency in writing and optimizing SQL - Knowledge of AWS services including S3, Redshift, EMR, Kinesis, and RDS - Experience with open-source data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) - Ability to write code in Python, Ruby, Scala, or other languages used by Big Data platforms - Knowledge of professional software engineering practices & best practices for the full …
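To put the Spark/SQL/AWS combination above in concrete terms, here is a minimal PySpark sketch. The bucket, path, and column names are invented for illustration, and reading `s3a://` paths assumes the hadoop-aws connector is on the classpath:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-aggregation").getOrCreate()

# Hypothetical dataset layout: Parquet event data in an S3 bucket.
events = spark.read.parquet("s3a://example-bucket/events/")  # invented path

# Register a temp view so the aggregation can be written in plain SQL.
events.createOrReplaceTempView("events")

daily_counts = spark.sql("""
    SELECT event_date, event_type, COUNT(*) AS n
    FROM events
    GROUP BY event_date, event_type
""")

# Write results back to S3, overwriting any previous run's output.
daily_counts.write.mode("overwrite").parquet("s3a://example-bucket/daily_counts/")
```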
SKILLS AND QUALIFICATIONS: Experience with Linux-based systems. Database experience with PostgreSQL or other relational databases. Experience with Docker, Kubernetes, Helm, or other containerization tools. Familiarity with Kafka, Hadoop, HBase, or cloud-based big data solutions. Understanding of geospatial data, data fusion, and machine learning. Experience supporting Intelligence Community and DoD mission sets. CompTIA Security+ certification (or willingness to …
using parallel computing frameworks (e.g. Deeplearning4j, Torch, TensorFlow, Caffe, Neon, NVIDIA CUDA Deep Neural Network library (cuDNN), and OpenCV) and distributed data processing frameworks (e.g. Hadoop (including HDFS, HBase, Hive, Impala, Giraph, Sqoop) and Spark (including MLlib, GraphX, SQL, and DataFrames)). Execute data science methods using common programming/scripting languages: Python, Java, Scala, R (statistics). Prior …
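As a small taste of the Spark MLlib portion of that stack (the dataset and column names below are made up), a logistic-regression fit via the pyspark.ml API:

```python
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.feature import VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mllib-example").getOrCreate()

# Tiny in-memory dataset standing in for real feature data.
df = spark.createDataFrame(
    [(0.0, 1.0, 0), (1.0, 0.0, 1), (0.5, 0.5, 1), (0.1, 0.9, 0)],
    ["f1", "f2", "label"],
)

# MLlib estimators expect a single vector column of features.
assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
train = assembler.transform(df)

model = LogisticRegression(featuresCol="features", labelCol="label").fit(train)
model.transform(train).select("label", "prediction").show()
```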
and deployment) Linux and Windows operating systems: security, configuration, and management. Database design, setup, and administration (DBA) experience with Sybase, Oracle, or UDB. Big data systems: Hadoop, Snowflake, NoSQL, HBase, HDFS, MapReduce. Web and Mobile technologies, digital workflow tools. Site reliability engineering and runtime operational tools (agent-based technologies) and processes (capacity, change and incident management, job/batch …
languages, such as Java, Scala, Python, and Golang. Experience with database technologies such as PostgreSQL. Experience with cloud technologies such as Hadoop, Kafka, HBase, and Accumulo. Experience with the full software development lifecycle. PREFERRED SKILLS AND QUALIFICATIONS: CI/CD pipelines and tooling (GitLab CI/CD, ArgoCD, CircleCI, Jenkins). Experience with sensor technologies …
management tools including Git via GitHub, GitLab, or Bitbucket. Experience with drafting and maintaining technical documentation. Experience with RDBMS (e.g., Oracle, Postgres, MySQL) and NoSQL database (e.g., HBase, Accumulo, MongoDB, Neo4j) methodologies. Familiarity with cloud technologies such as Azure, AWS, or GCP. Experience working in a classified environment on operational networks such as SIPR or JWICS. Desired …
tools: Hibernate, Spring Boot, ExtJS, AngularJS, Ansible, Swagger, Git, Subversion, Maven, Jenkins, Gradle, Nexus, Eclipse, IntelliJ, jQuery, and D3. Cloud technologies: Pig, Hive, Apache Spark, Azure Databricks, Storm, HBase, Hadoop Distributed File System (HDFS), and MapReduce. Open-source virtual machines and Cloud-based … This position is contingent on funding and may not be filled immediately. However, this position is …
may be substituted for a bachelor's degree. Discretionary Requirements - Cloud Experience: Shall have three (3) years of demonstrated work experience with a distributed scalable Big Data store (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc.; shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; shall have demonstrated work …
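The "MapReduce programming model" that this and several other listings cite boils down to two user-supplied functions. A self-contained Python sketch in the Hadoop Streaming spirit, with a tiny local driver standing in for the framework's shuffle phase:

```python
from collections import defaultdict


def mapper(line):
    # Map: emit a (word, 1) pair for every word in the input line.
    for word in line.split():
        yield word.lower(), 1


def reducer(word, counts):
    # Reduce: sum all counts the shuffle grouped under one key.
    return word, sum(counts)


def run(lines):
    # Local stand-in for Hadoop's shuffle/sort: group mapper output by key.
    groups = defaultdict(list)
    for line in lines:
        for key, value in mapper(line):
            groups[key].append(value)
    return dict(reducer(k, v) for k, v in groups.items())


print(run(["to be or not to be"]))  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

On a real cluster the same mapper and reducer would run as separate processes over HDFS blocks; the shape of the two functions is the whole contract.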
Monitoring utilities, disaster recovery processes/tools. Experience in troubleshooting and problem resolution. Experience in system integration. Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming. Experience with ETL tools incorporating Big Data. Shell scripting, Python. Beneficial Skills: Understanding of LAN, WAN, VPN, and SD networks; hardware and cabling set-up experience …
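For the Kafka-plus-Spark-Streaming pairing named above, here is a minimal sketch using Spark's newer Structured Streaming API rather than the original DStream API. The broker address and topic are placeholders, and the job assumes the spark-sql-kafka package is on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

# Subscribe to a (hypothetical) topic; Kafka rows arrive as binary key/value.
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder address
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

messages = stream.select(col("value").cast("string").alias("body"))

# Print each micro-batch to the console; a real job would aggregate first.
query = messages.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```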
Kubernetes. Experience with tools like Ansible, Terraform, Docker, Kafka, Nexus. Experience with observability platforms: InfluxDB, Prometheus, ELK, Jaeger, Grafana, Nagios, Zabbix. Familiarity with Big Data tools: Hadoop, HDFS, Spark, HBase. Ability to write code in Go, Python, Bash, or Perl for automation. Work Experience: 5-7+ years of proven experience in previous roles or one of the following …
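Of the observability platforms named above, Prometheus is straightforward to instrument from Python. A minimal sketch using the official prometheus-client library; the metric names and port are arbitrary choices, not from the posting:

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server  # pip install prometheus-client

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_seconds", "Request latency in seconds")

start_http_server(8000)  # Prometheus scrapes http://localhost:8000/metrics

while True:  # stand-in for a long-running service loop
    with LATENCY.time():               # record how long the "work" takes
        time.sleep(random.random() / 10)
    REQUESTS.inc()
```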
of experience in data engineering or related work. -Proficiency in Java, AWS, Python, Apache Spark, Linux, Git, Maven, and Docker. -Experience maintaining an Apache Hadoop Ecosystem using tools like HBase, MapReduce, and Spark. -Knowledge of ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. -Experience with AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS, SQS, SNS …
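Of the AWS services that listing enumerates, SQS is the easiest to show briefly. A hedged boto3 sketch; the queue URL and region are placeholders, and credentials are assumed to come from the standard AWS configuration chain:

```python
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")  # region is an assumption
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"  # placeholder

# Producer side: enqueue one message.
sqs.send_message(QueueUrl=queue_url, MessageBody='{"event": "ping"}')

# Consumer side: long-poll, process, then delete to acknowledge.
resp = sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=10, MaxNumberOfMessages=1)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```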
orchestration for the data ingest pipeline to perform API service development and updates. Shall use the following technologies: Relational Data Stores (e.g., Oracle 21c), NiFi, Kafka, Elastic MapReduce (EMR), HBase, Elastic, Splunk, Java, Python, and Spring to instrument and update the Data Catalog for data metrics, using Splunk and MySQL. REQUIRED QUALIFICATIONS Requires an active Top Secret/SCI … and management of migration efforts. Experience using and applying Splunk. Experience developing and deploying code in a cloud-based environment. Experience using Java, Python, Spring, Kafka, Elastic, and EMR HBase. DESIRED QUALIFICATIONS Experience using Jira, Elasticsearch, NiFi, MySQL, Oracle, Kubernetes or another container solution, or Apache Spark. Overview This role requires an active Top Secret/SCI + …
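The NiFi/Kafka ingest described there ultimately publishes to topics like the hypothetical one below; a minimal producer sketch using the kafka-python client (broker address and topic name are invented):

```python
import json

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="broker:9092",                    # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode(),  # JSON-encode payloads
)

# Send one catalog-style record to a hypothetical ingest topic.
producer.send("ingest.metrics", {"dataset": "example", "rows": 1024})
producer.flush()  # block until the broker acknowledges
```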
on NGINX, Apache, or similar. Familiar with a JavaScript framework like Backbone, Angular, Ember, or Vue.js. Familiar with Bootstrap and Less. Experience with NoSQL storage like Elasticsearch, Accumulo, MongoDB, or HBase is a plus. CONTACT: to learn MORE! Great Medical, Dental & Vision Benefits; 401K Matching; Competitive Salary; Generous Time Off; Short-Term and Long-Term Disability; Life Insurance …
provide standard interfaces for analytics on large security-related datasets, and lead innovation by implementing data-centric technologies. You'll work with Big Data technologies such as Hadoop/HBase, Cassandra, and Bigtable while managing ETL processes and performing system administration and performance tuning. This role is contingent on contract award. What we are looking for: Associate: Bachelor's … or Master's degree with 8+ years of experience. Recognized authority providing innovative solutions to complex technical problems and leading advanced development efforts. Experience with Big Data technologies (Hadoop, HBase, Cassandra, Bigtable), ETL processes, data platform architecture, cloud computing infrastructure (AWS), programming languages (Java, Python), and distributed systems design. Highly preferred: open-source project contributions. Experience with distributed RDBMS …
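For the HBase side of that stack, a hedged sketch using the happybase client: the host, table, and column family are assumptions, and it presumes the table already exists and the HBase Thrift server is running:

```python
import happybase  # pip install happybase; talks to the HBase Thrift server

connection = happybase.Connection("hbase-host")  # placeholder host
table = connection.table("security_events")     # hypothetical table name

# Rows are keyed byte strings; cells live under column families ("d" here).
table.put(b"evt-0001", {b"d:source": b"ids", b"d:severity": b"high"})

# Scan a key range and print each row's cells.
for key, data in table.scan(row_prefix=b"evt-"):
    print(key, data)
```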
Requirements: Systems Engineering (DevOps) The Contractor shall coordinate with multiple entities, including mission partners, to ensure tools meet defined requirements. The Contractor shall apply DevOps principles and philosophies to continuously deliver high-value enhancements to software in a service-based …
cloud hosting environments; Support the transition of operational systems and applications from traditional platforms to cloud environments; support cloud-related data management tools and open-source (NoSQL) products such as HBase, CloudBase/Accumulo, and Bigtable; Convert existing algorithms or develop new algorithms to utilize the MapReduce programming model and technologies such as Hadoop, Hive, and Pig; Support …
complex problems. Nice to Haves: AWS certification or Security+ certification. Relevant IT discipline certifications (e.g., Java, .NET, Hadoop, Spring). Cloud Experience: Familiarity with cloud technologies such as Hadoop, HBase, or MongoDB. Independent and Collaborative Worker: Ability to function effectively both independently and in team settings. About us: We are an experienced advanced analytic development company providing Cyber solutions …
Experience with Temporal or similar orchestration tools. • Messaging & Streaming: Exposure to RabbitMQ or similar message/streaming broker technologies. • Advanced Technologies: Interest or experience in big data technologies (e.g., HBase, Hadoop), machine learning frameworks (e.g., Spark), and orbit dynamics. Why Join Us? • Innovative Environment: Be part of projects at the cutting edge of space systems and security. • Agile Culture …
A1C Partners — Ashburn, Virginia, United States (Hybrid / WFH options)
Load Balancing and Enterprise Service Bus frameworks (preferably in a Cloud environment) • Agile Scrum, possibly with experience leading a Scrum team as a Scrum Master or equivalent • PostgreSQL, DynamoDB, HBase, MongoDB, Cassandra • Master's in Computer Science or a related field. Customer Requirements • Clearance - Ability to obtain and hold a public trust position and favorable suitability based on a CBP Background …
TAIG (Tactical Analytic & Intelligence Group) — Ashburn, Virginia, United States (Hybrid / WFH options)
Load Balancing and Enterprise Service Bus frameworks (preferably in a Cloud environment) • Agile Scrum, possibly with experience leading a Scrum team as a Scrum Master or equivalent • PostgreSQL, DynamoDB, HBase, MongoDB, Cassandra • Good understanding of DevSecOps practices and tools • Master's in Computer Science or a related field. Customer Requirements • Clearance - Ability to obtain and hold a public trust position and …
developing software with high-level languages, such as Java, C, or C++. Shall have at least four (4) years of experience with distributed scalable Big Data stores (NoSQL), such as HBase, CloudBase/Accumulo, Bigtable, etc., as well as four (4) years of experience with the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as …