Fort Belvoir, Virginia, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
…willing/able to help open/close the workspace during regular business hours as needed. Preferred Requirements: Experience with big data technologies such as Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers and Kubernetes is a plus. We have many more additional great benefits…
Columbia, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
…a hybrid environment; on average 1-2 days per week, with the ability to flex if needed. Preferred Requirements: Experience with big data technologies such as Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers (EKS), Diode, CI/CD, and Terraform is a plus. Work could…
Hanover, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
…visits to our Columbia, MD office. Flexibility is essential to accommodate any changes in the schedule. Preferred Requirements: Experience with big data technologies such as Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers (EKS), Diode, CI/CD, and Terraform is a plus. We have…
Columbia, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
…to flex if needed. Work could require some on-call work, so being flexible is key. Preferred Requirements: Experience with big data technologies such as Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers (EKS), Diode, CI/CD, and Terraform is a plus. We have…
Columbia, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
…week at our Columbia, MD office. Flexibility is essential to adapt to schedule changes as needed. Preferred Requirements: Experience with big data technologies such as Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers (EKS), Diode, CI/CD, and Terraform is a plus. Work could…
…large volumes of data. Experience working with cloud providers such as Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure. Experience with Hadoop, Spark, or other parallel storage/computing processes is a plus. Physical Requirements: Prolonged periods of sitting at a desk and working on a…
Hanover, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
…week at our Columbia, MD office. Flexibility is essential to accommodate any changes in the schedule. Preferred Requirements: Experience with big data technologies such as Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, and Presto. Experience with containers (EKS), Diode, CI/CD, and Terraform is a plus. Work could…
…computing frameworks (e.g., Deeplearning4j, Torch, TensorFlow, Caffe, Neon, the NVIDIA CUDA Deep Neural Network library (cuDNN), and OpenCV) and distributed data processing frameworks (e.g., Hadoop (including HDFS, HBase, Hive, Impala, Giraph, and Sqoop) and Spark (including MLlib, GraphX, SQL, and DataFrames)). Execute data science methods using common programming/scripting…
…or Amazon QuickSight. Programming Languages: Familiarity with Python or R for data manipulation and analysis. Big Data Technologies: Experience with big data technologies like Hadoop or Spark. Data Governance: Understanding of data governance and data quality management. A Bit About Us: When it comes to appliances and electricals, we…
…technologies such as Snowflake, AWS, and/or Azure would be hugely beneficial, though any background including SQL Server, MySQL, Postgres, NoSQL, Oracle, or Hadoop would be great to see. In-depth knowledge of database structures, data analysis, and data mining; strong understanding of data warehousing, data lakes, ETL…
…web framework for the backends and React for developing the client-facing portion of the application. Create extract, transform, and load (ETL) pipelines using Hadoop and Apache Airflow for various production big data sources to fulfill intelligence data availability requirements. Automate retrieval of data from various sources via API…
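The ETL pipeline work described above can be sketched in plain Python. This is a minimal illustration only: the listing's actual pipelines run on Hadoop and are orchestrated with Apache Airflow, and all data, field names, and table names below are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed standing in for a production big data source.
RAW = """id,name,value
1,alpha,10
2,beta,
3,gamma,30
"""

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing values and cast types."""
    return [
        {"id": int(r["id"]), "name": r["name"], "value": int(r["value"])}
        for r in rows
        if r["value"]
    ]

def load(rows, conn):
    """Load: write cleaned rows into a relational table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS records (id INTEGER, name TEXT, value INTEGER)"
    )
    conn.executemany("INSERT INTO records VALUES (:id, :name, :value)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
row_count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(row_count)  # the row with a missing value is dropped, leaving 2
```

In an Airflow deployment, each of these three functions would typically become its own task in a DAG so failed stages can be retried independently.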
…REQUIRED. THIS IS AN ON-SITE POSITION. Required Skills may include: Experience with Atlassian software products (JIRA, Confluence, Service Desk, etc.). Maintaining the Apache Hadoop ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. AWS services such…
…Enabling tools (Git, Maven, Jira), DevOps (Bamboo, Jenkins, GitLab CI/Pipelines), Continuous Monitoring (ELK Stack (Elasticsearch, Logstash, and Kibana), Nagios). Experience with Apache Hadoop, Apache Accumulo, and Apache NiFi. Well-grounded in Linux fundamentals and familiarity with scripting languages (e.g., Python, Ruby, Perl, Bash, etc.). Experience with AWS…
Arlington, Virginia, United States Hybrid / WFH Options
Full Visibility LLC
…e.g., Apache Airflow, Luigi, Prefect). Strong programming skills in Python, SQL, or Scala. Experience with open-source data processing tools (e.g., Kafka, Spark, Flink, Hadoop). Familiarity with database technologies (PostgreSQL, MySQL, or NoSQL solutions). Ability to work in a fast-paced environment with large-scale datasets. Preferred: • Experience with…
…structured and unstructured sources. Experience with analytical methodologies to diagnose challenges, implement solutions, and evaluate performance. Tool Knowledge: Experience with tools such as Apache Hadoop, Hue, Hive, Pig, Spark, Elasticsearch, Kibana, or Tableau. Educational Requirements: A Bachelor's degree with 1-3 years of experience, or a Master's…
…several areas of Data Mining, Classical Machine Learning, Deep Learning, NLP, and Computer Vision. Experience with large-scale/big data technology such as Hadoop, Spark, Hive, Impala, PrestoDB. Hands-on capability developing ML models using open-source frameworks in Python and R and applying them to real client…
…and/or Distributed Computing. Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required. Cloudera Certified Hadoop Developer. This position is contingent on funding and may not be filled immediately. However, this position is representative of positions within CACI that are…
…Understanding of algorithmic models such as Naive Bayes, linear discriminant analysis, hidden Markov models, and Gaussian mixture models. Familiarity with big data technologies such as Hadoop, Apache Spark, Cassandra, and MongoDB. Problem-solving skills: from fine-tuning algorithms to optimizing models, the ability to dissect and resolve issues is paramount. It…
London, South East England, United Kingdom Hybrid / WFH Options
JSS Search
…and the ability to work in a fast-paced, collaborative environment. Strong communication and interpersonal skills. Preferred Skills: Experience with big data technologies (e.g., Hadoop, Spark). Knowledge of machine learning and AI integration with data architectures. Certification in cloud platforms or data management.
…clustering and classification techniques. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau).
…years, Bachelor's with 8 years, Master's with 6 years, or PhD with 4 years. Deep expertise in big data platforms (e.g., Hadoop, Spark, Kafka) and multi-cloud environments (AWS, Azure, GCP). Experience with machine learning frameworks (e.g., TensorFlow, scikit-learn, PyTorch). Strong programming skills in Python, Java…
…statistical programming languages such as R, Python, MATLAB • Knowledge of geospatial analysis concepts and tools such as ArcGIS • Familiarity with big data technologies like Hadoop, Spark, NoSQL databases • Excellent problem-solving skills and ability to develop innovative solutions • Strong verbal and written communication skills. Desired Qualifications: • 5+ years of…