Annapolis Junction, Maryland, United States Hybrid / WFH Options
Lockheed Martin
We build ML algorithms that learn, predict, and make informed decisions, meeting the evolving needs of our customers. Expert knowledge of multiple programming languages (e.g. Python, Java, C++) is required. With strong familiarity with Hadoop, we develop and deploy scalable AI/ML models that integrate with data pipelines, infrastructure, and applications. WHY JOIN US Providing ongoing training, mentorship, and development opportunities to help …
Hanover, Maryland, United States Hybrid / WFH Options
Wyetech, LLC
Excellent oral and written communication skills. Understanding of AGILE software development methodologies and use of standard software development tool suites. Desired Technical Skills: Experience with big data technologies such as Hadoop, Spark, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. The Benefits Package: Wyetech believes in generously …
Hanover, Maryland, United States Hybrid / WFH Options
Wyetech, LLC
Excellent oral and written communication skills. Understanding of AGILE software development methodologies and use of standard software development tool suites. Desired Technical Skills: Experience with big data technologies such as Hadoop, Spark, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. Work could possibly require some on-call …
We are seeking a highly experienced and skilled Senior Data Lake Engineer to join our team. As the Senior Data Lake Engineer, you will play a critical role in establishing and configuring an enterprise-level Databricks solution to support our …
and social benefits (e.g. UK pension scheme). What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch. Proficiency in cloud-native technologies such as containerization and Kubernetes. Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.). Proficiency in …
Kubernetes). Experience with enabling tools (Git, Maven, Jira), DevOps (Bamboo, Jenkins, GitLab CI/Pipelines), Continuous Monitoring (ELK Stack (ElasticSearch, Logstash, and Kibana), Nagios). Experience with Apache Hadoop, Apache Accumulo, and Apache NiFi. Well-grounded in Linux fundamentals and familiarity with scripting languages (e.g., Python, Ruby, Perl, Bash, etc.). Experience with AWS Cloud architecture, Infrastructure as …
or similar open-source projects. Excellent communication skills and ability to collaborate effectively with both internal teams and the open-source community. Experience with big data technologies such as Hadoop, Spark, Kafka, Flink. The base pay for this position is expected to be in the range below: $115,200 - $199,700. Base pay offered may vary depending on multiple individualized …
Linux skills and familiarity with hybrid cloud/on-prem architecture, AWS, C2S, OpenStack, etc. Experience with some big data technologies such as Kubernetes, Spark, Hive, and/or Hadoop, Accumulo, ElasticSearch. Experience with Apache NiFi, Apache Airflow, or Kafka. An adaptable and solution-centric mindset that embraces technology enablers. Familiarity with common industry software tools, concepts, and DevSecOps …
one or more of the following, it is a plus: learning frameworks (TensorFlow, PyTorch, or Keras), big data analysis via cloud computing (AWS or MS Azure), Apache Spark, Hadoop, Kubernetes, Cloudera, Git. If you already have experience with data science in a business context, we are all ears. But even if you do not have that experience, learning …
programs and analyzing current processing for help desk inquiries, as well as determining issue fixes. Database: Oracle & MS SQL. UI: JavaScript, CSS, HTML, Angular. Cloud: AWS. Big Data: Apache Hadoop. Operating System: Linux and Windows Server. Scheduler: Control-M. Middleware: MQ Series & Kafka. Source Code Repository: Git. Source File Transfer: FTP, sFTP, Connect:Direct (formerly …
Skills: • Experience with MapReduce and Pig is highly desired • Experience with Data Science • Experience with Graph Algorithms • Experience with Machine Learning • Experience with AWS • Cloud development experience such as Hadoop, Big Data (Cloudbase/Accumulo and Big Table) as well as JSON/BSON • Experience with analytic development • Experience with Python and streaming capabilities • Experience with Software Configuration Management …
and vetting mission. 4. Demonstrated experience in microservice architecture using Spring Framework, Spring Boot, Tomcat, AWS, Docker containers, or Kubernetes solutions. 5. Demonstrated experience in big data solutions (Hadoop ecosystem, MapReduce, Pig, Hive, DataStax, etc.) in support of a screening and vetting mission.
with languages such as Java, Python, and TypeScript. -Participate in software development to support innovation and enhancement of customer applications. -Experience with big data tools such as Spark, Hadoop, Cassandra, DynamoDB, Kinesis, SOLR, Elasticsearch. -Experience with at least one of the following: HTML, CSS, JavaScript, and at least one modern framework such as Angular, React, or Vue. -Experience …
Requirements A TS/SCI level security clearance preferred. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or a vendor flavor of Kubernetes). Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, etc.). Understanding …
San Antonio, Texas, United States Hybrid / WFH Options
HII Mission Technologies
CASP+ CE, CISSP, CSSLP, or CCSP is highly desired. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or a vendor flavor of Kubernetes). Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, etc.). Understanding …
C#, C++). Experience with data transport and transformation APIs and technologies such as JSON, XML, XSLT, JDBC, SOAP, and REST. Experience with cloud-based data analysis tools including Hadoop, Mahout, Accumulo, Hive, Impala, Pig, and similar. Experience with visual analytic tools like Microsoft Pivot, Palantir, or Visual Analytics. Experience with open-source textual processing such as Lucene …
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
required in this position as directed by the customer. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or a vendor flavor of Kubernetes). Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, etc.). Understanding …
Bethesda, Maryland, United States Hybrid / WFH Options
Leidos
Experience blending analytical methodologies and leveraging existing COTS/GOTS/OS tools in an unconventional manner. Familiarity utilizing virtualization and distributed file systems, such as virtual environments and Hadoop (or similar distributed file systems), in development and deployment environments. Experience with Amazon Web Services (AWS/C2S). Familiarity with hardware platforms, e.g., CPUs, GPUs, FPGAs, etc. Familiarity or …
workplace and at home, there's nothing we can't achieve. BASIC QUALIFICATIONS - 10+ years of technical specialist, design and architecture experience - 10+ years of database (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 10+ years of consulting, design and implementation of serverless distributed solutions experience - Australian citizen with ability to obtain security clearance. PREFERRED QUALIFICATIONS - AWS Professional level …
home, there's nothing we can't achieve in the cloud. Basic Qualifications - 7+ years of technical specialist, design and architecture experience - 5+ years of database (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 7+ years of consulting, design and implementation of serverless distributed solutions experience - 5+ years of software development with object-oriented language experience - 3+ years of …
home, there's nothing we can't achieve in the cloud. BASIC QUALIFICATIONS - 7+ years of technical specialist, design and architecture experience - 5+ years of database (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 7+ years of consulting, design and implementation of serverless distributed solutions experience - 5+ years of software development with object-oriented language experience - 7+ years of …