provisioning tools such as Ansible or SaltStack Highly Desired Knowledge, Skills, and Abilities: Experience working with streaming data and automation systems such as Kafka, NiFi, RabbitMQ, or similar Experience working with Helm (Kubernetes Package Manager) Experience working with S3-compliant data stores (AWS S3, Azure Blob, MinIO) Experience working …
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in … working understanding of big data search tools (Airflow, PySpark, Trino, OpenSearch, Elastic, etc.) Desired Skills (Optional) Docker, Jenkins, Hadoop/Spark, Kibana, Kafka, NiFi, ElasticSearch About The DarkStar Group Our Company The DarkStar Group is a small business that solves BIG problems. We're one of the Inc. …
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in …
Required Skills may include: Experience with Atlassian Software products (JIRA, Confluence, Service Desk, etc.) Informatica and other custom software components (Java, etc.) Apache Niagara Files (NiFi) Apache Tika Databricks and Lakehouse architecture ElasticSearch AWS SQS Linux Admin, VMware vSphere 6/7 experience, Splunk …
degree. Proven experience as an ETL Engineer, Data Engineer, or in a similar data engineering role. Strong knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica, Microsoft SSIS, Apache Airflow, etc.). Proficiency in SQL and experience working with relational databases (e.g., MySQL, PostgreSQL, Oracle … and data quality practices. Familiarity with version control systems (e.g., Git). Preferred Skills & Qualifications: Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, AWS Kinesis, etc.). Familiarity with data governance and compliance requirements (e.g., GDPR, HIPAA). Knowledge of data visualization tools (e.g., Tableau, Power …
cloud application deployment across AWS, Google Cloud, and Microsoft Azure - Design, build, and maintain extract, transform, and load (ETL) workflows using tools such as Apache NiFi - Develop visualization suite using open source and licensed metrics and monitoring tools - Integrate complex applications with external systems, services, and data providers … in a team environment. - Excellent verbal and written communication skills. Preferred Qualifications: - TS/SCI w/Poly - Experience with relational databases - Experience with Apache Spark - Experience in an Agile development environment - AWS Cloud Certifications (Associate or Professional) …
but not mandatory to perform the work, include: working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase with Apache NiFi with the Extract, Transform, and Load (ETL) processes managing and mitigating IT security vulnerabilities using Plans of Action and Milestones (POAMs) applying …
Computer Science, Engineering, or related field. Hands-on experience with cloud platform technologies, preferably AWS. Familiarity with big data tools such as Elasticsearch, Splunk, NiFi, Kafka, etc. Background in Agile/Scrum methodologies for iterative development. Strong familiarity with NoSQL/SQL databases (MongoDB, PostgreSQL, etc.). Experience working …
Experience with containerization desired (ex: Docker, Kubernetes) Experience with Linux (Red Hat and CentOS) Preferred Skillsets: Exposure to some of the following: Terraform, Kubernetes, NiFi, Java, Python Clearance: Active TS/SCI with an appropriate polygraph is required to be considered for this role Salary range: $101,996.00 …
use big data for good. Qualifications You Have: 3+ years of experience using Python, SQL, and PySpark 3+ years of experience utilizing Databricks or Apache Spark Experience designing and maintaining Data Lakes or Data Lakehouses Experience with big data tools such as Spark, NiFi, Kafka, Flink, or others … Secret Clearance HS diploma or GED DoD 8570 IAT II Compliance Certification (such as Security+, CCNA Security, GSEC, etc.) Nice if You Have: Experience with Apache NiFi, multi-cluster or containerized environment experience preferred Knowledge of cybersecurity concepts, including threats, vulnerabilities, security operations, encryption, boundary defense, auditing, authentication, and … benefits. Applicants should apply via our internal or external career site. Clearance Level Secret Job Locations US-MD-Annapolis Junction Skills Data migration, ETL, NiFi, Python
and other Kafka ecosystem tools Kafka knowledge of/with: Kafka Schema Security Manager, Kafka Schema Registry, and Kafka architecture and internals Experience with Apache NiFi Experience with AWS Experience with MongoDB, Redis Experience with agile development methodologies Data Pipeline Experience: You will be involved with acquiring and …
years or more experience; PhD or JD and fifteen years or more experience. Experience with tools for data routing and transformation such as NiFi, Airflow, or dbt. Strong in Java, Python, or SQL for ETL/ELT data pipeline development supporting data-driven applications. Proficient in scripting such as Linux …
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in … Desired Skills (Optional) Knowledge of agile methodologies/experience delivering on agile teams (Participates in scrum and PI Planning) Docker, Jenkins, Hadoop/Spark, Kibana, Kafka, NiFi, ElasticSearch About The DarkStar Group Our Company The DarkStar Group is a small business that solves BIG problems. We're one of the Inc. …
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in … Desired Skills (Optional) Knowledge of agile methodologies/experience delivering on agile teams (Participates in scrum and PI Planning) Docker, Jenkins, Hadoop/Spark, Kibana, Kafka, NiFi, ElasticSearch About The DarkStar Group Our Company The DarkStar Group is a small business that solves BIG problems. We're one of the Inc. …
Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi, Talend, Informatica). • Proficiency in data integration tools and technologies. • Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) is …
one or more of the following is desired, but not required: Kubernetes, Docker Swarm, ELK stack, Python, Shell Scripting, Java, Spring, Kafka, Pega, Nexus, NiFi, Ansible, Terraform.
d like you to have Working knowledge of software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi Working knowledge of public keys and digital certificates Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG Experience …
with Amazon Web Managed Services (AWS) Working knowledge of software platforms and services, such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar Proficient experience utilizing JavaScript, Elasticsearch, JSON, SQL, XML Working knowledge of datastores MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC …
Amazon Web Managed Services (AWS). Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar. Proficient experience utilizing JavaScript, Elasticsearch, JSON, SQL, XML. Working knowledge of datastores MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC …
and troubleshooting. Desired Skills (Optional) Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience with Apache NiFi. Demonstrated experience with the Extract, Transform, and Load (ETL) processes. Demonstrated experience managing and mitigating IT security vulnerabilities using Plans of Action and …
with DevSecOps solutions and tools. Experience with data quality and data governance concepts. Experience maintaining, supporting, and improving the ETL process using Apache NiFi or similar tools. Experience with Apache Spark. Equal Opportunity Employer/Veterans/Disabled Accommodations: If you are a qualified individual …
with Maven, Rancher, SonarQube Experience with Git, Ruby, ArgoCD Experience with Terraform AWS Cloud Computing Skills, AWS CloudWatch, Elasticsearch, Prometheus, Grafana Familiarity with NiFi, Apache Kafka, Kalman Filtering Education/Experience Requirements SME Level: Expert consultant to top management typically with an advanced degree and 13+ years …
NoSQL DBs, including Mongo, ElasticSearch, Redis, or Graph DB Experience with data wrangling, including Discovery, Mining, Cleaning, Exploration, Modeling, Structuring, Enriching, and Validating, with Apache NiFi or related tools Experience with CI/CD, including Jenkins, JUnit testing, or related Experience with DevOps, including Packer, Terraform, or Ansible …