cloud platforms such as AWS, GCP, or Azure. Proficiency with containerization and infrastructure-as-code tools such as Docker, Kubernetes, and Terraform. Advanced experience with big data technologies such as Apache Spark, Hadoop, and Kafka. Familiarity with ML observability and orchestration tools, including MLflow, Kubeflow, and Airflow. Solid understanding of version control systems, CI/CD, and DevOps best practices. Strong collaboration …
Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
Qualifications: Familiarity with the cyber domain. Experience with monitoring, logging, and alerting tools (e.g., Prometheus, Grafana). Experience querying databases (SQL, Hive). Experience working with data platforms like Hadoop and Spark. Overview: Noblis and our wholly owned subsidiaries, Noblis ESI and Noblis MSD, tackle the nation's toughest problems and apply advanced solutions to our clients' most critical …
Preferred Skills: • Experience working in federal or regulated environments • Prior hands-on Foundry work (creating pipelines, establishing objects, monitoring asset health) • Familiarity with big data tools such as Spark, Hadoop, or Databricks • Strong communication, analytical, and troubleshooting skills. Thanks and regards, Murali Sharma …
supervised, unsupervised, reinforcement learning), deep learning, and neural networks. Experience with data preprocessing, feature engineering, and model evaluation techniques. Familiarity with data management and big data technologies (e.g., Spark, Hadoop, SQL, NoSQL databases). Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and their AI/ML services. Strong analytical and problem-solving skills. Excellent communication and collaboration skills …
application development, including Python or Java. 2+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 3+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Databricks). 2+ years of experience working on real-time data and streaming applications. 2+ years of experience with NoSQL implementation (DynamoDB). 2+ years of data warehousing …
experience in application development in Python. 2+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 3+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 2+ years of experience working on real-time data and streaming applications. 2+ years of experience with NoSQL implementation (DynamoDB, Cassandra). 2+ years …
transformation and workload management. Experience with development of REST APIs, access control, and auditing. Experience with DevOps pipelines. Experience using the following software/tools: Big data tools, e.g., Hadoop, Spark, Kafka, ElasticSearch. Data lakes, e.g., Delta Lake, Apache Hudi, Apache Iceberg. Distributed data warehouse frontends, e.g., Apache Hive, Presto. Data pipeline and workflow management tools, e.g., Luigi, Airflow …
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Python (Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, pySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, neo4J, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in various field offices throughout …
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Python (Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, pySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, neo4J, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in various field offices throughout …
Arlington, Virginia, United States Hybrid / WFH Options
Elder Research, Inc
Provide technical leadership and contribute to all phases of the software development lifecycle, from design to deployment. Required Skills/Experience: Hands-on experience with data engineering tools such as Hadoop, Cloudera, and Apache Spark. Proficiency with AWS services, including EMR Studio. Familiarity with CI/CD pipelines, GitHub, and version control workflows. Experience working with or maintaining an Analytics …
platforms - Lakehouse architectures and federated queries. Extensive experience with databases (SQL and NoSQL - Oracle, PostgreSQL, MongoDB), data warehousing, data lakes, and ETL/ELT tools. Familiarity with big data technologies (Hadoop, Spark, Kafka), cloud platforms (AWS, Azure, GCP), and API integrations. Desirable: data certifications (TOGAF, DAMA), government/foundational data experience, cloud-native platforms knowledge, AI/ML data requirements …
London (City of London), South East England, United Kingdom
Computappoint
platforms - Lakehouse architectures and federated queries. Extensive experience with databases (SQL and NoSQL - Oracle, PostgreSQL, MongoDB), data warehousing, data lakes, and ETL/ELT tools. Familiarity with big data technologies (Hadoop, Spark, Kafka), cloud platforms (AWS, Azure, GCP), and API integrations. Desirable: data certifications (TOGAF, DAMA), government/foundational data experience, cloud-native platforms knowledge, AI/ML data requirements …
Bethesda, Maryland, United States Hybrid / WFH Options
Gridiron IT Solutions
Python, Scala, and/or UNIX shell scripting. Expertise in machine learning techniques and statistical analysis. Proficiency in SQL and NoSQL databases. Experience with big data platforms such as Hadoop, Spark, and Kafka. Cloud computing expertise across AWS, Azure, and other providers. Experience in designing and implementing real-time data processing solutions. Strong understanding of AI/ML applications in …
Charlotte, North Carolina, United States Hybrid / WFH Options
City National Bank
years of advanced Java, R, SQL, and Python coding. Minimum 6+ years of statistical analysis, machine learning, computer science, programming, and data storytelling. Minimum 6+ years of big data technologies such as Spark, AWS, and Hadoop, including traditional RDBMSs such as Oracle and SQL Server. Minimum 6+ years of data mining (preferably in a data-intensive financial company). Additional Qualifications: Proficient experience in machine learning …
AWS (commercial, gov cloud, secure environments). You must be proficient with Kubernetes/microservice-based architecture (e.g., OpenShift, EKS, Docker), managed services, and large-scale processing environments (e.g., Hadoop/Spark/MapReduce). Languages and Frameworks: Expertise in common object-oriented and scripting languages, with primary skills in Java, Python, and JavaScript (React, Angular). Experience with OpenLayers …