Base Programming, Advanced Programming, Data Integration) Experience with SAS Visual Analytics and SAS Enterprise Miner Knowledge of machine learning frameworks and advanced statistical modeling Familiarity with big data technologies (Hadoop, Spark, NoSQL databases) Experience with version control systems (Git) and collaborative development practices Understanding of data privacy regulations and compliance frameworks Experience with the Department of Defense or similarly
of the following: Python, SQL, Java Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams Deep knowledge of database technologies: Distributed systems (e.g., Spark, Hadoop, EMR) RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL) NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j) Solid understanding of software engineering best practices - code reviews, testing frameworks, CI/CD, and
operational decision support and analyzing unstructured data (e.g., text, imagery). Ability to architect and maintain scalable data lakes, data warehouses, or distributed storage systems (e.g., Delta Lake, Snowflake, Hadoop, or NoSQL solutions). Demonstrated understanding of data security, privacy, and sovereignty issues, particularly in military or international environments, ensuring compliance with NATO operational and ethical standards. Experience building
utilizing SQL or Scala Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud Experience with distributed data or computing tools such as Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka Experience working on real-time data and streaming applications Experience with NoSQL implementation using MongoDB or Cassandra Experience with data warehousing, including AWS Redshift
pipelines and ETL - Informatica - Experience in SQL and database management systems - Knowledge of data modelling, warehousing concepts, and ETL processes - Experience with big data technologies and frameworks such as Hadoop, Hive, and Spark - Programming experience in Python or Scala - Demonstrated analytical and problem-solving skills - Familiarity with cloud platforms (e.g. Azure, AWS) and their data-related services - Proactive and detail
including data scientists, DBAs, and business analysts Strong verbal and written communication skills for documenting and presenting solutions Desired Qualifications: Experience integrating Oracle databases with big data platforms (e.g., Hadoop, Spark, or cloud-based data lakes) Familiarity with Oracle Big Data SQL or Oracle Autonomous Database Exposure to cloud platforms such as Oracle Cloud Infrastructure (OCI), AWS, or Azure
including data scientists, DBAs, and business analysts. Strong verbal and written communication skills for documenting and presenting solutions Preferred Qualifications: Experience integrating Oracle databases with big data platforms (e.g., Hadoop, Spark, or cloud-based data lakes) Familiarity with Oracle Big Data SQL or Oracle Autonomous Database. Exposure to cloud platforms such as Oracle Cloud Infrastructure (OCI), AWS, or Azure
basic commands and Shell scripting Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud Experience with distributed data or computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka Experience working on real-time data and streaming applications Experience with NoSQL implementation, including MongoDB or Cassandra Experience with data warehousing using AWS Redshift
to end projects independently Advanced experience building cloud-scalable, real-time and high-performance data lake solutions, preferably Databricks, Snowflake, AWS Advanced experience with big data technologies such as: Hadoop, Hive, Spark, EMR and orchestration tools like Airflow Advanced experience in SQL and modern scripting or programming languages, such as Python, Shell Experience in CI/CD Pipeline for
agile development methodologies. Preferred Qualifications Familiarity with event-driven architecture and messaging systems (Kafka, RabbitMQ). Experience with feature stores and model registries. Familiarity with big data technologies (Spark, Hadoop). Knowledge of monitoring and logging tools for machine learning models (Prometheus, Grafana, ELK stack). Significant experience with petabyte-scale data sets. Significant experience with large-scale, multi-INT
Stay abreast of emerging technologies and best practices in data engineering, cloud computing, and the space industry. Desired Qualifications: Experience with other big data technologies (e.g., Apache Kafka, Apache Spark, Hadoop). Familiarity with cloud platforms (AWS, Azure, GCP) and their data services. Experience with containerization technologies (Docker, Kubernetes). Knowledge of data warehousing concepts and methodologies. Experience with version
unstructured datasets for analysis. Automate ETL workflows and streamline repetitive data preparation tasks using Python, SQL, and scripting tools. Operate in big data ecosystems using tools such as Spark, Hadoop, or their cloud-native equivalents (e.g., AWS Glue, Azure Synapse, Databricks). Assist in the development and deployment of data pipelines in collaboration with data engineering and DevOps teams.
years of experience in data engineering, data architecture, or related roles. Extensive experience with ETL tools such as AWS Glue or Talend. Knowledge of big data technologies like Hadoop, Spark, and Kafka. Experience with cloud services from providers such as AWS or Oracle and their data storage and processing services. Experience with data warehousing, data lakes, and big data
with containerization (Docker), cloud platforms, and CI/CD pipelines (Preferred). Skills: • Excellent communication skills and ability to work independently under program direction • Big data framework: Experience with Hadoop, Spark, Hive or similar technologies • Data Governance & Security Abilities: • Cloud Platform: Experience with AWS, Azure, OCI or Google Cloud. • Statistical analysis and machine learning: Able to build models that
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Python (Pandas, NumPy, SciPy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in various field offices throughout
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Python (Pandas, NumPy, SciPy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in various field offices throughout
Reston, Virginia, United States Hybrid / WFH Options
Noblis
Qualifications Familiarity with the cyber domain Experience with monitoring, logging, and alerting tools (e.g., Prometheus, Grafana). Experience querying databases (SQL, Hive). Experience working with data platforms like Hadoop and Spark. Overview Noblis and our wholly owned subsidiaries, Noblis ESI , and Noblis MSD tackle the nation's toughest problems and apply advanced solutions to our clients' most critical
supervised, unsupervised, reinforcement learning), deep learning, and neural networks. Experience with data preprocessing, feature engineering, and model evaluation techniques. Familiarity with data management and big data technologies (e.g., Spark, Hadoop, SQL, NoSQL databases). Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and their AI/ML services. Strong analytical and problem-solving skills. Excellent communication and collaboration skills
Liverpool, Merseyside, North West, United Kingdom Hybrid / WFH Options
Forward Role
both relational databases (SQL Server, MySQL) and NoSQL solutions (MongoDB, Cassandra) Hands-on knowledge of AWS S3 and associated big data services Extensive experience with big data technologies including Hadoop and Spark for large-scale dataset processing Deep understanding of data security frameworks, encryption protocols, access management and regulatory compliance Proven track record building automated, scalable ETL frameworks and
clustering, dimensionality reduction, deep learning architectures). Experience with data preprocessing, feature engineering, and data visualization techniques. Familiarity with data storage and processing technologies (e.g., SQL, NoSQL databases, Spark, Hadoop). Experience with cloud platforms (e.g., AWS, Azure, GCP) and their machine learning services. Understanding of software development principles, version control (e.g., Git), and CI/CD pipelines. Strong
Platform, BigQuery, Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) or similar tools. 3-6 years in Data Warehouse platforms such as Teradata (or similar databases), Big Data/Hadoop (or similar technologies), BigQuery. At least 3-5 years of experience in data modeling (logical and/or physical). 3-5 years of hands-on experience working with near
only a U.S. citizen can obtain Education: Bachelor's Degree in computer, information systems or related field Advanced proficiency in SQL and NoSQL databases Experience with Apache Accumulo, Apache Hadoop Experience with Python Experience with Docker, AWS and/or Azure Hands-on experience with Apache Kafka, Apache NiFi Experience developing data tools that interface with multiple message types
Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
Qualifications Familiarity with the cyber domain Experience with monitoring, logging, and alerting tools (e.g., Prometheus, Grafana). Experience querying databases (SQL, Hive). Experience working with data platforms like Hadoop and Spark. Overview Noblis and our wholly owned subsidiaries, Noblis ESI , and Noblis MSD tackle the nation's toughest problems and apply advanced solutions to our clients' most critical
Arlington, Virginia, United States Hybrid / WFH Options
CGI
/hardware development when processing large data sets • Experience in the following required task areas: machine learning, statistical modeling, time-series forecasting, and/or geospatial analytics • Experience with Hadoop, Spark, or other parallel computing processes • Experience with using the Joint Enterprise Modeling and Analytics (JEMA) framework or Apache NiFi to develop data processing workflows • Experience in AWS services