Washington, Washington DC, United States Hybrid / WFH Options
BLN24
Ability to manage multiple projects and priorities effectively. Preferred Skills: Experience with cloud-based data lake solutions (e.g., AWS, Azure, Google Cloud). Familiarity with big data technologies (e.g., Hadoop, Spark). Knowledge of data governance and security best practices. Experience with ETL processes and tools. What BLN24 brings to the Game: BLN24 benefits are game changing. We like …
Washington, Washington DC, United States Hybrid / WFH Options
BLN24
solving and analytical skills. Strong communication and collaboration abilities. Preferred Skills: Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud). Familiarity with big data technologies (e.g., Hadoop, Spark). Knowledge of data governance and security best practices. Experience with ETL processes and tools. What BLN24 brings to the Game: BLN24 benefits are game changing. We like …
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
well. Flexibility is key to accommodate any schedule changes per the customer and team in place. Preferred Requirements Security+ certification is highly desired. Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. Work could possibly …
San Antonio, Texas, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
well. Flexibility is key to accommodate any schedule changes per the customer and team in place. Preferred Requirements Security+ certification is highly desired. Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. Work could possibly …
Software Development. Proficiency in data mining, data analysis, and data visualization. Experience with cloud computing platforms such as AWS, Azure, or GCP. Experience with big data technologies such as Hadoop, Spark, and Hive. Familiarity with DevOps practices and tools. Excellent leadership and management skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative environment. Strong problem …
AWS data platforms and their respective data services. Solid understanding of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies (e.g., Spark, Hadoop) and distributed computing. Proficiency in SQL and at least one programming language (e.g., Python, Java). 6 Month Contract; Inside IR35; Immediately available; London, up to 2 times a month …
leadership experience Proficiency in modern data platforms (Databricks, Snowflake, Kafka), container orchestration (Kubernetes/OpenShift), and multi-cloud deployments across AWS, Azure, GCP Advanced knowledge of Big Data ecosystems (Hadoop/Hive/Spark), data lakehouse architectures, mesh topologies, and real-time streaming platforms Strong Unix/Linux skills, database connectivity (JDBC/ODBC), authentication systems (LDAP, Active Directory …
systems such as Git and understanding the concepts of branching, merging, and tagging is essential What we'd like you to have Knowledge of distributed computing technologies like Hive, Hadoop and Spark Experience with integrating data from various sources such as web services, APIs and file systems Advanced understanding of database schemas About BigBear.ai BigBear.ai is a leading provider …
communication skills, including the ability to explain technical findings to non-technical stakeholders. Preferred Qualifications Master's or PhD in a quantitative discipline. Experience with big data tools (e.g., Hadoop, Spark) and cloud platforms. Familiarity with complex ETL pipelines. Experience with data visualization tools (e.g., Tableau, Power BI, matplotlib, seaborn). Proven ability to leverage GenAI via prompt engineering. …
in the schedule. Must be willing/able to help open/close the workspace during regular business hours as needed. Preferred Requirements Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. Compensation At IAMUS …
Qualifications: Experience working with classified government data or in a defense-related environment Familiarity with cloud platforms (AWS, Azure, etc.) and data processing tools Knowledge of big data technologies (e.g., Hadoop, Spark) Experience with DevOps practices and version control (Git, GitHub) Strong problem-solving skills and ability to work in a fast-paced environment Machine Learning experience is a plus Company Summary …
days/week. Must be willing/able to help open/close the workspace during regular business hours as needed. Preferred Requirements Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. We have many …
Have: 2+ years of experience in the development of algorithms leveraging R, Python, or SQL/NoSQL 2+ years of experience with distributed data and computing tools, including MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL 2+ years of experience with Machine Learning, AI, or NLP Experience with visualization packages, including Plotly, Seaborn, or ggplot2 Master's degree …
and entity resolution. Preferred Qualifications: Experience with visualization tools and techniques (e.g., Periscope, Business Objects, D3, ggplot, Tableau, SAS Visual Analytics, PowerBI). Experience with big data technologies (e.g., Hadoop, HIVE, HDFS, HBase, MapReduce, Spark, Kafka, Sqoop). Master's degree in mathematics, statistics, computer science/engineering, or other related technical fields with equivalent practical experience. Experience constructing …
in the schedule. Must be willing/able to help open/close the workspace during regular business hours as needed. Preferred Requirements Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. We have many …
the IC. Expert proficiency in Python (or similar languages) and experience with data science libraries (TensorFlow, PyTorch, Pandas, NumPy). Strong experience with big data processing tools (e.g., Spark, Hadoop, AWS or Azure cloud platforms). Expertise in working with geospatial data formats (e.g., GeoTIFF, Shapefiles, WMS, WFS) and spatial libraries (e.g., GeoPandas, Rasterio, GDAL). Advanced experience in …
San Antonio, Texas, United States Hybrid / WFH Options
Wyetech, LLC
skills. Understanding of Agile software development methodologies and use of standard software development tool suites Desired Technical Skills Security+ certification is highly desired. Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform are a plus. Work could possibly …
hands-on experience in programming and software development using Java, JavaScript, or Python. Demonstrated hands-on experience working with PostgreSQL and Apache NiFi. Demonstrated hands-on experience working with Hadoop, Apache Spark, and their related ecosystems. A candidate must be a US Citizen and requires an active/current TS/SCI with Polygraph clearance. Salary Range …
and databases. Strong Linux skills with experience in hybrid cloud/on-prem architectures (AWS, C2S, OpenStack, etc.). Experience with big data technologies such as Kubernetes, Spark, Hive, Hadoop, Accumulo, and ElasticSearch. Experience with workflow and streaming tools such as Apache NiFi, Apache Airflow, or Kafka. Knowledge of common industry software tools, DevSecOps practices, and working with open …
Telegraf and others. Knowledge and demonstrable hands-on experience with middleware technologies (Kafka, API gateways and others) and Data Engineering tools/frameworks like Apache Spark, Airflow, Flink and Hadoop ecosystems. Understanding of network technology fundamentals, Data Structures, scalable system design and ability to translate information in a structured manner for wider product and engineering teams to translate into …
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Python (Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, pySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, neo4J, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA and in various field offices throughout …