data architecture, integration, governance frameworks, and privacy-enhancing technologies Experience with databases (SQL & NoSQL - Oracle, PostgreSQL, MongoDB), data warehousing, and ETL/ELT tools Familiarity with big data technologies (Hadoop, Spark, Kafka), cloud platforms (AWS, Azure, GCP), and API integrations Desirable: Data certifications (TOGAF, DAMA), government/foundational data experience, cloud-native platforms knowledge, AI/ML data requirements
Experience with DevSecOps, as it applies to data science Demonstrated experience with TensorFlow, Keras, or other Deep Learning tools and frameworks Preferred: Demonstrated experience in big data systems including Hadoop, Spark, Scala Data visualization skills in Tableau, Power BI, D3, ArcGIS, or similar Experience with Git, Bash, Unix commands Experience in Cloud analytics (AWS, Azure, or GCP) with tools
transformation and workload management Experience with development of REST APIs, access control, and auditing Experience with DevOps pipelines Experience using the following software/tools: Big Data tools: e.g. Hadoop, Spark, Kafka, ElasticSearch Data Lakes: e.g. Delta Lake, Apache Hudi, Apache Iceberg Distributed Data Warehouse Frontends: e.g. Apache Hive, Presto Data pipeline and workflow management tools: e.g. Luigi, Airflow
Arlington, Virginia, United States Hybrid / WFH Options
STR
data structures, metadata, dependency and workload management Expert SQL knowledge and experience working with a variety of databases Experience using the following software/tools: Big Data tools: e.g. Hadoop, Spark, Kafka, ElasticSearch AWS: Athena, RDB, AWS credentials from Cloud Practitioner to Solutions Architect Data Lakes: e.g. Delta Lake, Apache Hudi, Apache Iceberg Distributed SQL interfaces: e.g. Apache Hive
Arlington, Virginia, United States Hybrid / WFH Options
Elder Research, Inc
Provide technical leadership and contribute to all phases of the software development lifecycle, from design to deployment. Required Skills/Experience: Hands-on experience with data engineering tools such as Hadoop, Cloudera, and Apache Spark. Proficiency with AWS services including EMR Studio. Familiarity with CI/CD pipelines, GitHub, and version control workflows. Experience working with or maintaining an Analytics
Portsmouth, Hampshire, England, United Kingdom Hybrid / WFH Options
Computappoint
with Trino/Starburst Enterprise/Galaxy administration and CLI operations Container Orchestration: Proven track record with Kubernetes/OpenShift in production environments Big Data Ecosystem: Strong background in Hadoop, Hive, Spark, and cloud platforms (AWS/Azure/GCP) Systems Architecture: Understanding of distributed systems, high availability, and fault-tolerant design Security Protocols: Experience with LDAP, Active Directory
experience with data architecture tools such as ER/Studio, Erwin, or Lucid Chart. Experience working in DataOps, Agile, and DevSecOps environments. Experience with Big Data applications such as Hadoop and associated technologies. Target salary range: $160,001 - $200,000. The estimate displayed represents the typical salary range for this position based on experience and other factors.
Bethesda, Maryland, United States Hybrid / WFH Options
Gridiron IT Solutions
Python, Scala, and/or UNIX shell scripting Expertise in machine learning techniques and statistical analysis Proficiency in SQL and NoSQL databases Experience with big data platforms such as Hadoop, Spark, and Kafka Cloud computing expertise across AWS, Azure, and others Experience in designing and implementing real-time data processing solutions Strong understanding of AI/ML applications in
Charlotte, North Carolina, United States Hybrid / WFH Options
City National Bank
years Advanced Java, R, SQL, Python coding Minimum 6+ years Statistical Analysis, Machine Learning, Computer Science, Programming, Data Storytelling Minimum 6+ years Big Data technologies such as Spark, AWS, Hadoop, including traditional RDBMS such as Oracle and SQL Server. Minimum 6+ years of data mining (preferably in a data-intensive financial company) Additional Qualifications Proficient experience in machine learning
optimize data extraction, cleaning, processing, and analysis tasks using scripting languages and ETL pipelines. Process and analyze large-scale, structured and unstructured datasets using big data frameworks (e.g., Spark, Hadoop, or cloud-native equivalents). Develop, test, and validate predictive models and novel statistical methods to solve mission-critical problems. Independently conduct end-to-end data science/engineering
AWS (commercial, gov cloud, secure environments). You must be proficient with Kubernetes/Microservice-based architecture (e.g., OpenShift, EKS, Docker), managed services, and large-scale processing environments (e.g. Hadoop/Spark/MapReduce) Languages and Frameworks: Expertise in common object-oriented and scripting languages, with primary skills in Java, Python, and JavaScript (React, Angular). Experience with OpenLayers
Working knowledge of modern Operating Systems (Linux/Windows), Networking, Servers/Mainframe, Secure Coding Practices, Development Environments (Java/.Net), Databases (DB2/Oracle/SQL Server/Hadoop/Databricks) Understanding of public cloud (AWS, Azure) services and platforms like API Gateways, Serverless, Virtual Private Networks, Elastic RDBMS, NoSQL, Key Managers, Load Balancers, Blob Stores, Indexing Services
and manage data analytic frameworks and pipelines using databases and tools such as (but not limited to) NoSQL, SQL, NiFi, Kafka, HDInsight, MongoDB, Cassandra, Neo4j, GraphDB, OrientDB, Spark, Flink, Hadoop, Hive, and others. • Apply distributed systems concepts and principles such as consistency and availability, liveness and safety, durability, reliability, fault-tolerance, consensus algorithms. • Administer cloud computing and CI
London, South East, England, United Kingdom Hybrid / WFH Options
Advanced Resource Managers Limited
work experience). Proven experience with Trino/Starburst Enterprise/Galaxy administration/CLI. Implementation experience with container orchestration solutions (Kubernetes/OpenShift). Knowledge of Big Data (Hadoop/Hive/Spark) and Cloud technologies (AWS, Azure, GCP). Understanding of distributed system architecture, high availability, scalability, and fault tolerance. Familiarity with security authentication systems such as
Washington, Washington DC, United States Hybrid / WFH Options
BLN24
Ability to manage multiple projects and priorities effectively. Preferred Skills: Experience with cloud-based data lake solutions (e.g., AWS, Azure, Google Cloud). Familiarity with big data technologies (e.g., Hadoop, Spark). Knowledge of data governance and security best practices. Experience with ETL processes and tools. What BLN24 brings to the Game: BLN24 benefits are game changing. We like
Washington, Washington DC, United States Hybrid / WFH Options
BLN24
solving and analytical skills. Strong communication and collaboration abilities. Preferred Skills: Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud). Familiarity with big data technologies (e.g., Hadoop, Spark). Knowledge of data governance and security best practices. Experience with ETL processes and tools. What BLN24 brings to the Game: BLN24 benefits are game changing. We like
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
well. Flexibility is key to accommodate any schedule changes per the customer and team in place. Preferred Requirements Security+ certification is highly desired. Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Work could possibly
San Antonio, Texas, United States Hybrid / WFH Options
HII Mission Technologies
well. Flexibility is key to accommodate any schedule changes per the customer and team in place. Preferred Requirements Security+ certification is highly desired. Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Work could possibly
Have: 2+ years of experience in the development of algorithms leveraging R, Python, or SQL/NoSQL 2+ years of experience with distributed data and computing tools, including MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL 2+ years of experience with ML, AI, or NLP Experience with visualization packages, including Plotly, Seaborn, or ggplot2 Experience working with Advana
computer vision), frameworks (e.g., TensorFlow, PyTorch), and cloud-native AI/ML services (AWS SageMaker, Azure ML, GCP AI Platform) Extensive experience with diverse data processing technologies (e.g., Spark, Hadoop, Kafka) and data warehousing/lake solutions Strong architecture skills in designing scalable, secure, and resilient cloud-based AI/ML solutions Excellent communication, presentation, and interpersonal skills, with
Software Development. Proficiency in data mining, data analysis, and data visualization. Experience with cloud computing platforms such as AWS, Azure, or GCP. Experience with big data technologies such as Hadoop, Spark, and Hive. Familiarity with DevOps practices and tools. Excellent leadership and management skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative environment. Strong problem
Deploying applications on Microsoft Azure, including Azure Storage and Azure Functions. Collaborating with cross-functional teams to implement secure web applications. Designing and maintaining big-data solutions such as Hadoop, Spark, or Kafka. Frontend development using HTML, CSS, JavaScript, and frameworks like React or Angular. Proficiency in programming languages such as PowerShell, Python, C++, C#, Java, TypeScript, Hive, R