about business, product, and technical challenges in an enterprise environment - Extensive hands-on experience with data platform technologies, including at least three of: Spark, Hadoop ecosystem, orchestration frameworks, MPP databases, NoSQL, streaming technologies, data catalogs, BI and visualization tools - Proficiency in at least one programming language (e.g., Python, Java More ❯
level security clearance preferred. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or vendor flavor of Kubernetes) Experience with CI/CD pipelines (e.g. GitLab More ❯
visualization tools (e.g., Matplotlib, Seaborn, Tableau). Ability to work independently and lead projects from inception to deployment. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable. MSc or PhD in Computer Science, Data Science, or related field is preferred. Don More ❯
technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills Knowledge of AWS Knowledge of Databricks Understanding of Cloudera Hadoop, Spark, HDFS, HBase, Hive. Understanding of Maven or Gradle About the Team J.P. Morgan is a global leader in financial services, providing strategic advice More ❯
on full-stack programming experience using Java, JavaScript, Scala/Python Experience working with big-data stack, including (but not limited to) Spark/Hadoop, Kafka, Aerospike/DynamoDB Experience with AWS tech stack, including but not limited to EMR, Athena, EKS Expert knowledge of multi-threading, memory model More ❯
Columbia, Maryland, United States Hybrid / WFH Options
SRC
and ability to travel up to 25% of the time to customer sites located in Hawaii. Preferred Requirements Experience with big data technologies like Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers (EKS, Diode), CI/CD, and Terraform. Willingness to perform some on More ❯
warehousing and building ETL pipelines Experience with SQL Experience mentoring team members on best practices PREFERRED QUALIFICATIONS Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience operating large data warehouses Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central More ❯
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
High School Diploma or equivalent and 9 years relevant experience. Experience in streaming and/or batch analytics (e.g. Kafka, Spark, Flink, Storm, MapReduce, Hadoop) Experience in distributed databases, NoSQL databases, full-text search engines (e.g. Elasticsearch, MongoDB, Solr) Experience in designing enterprise APIs Experience in RESTful web services More ❯
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
the schedule. Preferred Requirements Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or vendor flavor of Kubernetes) Experience with CI/CD pipelines (e.g. GitLab More ❯
Columbia, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
level security clearance preferred. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or vendor flavor of Kubernetes) Experience with CI/CD pipelines (e.g. GitLab More ❯
San Antonio, Texas, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
level security clearance preferred. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or vendor flavor of Kubernetes) Experience with CI/CD pipelines (e.g. GitLab More ❯
San Antonio, Texas, United States Hybrid / WFH Options
HII Mission Technologies
CCSP is highly desired) Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or vendor flavor of Kubernetes) Experience with CI/CD pipelines (e.g. GitLab More ❯
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
directed by the customer. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or vendor flavor of Kubernetes) Experience with CI/CD pipelines (e.g. GitLab More ❯
Alexandria, Virginia, United States Hybrid / WFH Options
Metronome LLC
willing/able to help open/close the workspace during regular business hours as needed Desired Skills Experience with big data technologies like Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers and Kubernetes are a plus All candidates will be required to be More ❯
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
Framework (RMF) process Proficiency in system design and meticulous documentation Experience in streaming and/or batch analytics (e.g. Kafka, Spark, Flink, Storm, MapReduce, Hadoop) Experience in distributed databases, NoSQL databases, full-text search engines (e.g. Elasticsearch, MongoDB, Solr) Experience in designing enterprise APIs Experience in RESTful web services More ❯
science, data analytics, software engineering, or related technical field. A minimum of five years of hands-on experience with Big Data applications, such as Hadoop, and associated applications (e.g., administration, CM, monitoring, performance tuning, etc.) PREFERRED QUALIFICATIONS Experience working in a government mission Data Exploitation environment (e.g., acquiring data More ❯
Linux and Windows operating systems: security, configuration, and management Database design, setup, and administration (DBA) experience with Sybase, Oracle, or UDB Big data systems: Hadoop, Snowflake, NoSQL, HBase, HDFS, MapReduce Web and Mobile technologies, digital workflow tools Site reliability engineering and runtime operational tools (agent-based technologies) and processes More ❯
the legal domain. Ability to communicate with multiple stakeholders, including non-technical legal subject matter experts. Experience with big data technologies such as Spark, Hadoop, or similar. Experience conducting world-leading research, e.g. by contributions to publications at leading ML venues. Previous experience working on large-scale data processing More ❯
with tools like Ansible, Terraform, Docker, Kafka, Nexus Experience with observability platforms: InfluxDB, Prometheus, ELK, Jaeger, Grafana, Nagios, Zabbix Familiarity with Big Data tools: Hadoop, HDFS, Spark, HBase Ability to write code in Go, Python, Bash, or Perl for automation. Work Experience 5-7+ years of proven experience More ❯
engineering or related roles. Advanced proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Extensive experience with big data technologies (e.g., Hadoop, Spark). Deep understanding of cloud platforms (AWS, GCP, Azure) and their data services. 2+ years of experience in Python development Strong understanding and practical experience … and/or testing API Gateway tools and REST APIs Experience or expertise configuring an LDAP client to connect to IPA Experience with Apache Hadoop and ETL Who we are: Reinventing Geospatial, Inc. (RGi) is a fast-paced small business that has the environment and culture of a start More ❯
Lexington, Massachusetts, United States Hybrid / WFH Options
Equiliem
environments, such as Pandas, TensorFlow, and Jupyter Notebook. • Broad knowledge of the general features, capabilities, and trade-offs of common data warehouse (e.g. Apache Hadoop); workflow orchestration (e.g. Apache Beam); data extract, transform and load (ETL); and stream processing (e.g. Kafka) technologies. Hands-on experience with several of these More ❯
Active TS/SCI clearance with polygraph required • Demonstrated experience providing data support to data engineering and ETL teams • Cloud Services Environment - AWS, Snowflake, Hadoop, Apache, or Elasticsearch • Experience programming to manipulate and analyze large datasets and automate tasks • Knowledge of data management fundamentals, data storage principles, database More ❯
automation & configuration management Ansible (plus Puppet, SaltStack), Terraform, CloudFormation NodeJS, React/Material-UI (plus Angular), Python, JavaScript Big data processing and analysis, e.g. Apache Hadoop (CDH), Apache Spark RedHat Enterprise Linux, CentOS, Debian or Ubuntu Java 8, Spring framework (preferably Spring Boot), AMQP - RabbitMQ Open source technologies Experience of More ❯
work in the United States are encouraged to apply. At least 4 years of Information Technology experience At least 3+ years of experience with the Hadoop platform 3+ years' experience in PySpark development 4+ years of experience using Python 4+ years of Azure or AWS cloud platform experience At least More ❯