Cloud Security (CCSK)/CompTIA A+/CompTIA Security+/EMC Data Science Associate (EMCDSA)/Cloudera Certified Data Scientist (CCDH)/Certified Apache Hadoop Developer (HCAHD) (Hortonworks)/Certified Information Systems Security Professional (CISSP)/Certified Cloud Professional (CCP) (Cloudera)/Microsoft Certified Professional Developer (MCPD)/Microsoft Certified Solutions Developer (MCSD)/Microsoft Certified Solutions Expert (MCSE)/Private Cloud/Certified Administrator for Apache Hadoop (CCAH) (Cloudera). Experience employing spreadsheets for data manipulation and visualization. Experience with programs and projects and other enterprise initiatives for efforts in the RDT&E phase of the acquisition life cycle. Experience …
Analytics Consultant, A2C (Job ID: …) Amazon Web Services Korea LLC. Are you a Data Analytics specialist? Do you have Data Warehousing and/or Hadoop experience? Do you like to solve the most complex and high-scale data challenges in the world today? Would you like a career that … with AWS services - Hands-on experience leading large-scale global data warehousing and analytics projects - Experience using some of the following: Apache Spark/Hadoop, Flume, Kinesis, Kafka, Oozie, Hue, ZooKeeper, Ranger, Elasticsearch, Avro, Hive, Pig, Impala, Spark SQL, Presto, PostgreSQL, Amazon EMR, Amazon Redshift. Our inclusive culture …
Python, Java, AWS Infrastructure, Linux, Kubernetes, Hadoop, CI/CD, Big Data Platform, Agile, JIRA, Confluence, GitHub, GitLab, Puppet, Ansible, Maven, virtualization, oVirt, Proxmox, VMware, Shell/Bash scripting. Due to federal contract requirements, United States citizenship and an active TS/SCI security clearance and polygraph are required … accredited college or university. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, etc.). Understanding of agile …
supporting IC or DoD in the Cyber Security Domain. Familiarity with the RMF process. Experience with Relational Database Management Systems (RDBMS). Experience with Apache Hadoop and the Hadoop Distributed File System. Experience with Amazon Elastic MapReduce (EMR) and SageMaker. Experience with Machine Learning or Artificial Intelligence …
such as HBase, CloudBase/Accumulo, BigTable, etc.; Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); Shall have demonstrated work experience with Serialization such as …
Solving: Proven ability to troubleshoot and solve complex problems. Nice to Haves: AWS certification or Security+ certification. Relevant IT discipline certifications (e.g., Java, .NET, Hadoop, Spring). Cloud Experience: Familiarity with cloud technologies such as Hadoop, HBase, or MongoDB. Independent and Collaborative Worker: Ability to function effectively both …
Experience of Big Data technologies/Big Data Analytics. Proficiency in C++, Java, Python, Shell Script; familiarity with R, Matlab, SAS Enterprise Miner. Knowledge of Elasticsearch and understanding of the Hadoop ecosystem. Experience working with large datasets and distributed computing tools such as MapReduce, Hadoop, Hive, Pig, etc. Advanced skills in Excel …
to perform, amongst the Compute Team, within a massively parallel enterprise platform, built with Java on Free and Open-Source Software products including Kubernetes, Hadoop, and Accumulo, to enable execution of data-intensive analytics on a managed infrastructure. The selected candidate will be a self-motivated Java developer who … variety of technologies depending on customer requirements. Required Skills: • Java programming for distributed systems, with experience in networking and multi-threading • Apache Hadoop • Apache Accumulo • Apache NiFi • Agile development experience • Well-grounded in Linux fundamentals and knowledge in at least one scripting language … complexity is required • Bachelor's degree in Computer Science or related discipline from an accredited college or university is required • Hadoop/Cloud Developer Certification. The proposed salary range for this position in Maryland is $125,000 to $200,000. Final salary will be determined …
Big-Data Cloud systems to meet documented requirements. They contribute across all stages of back-end processing, analysis, and indexing, specializing in Cloud Computing, Hadoop Ecosystem, Java application implementation, Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Whether working independently or as part of a team, they review … Master's in Computer Science or related discipline from an accredited college or university may be substituted for two (2) years of experience. - Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. The following Cloud-related experiences are required: - 2 years of Cloud and …
with clients to understand their customer behaviour through deep data analysis and predictive modelling. You’ll leverage tools such as Python, PySpark, SQL, and Hadoop to build and deploy models that influence customer strategy across areas like propensity, churn, segmentation, and more. Key responsibilities include: Developing and deploying statistical … experience working with customer data and applying predictive modelling techniques • Proficiency in SQL, Python/PySpark, and exposure to big data environments such as Hadoop • Commercial experience in the FMCG or retail space is highly desirable • Previous experience working in a consultancy or client-facing role is a plus …
Retail and CPG, and Public Services. Consolidated revenues of $13 billion. Job Description: Spark - Must Have; Scala - Must Have; Hive & SQL - Must Have; Hadoop - Must Have; Communication - Must Have; Banking/Capital Markets Domain - Good to Have. Note: Candidate should know the Scala/Python (Core) coding language; PySpark … will not help here. Scala/Spark • Good Big Data resource with the below skillset: § Spark § Scala § Hive/HDFS/HQL • Linux-based Hadoop Ecosystem (HDFS, Impala, Hive, HBase, etc.) • Experience in Big Data technologies; real-time data processing platform (Spark Streaming) experience would be an advantage. • Consistently …
Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, the Hadoop Eco-System including implementing Java applications, Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews … project. Make recommendations for improving documentation and software development process standards. Serve as a subject matter expert for Cloud Computing and corresponding technologies including Hadoop, assisting the software development team in designing, developing and testing Cloud Computing Systems. Debug problems with Cloud-based Distributed Computing Frameworks. Manage multi-node … Master's in Computer Science or related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. The following Cloud-related experiences are required: Two (2) years of Cloud …
industry experience maintaining a code base written in a high-level object-oriented language; formal studies or industry experience in distributed computing (e.g., MapReduce, Hadoop, AWS, DHTs, etc.); industry experience working with very large datasets; familiarity with parallel programming or parallel algorithms development; familiarity with machine learning concepts, data … Engineering); 2+ years' industry experience maintaining a code base written in a high-level object-oriented language; industry experience in distributed computing (e.g., MapReduce, Hadoop, AWS, DHTs, etc.); industry experience working with very large datasets; experience with parallel programming or parallel algorithms development; experience with machine learning concepts, data …
enhancement, maintenance, testing, and problem diagnosis/resolution. You will work on a software development program providing software development engineering strategies for environments using the Hadoop Distributed File System (HDFS), MapReduce, and other related cloud technologies. You will provide set-up, configuration, and software installation for development, test, and … CloudBase/Accumulo, and BigTable; Convert existing algorithms or develop new algorithms to utilize the MapReduce programming model and technologies such as Hadoop, Hive, and Pig; Support operational systems utilizing the HDFS; Support the deployment of operational systems and applications in a cloud environment; Conduct scalability assessments …
Microservices Container Platforms (OpenShift, Kubernetes, CRC, Docker) • NoSQL DBs (Cassandra, MongoDB, HBase, ZooKeeper, ArangoDB) • Serialization libraries (Thrift, Protocol Buffers) • Large-scale data processing (Hadoop, Kafka) • Dependency injection frameworks (Guice, Spring) • Text search engines (Lucene, Elasticsearch) • Splunk/Elastic • CI/CD • Build tools: Maven, Git, Jenkins • Frameworks: Vert.x … generation messaging systems • Backends for mobile messaging systems • SIP or XMPP • Soft real-time systems • Experience doing performance tuning • Big Data technologies, such as Hadoop, Kafka, and Cassandra, to build applications that contain petabytes of data and process millions of transactions per day • Cloud computing, virtualization and containerization • Continuous …
experience or five (5) years of programming experience may be substituted for a bachelor's degree. Clearance: Must possess a TS/SCI with polygraph. Hadoop, Hive, and/or Pig: Within the last three (3) years, a minimum of one (1) year of experience with the Hadoop Distributed File System …
experience (including requirements, development, testing, troubleshooting). 6+ years' experience with high-level languages (Java, C, C++) and experience with Big Data technologies (HBase, Hadoop, MapReduce, etc.). Experience with cloud deployment and data-driven analytics, and strong technical writing skills. Experience with Big Data scalability (Amazon, Google, Facebook), designing … and developing automated analytics or Hadoop/Cloud Developer Certification is a plus. Working with foreign language processing, multimedia, and massive datasets, and experience with Artificial Intelligence techniques, are pluses. Clearance Requirements: TS/SCI with FS Polygraph (no clearance upgrades or CCAs). Please note, you MUST …
at the Sponsor's site. The ideal candidate will possess expertise in Linux, cloud, and cluster computing environments, with a strong focus on the Hadoop ecosystem and distributed technologies like Kafka and Presto. This role involves working closely with cross-functional teams to ingest, assess, and strategically utilize new … network operations missions. Deliver engineering support for data triage and assessments in diverse environments. Leverage expertise in Linux, cloud computing, and cluster technologies, including Hadoop, Kafka, and Presto. Collaborate with contractors and multidisciplinary teams to ingest and analyze data, identifying new operational opportunities. Utilize extensive experience with cloud environments … both domestic and foreign stakeholders. Required Qualifications: Demonstrated experience with Hadoop. Demonstrated experience understanding large distributed data systems, cloud infrastructure, and network architecture (Hadoop, Kafka, HBase, or Presto). Demonstrated experience with cloud, data management, and development environments: specifically, AWS, HDFS, Cloudera, and SQL (as well as Spark …
Degree in Engineering, Systems Engineering, Computer Science, or Mathematics is highly desired and will be considered equivalent to two (2) years of experience. Hadoop/Cloud System Administrator Certification or comparable Cloud System/Service Certification is desired. Desired: Shall have experience diagnosing and troubleshooting large-scale cloud … computing systems, including familiarity with distributed systems, e.g. Hadoop, Cassandra, Scality, Swift, Gluster, Lustre, GPFS, Amazon S3, or any other comparable technology for big data management or high-performance computing. Demonstrated ability to work independently on complex tasks, and show a willingness to educate and train more junior technical …
scalability. Collaborate with analysts and QA engineers in an Agile, behavior-driven development (BDD) setup. Optimize and maintain data storage solutions including Oracle, Hadoop, Snowflake, and MongoDB. Utilize CI/CD tools like Jenkins, Bamboo, and SonarQube to improve software reliability. About You: 5+ years of hands-on … Web Services, authentication techniques, and security best practices. Strong troubleshooting skills, proactive mindset, and excellent communication skills. Bonus: Experience with Databricks, NiFi, and Hadoop is a plus. *For this position, you need to be based within commutable distance of Dublin and hold an EU/UK passport or …