Annapolis Junction, Maryland, United States Hybrid / WFH Options
Lockheed Martin
Job Number 690222BR. Description: This senior role fosters collaboration with other senior engineers to develop advanced data analytics solutions and agile development projects in support of a high-visibility mission. The position involves providing technical leadership …
successful IT company? Then take advantage of your chance: JOB WORLD GmbH is a partner of the leading IT companies in Austria. We offer dedicated Big Data Administrators (Apache Hadoop/Cloudera) top access to interesting IT jobs. We are looking for DIRECT EMPLOYMENT with a renowned IT company: Big Data Administrator (Apache Hadoop/Cloudera) (all genders … Responsibilities: Administer, monitor and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud); manage and maintain services such as Kafka, Flink, NiFi, DynamoDB and Iceberg tables; IaC deployment via Terraform; plan and execute updates/upgrades; advise our Data Engineers and Data Scientists on the selection of Hadoop services for business use cases; 3rd level … of cloud-native services. Profile: Technical education (computer science HTL, computer science degree, data science, etc.); at least 5 years' experience in Apache Hadoop/Cloudera environments; migration to AWS; experience in system administration of Linux systems (RedHat); expertise in building and operating Big Data environments based on Apache Hadoop clusters. We offer …
with ServiceNow and Splunk. Experience supporting the IC or DoD in the cyber security domain. Familiarity with the RMF process. Experience with Relational Database Management Systems (RDBMS). Experience with Apache Hadoop and the Hadoop Distributed File System. Experience with Amazon Elastic MapReduce (EMR) and SageMaker. Experience with Machine Learning or Artificial Intelligence …
scalable Big Data Store (NoSQL) such as HBase, CloudBase/Accumulo, BigTable, etc.; shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); shall have demonstrated work experience with serialization such as JSON and/or …
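Several of these listings ask for hands-on experience with the MapReduce programming model that Hadoop, Hive, and Pig build on. As a minimal illustration (plain Python standing in for a distributed cluster; the function names are my own, not from any listing), the model splits work into a map step that emits key/value pairs, a shuffle step that groups them by key, and a reduce step that aggregates each group:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all intermediate values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each group; here, sum the counts per word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data", "big clusters"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts == {"big": 2, "data": 1, "clusters": 1}
```

On a real Hadoop cluster the map and reduce functions run in parallel across many nodes and the shuffle is handled by the framework, but the contract each phase satisfies is the same.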
software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution. You will work on a software development program providing software development engineering strategies for environments using the Hadoop Distributed File System (HDFS), MapReduce, and other related cloud technologies. You will provide set-up, configuration, and software installation for development, test, and production systems. Interface directly with … NoSQL) products such as HBase, CloudBase/Accumulo, and BigTable; convert existing algorithms or develop new algorithms to utilize the MapReduce programming model and technologies such as Hadoop, Hive, and Pig; support operational systems utilizing HDFS; support the deployment of operational systems and applications in a cloud environment; conduct scalability assessments of cloud-related algorithms and applications …
with Linux-based systems. Problem-Solving: Proven ability to troubleshoot and solve complex problems. Nice to Haves: AWS certification or Security+ certification. Relevant IT discipline certifications (e.g., Java, .NET, Hadoop, Spring). Cloud Experience: Familiarity with cloud technologies such as Hadoop, HBase, or MongoDB. Independent and Collaborative Worker: Ability to function effectively both independently and in team settings. …
business management information systems) based upon documented requirements for the Data Transport System (DTS) • DTS products include but are not limited to: cloud storage areas: Apache Accumulo (Apache ZooKeeper, Apache Hadoop), Oracle DBMS; real-time streaming: Storm; distributed in-memory data cache/storage: Redis; graph compute engine/query interface: Apache TinkerPop/Gremlin; rules engine: JBoss Drools, Apache Camel … and other federal partners • The DTS portfolio encompasses transport streams, messages and files with content sizes ranging from bytes to terabytes • Candidates should have experience writing analytics using Apache Hadoop, HDFS, and MapReduce • Experience processing large data sets or high-volume data ingest is a plus • Experience monitoring, maintaining and troubleshooting Apache Accumulo, Apache Hadoop, and Apache ZooKeeper …
a Cloud Software Engineer 3 to perform, amongst the Compute Team, within a massively parallel enterprise platform, built with Java on Free and Open-Source Software products including Kubernetes, Hadoop and Accumulo, to enable execution of data-intensive analytics on a managed infrastructure. The selected candidate will be a self-motivated Java developer who proactively completes tasks with a … will be exposed to a variety of technologies depending on customer requirements. Required Skills: Java programming for distributed systems, with experience in networking and multi-threading; Apache Hadoop; Apache Accumulo; Apache NiFi; Agile development experience; well-grounded in Linux fundamentals and knowledge of at least one scripting language (e.g., Python, Ruby, Perl) … of similar scope, type, and complexity is required. Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required. Hadoop/Cloud Developer Certification. The proposed salary range for this position in Maryland is 125,000 to 200,000. Final salary will be determined based on various factors. Our …
Data Solutions in Mission-Critical areas. WE NEED THE BIG DATA ENGINEER TO HAVE: Current DV clearance - Standard or Enhanced. Must have experience with big data tools such as Hadoop, Cloudera or Elasticsearch. Experience with Palantir Foundry is preferred but not essential. Experience working in an Agile Scrum environment. Experience in design, development, test and integration of software IT …
software or modify existing software to add features, implement complex algorithms, and ensure adherence to timing, resource, or interface constraints, including demonstrated experience with MapReduce models and tools like Hadoop, Hive, and Pig. Must be capable of debugging and modifying existing software to correct defects, adapt to new hardware, or improve performance, ensuring integration with the Hadoop Distributed File System …
About The Role Looking for a Cloud System Administrator to work in a cloud platform environment built with Java on Free and Open-Source Software products including Kubernetes, Hadoop and Accumulo, to enable the execution of data-intensive analytics on a managed infrastructure. This position is on the Operations Team that ensures day-to-day operations stability, provides … are expected to provide Tier 1 through 3 support. The candidate should have a strong background in troubleshooting operational issues in a Linux environment. Additional knowledge of Docker, Kubernetes, Hadoop and scripting experience such as Python and Bash is beneficial. Qualifications Education: Seven (7) years of experience is required. Bachelor's Degree in Engineering, Systems Engineering, Computer Science, or Mathematics is … highly desired and will be considered equivalent to two (2) years of experience. Preferred experience: Prometheus, JIRA, Hadoop Distributed File System (HDFS), virtualization, Salt/Ansible, Grafana, OpenStack and AWS. One of the following DOD 8570-compliant certifications is required: AWS Certified Developer - Associate; AWS DevOps Engineer Professional; Certified Kubernetes Application Developer (CKAD); Elastic Certified Engineer …
that support highly distributed, massively parallel computation needs such as HBase, CloudBase/Accumulo, BigTable. Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig. Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS). Shall have demonstrated work experience with serialization such as JSON and/or …
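The serialization experience these listings repeatedly ask for boils down to converting in-memory structures to a wire format and back without loss. A minimal sketch using Python's standard `json` module (the record contents are hypothetical, purely for illustration):

```python
import json

# Hypothetical record of the kind a data-transport pipeline might carry.
record = {"id": 42, "source": "sensor-7", "values": [1.5, 2.5]}

encoded = json.dumps(record)   # serialize: Python dict -> JSON string
decoded = json.loads(encoded)  # deserialize: JSON string -> Python dict

# For JSON-representable types the round trip is lossless.
assert decoded == record
```

In practice the same round-trip discipline applies whether the format is JSON, Avro, or Protocol Buffers; JSON simply trades compactness for human readability.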
us the software solutions of tomorrow! Learn more here: ADMIRAL Technologies - A clear victory for your future! Responsibilities: Administer, monitor and optimize our Big Data environments based on Apache Hadoop (AWS Cloud); manage and maintain services such as Kafka, Flink, NiFi, DynamoDB and Iceberg tables; IaC deployment via Terraform; plan and execute updates and upgrades; advise our Data Engineers and … Data Scientists on the selection of Hadoop services for business use cases; 3rd level support through troubleshooting and error analysis; support workload optimization (e.g. query optimization); evaluation and implementation of … cloud-native services. Profile: Technical education (computer science studies, etc.); experience in system administration of Linux systems (RedHat); expertise in building and operating Big Data environments based on Apache Hadoop clusters; interest in continuously learning and improving one's skillset. We offer: State-of-the-Art Technologies. We understand the usage of state-of-the-art technologies as part …
About The Role Join Peraton's dynamic Operations Team supporting a cutting-edge cloud environment platform built primarily in Java, leveraging Free and Open-Source Software technologies including Kubernetes, Hadoop, and Accumulo. This platform enables the execution of large-scale, data-intensive analytics on a managed infrastructure designed to deliver high performance, scalability, and reliability. Key Responsibilities: Operate and … Tier 1 through Tier 3 technical support, including incident management, troubleshooting, root cause analysis, and problem resolution. Collaborate with cross-functional teams to deploy, monitor, and manage Kubernetes clusters, Hadoop data lakes, and Accumulo databases. Perform daily operational tasks such as system health checks, log analysis, patch management, and performance tuning within a Linux environment. Act as a primary … to understand and support software running in such environments. Hands-on experience with Kubernetes orchestration, container management, and cloud-native application deployment. Knowledge of big data technologies such as Hadoop and Accumulo, including cluster management and data operations. Strong customer service orientation with the ability to communicate complex technical information effectively to diverse stakeholders. Self-motivated and capable of …
solutions. About The Role Looking for a Reliability Engineer to work in a cloud platform environment built with Java on Free and Open-Source Software products including Kubernetes, Hadoop and Accumulo, to enable the execution of data-intensive analytics on a managed infrastructure. This position is on the Operations Team that ensures day-to-day operations stability, provides … are expected to provide Tier 1 through 3 support. The candidate should have a strong background in troubleshooting operational issues in a Linux environment. Additional knowledge of Docker, Kubernetes, Hadoop and scripting experience such as Python and Bash is beneficial. Qualifications Eleven (11) years of experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation … Administrator - Associate; Certified Kubernetes Application Developer (CKAD); Elastic Certified Engineer; Elastic Certified Observability Engineer. Active TS/SCI security clearance with a current polygraph is required. Preferred Qualifications: Prometheus, JIRA, Hadoop Distributed File System (HDFS), virtualization, Salt/Ansible, Grafana, OpenStack and AWS. Peraton offers enhanced benefits to employees working on this critical National Security program, which include heavily subsidized …
enhances complex and diverse Big Data cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in cloud computing and the Hadoop ecosystem, including implementing Java applications, distributed computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews and tests software components for … for a bachelor's degree. A Master's in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of cloud experience. The following cloud-related experiences are required: • Two (2) years of Cloud and/or Distributed Computing …
Puppet and SALT. Education: Bachelor's Degree in Engineering, Systems Engineering, Computer Science, or Mathematics is highly desired and will be considered equivalent to two (2) years of experience. Hadoop/Cloud System Administrator Certification or a comparable Cloud System/Service Certification is desired. Desired: Shall have experience diagnosing and troubleshooting large-scale cloud computing systems, including familiarity with … distributed systems, e.g. Hadoop, Cassandra, Scality, Swift, Gluster, Lustre, GPFS, Amazon S3, or any other comparable technology for big data management or high-performance computing. Demonstrated ability to work independently on complex tasks, and a willingness to educate and train more junior technical resources. Demonstrated ability to plan, communicate, lead and oversee complex technical tasks requiring interaction with …
Cloud systems based upon documented requirements. Directly contributes to all stages of designing, implementing and testing web-based user interfaces that expose data for Big Data cloud-based infrastructure using the Hadoop ecosystem. Provides expertise in cloud technologies, distributed computing and how best to display data produced by these technologies. The Cloud Designer implements graphical web-based user interfaces with usability … college or university is required. A Master's in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of cloud experience. The following cloud-related experiences are required: • Two (2) years of web-based applications that retrieve/…
dynamic environments, enjoys solving complex technical problems, and is passionate about contributing to real-world cybersecurity missions. What You'll Do: You'll work hands-on with large-scale Hadoop and Accumulo clusters, helping maintain the stability, performance, and security of our cloud systems. Your responsibilities will include: Monitoring and maintaining system health across distributed environments. Troubleshooting hardware, software … You Are: You're a proactive, skilled Linux system administrator with a strong background in: Operational troubleshooting of cloud-based systems and large-scale distributed clusters. Tools such as Hadoop and Accumulo, and experience managing cloud infrastructure. Scripting and automation using Bash, Python, or other scripting languages. Configuration management with Puppet (or similar tools). You understand that requirements can shift …