Elasticsearch, MongoDB, Node, and Docker Hands-on experience with AWS cloud US Citizenship and an active TS/SCI with Polygraph security clearance required Desired Qualifications: Experience with Hadoop, MapReduce, Spark, RabbitMQ, and NiFi a plus Experience with Sequelize and Umzug Familiarity with or ability to learn Git, Jenkins, FontAwesome, and Lefthook Bachelor's and/ More ❯
to collaborate with the team, do POCs, and troubleshoot technical issues. Expert in QA processes and tools - able to write test plans and test strategies. Essential Skills Databricks Hadoop Tableau is a big plus Proficiency in Python. Experience in product development. Additional Skills & Qualifications Ability to work effectively within a team environment. Strong analytical and problem-solving skills. More ❯
controls o Experience starting the front-end buildout from scratch by coordinating across multiple business and technology groups o Experience building complex single-page applications using Ab Initio/Hadoop/Hive/Kafka/Oracle and modern MOM technologies o Experienced with Linux/Unix platforms o Experience with SCMs like Git and tools like JIRA o Familiar More ❯
mathematics or equivalent quantitative field - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - Excellent technical publications and material contributions to the CV/ML/AI field as related to image and video processing Our inclusive culture empowers Amazonians to More ❯
neural deep learning methods and machine learning PREFERRED QUALIFICATIONS - Experience with popular deep learning frameworks such as MXNet and TensorFlow - Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience More ❯
risk assessments and remediation efforts Desired Skills & Experience: Experience with Data Management, data warehousing and large data concepts and tools Experience with Oracle, SQL Server, Alteryx, Teradata, DB2, Netezza, Hadoop, SAP eLedger, Rally, Visio, ETL tools and Data Management tools Knowledge of Global Technology, Enterprise Data Management & Enterprise Data Architecture programs, strategies and standards Finance, Regulatory Reporting and/ More ❯
demonstrated work experience with: o Distributed scalable Big Data Stores (NoSQL) such as HBase, CloudBase/Accumulo, BigTable, etc. o The MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. o Hadoop Distributed File System (HDFS) o Serialization such as JSON and/or BSON • 4 years of SWE experience may be substituted for a More ❯
responsibilities include, but are not limited to: • Maintaining, enhancing, and refactoring infrastructure operations code in Python to help automate tasks that ensure systems are properly up and running. • Supporting Hadoop clusters running on Cloudera by applying critical security patches, remediating vulnerabilities, and proactively planning to ensure clusters have sufficient storage. • Familiarity with NiFi, and the ability to partner More ❯
sensitive nature of the data we handle. Required - Proficiency in AWS services and tools related to data storage, processing, and analytics. - Strong experience with big data technologies such as Hadoop, Spark, or similar frameworks. - Active SC clearance is essential for this role. More ❯
fostering an environment where employees can thrive and make a difference. Key Responsibilities: Develop and maintain applications using distributed data storage and parallel computing technologies, including Oracle, Postgres, Cassandra, Hadoop, and Spark. Utilize back-end applications and data integration tools such as Java and Groovy. Create user-facing applications that support mission needs and enhance user experience. Work with More ❯
Collaborate within cross-functional Integrated Product Teams (IPTs) to drive system integration and ensure mission success • Research and implement distributed storage, routing and querying algorithms, leveraging technologies like HDFS, Hadoop, HBase, and Accumulo (BigTable) Desired Qualifications: • Strong background in network systems engineering, with a clear understanding of data routing, security, and optimization • Experience in the integration of COTS and More ❯
of prior relevant experience or a MS degree and 10+ years of prior relevant experience. Preferred Qualifications: • AWS certifications • Experience working with large scale data technologies (e.g. AWS Aurora, Hadoop, Elasticsearch, etc.) • Experience working within a low to high development/deployment environment Benefits and Perks: • Competitive salary and comprehensive benefits package • Commitment to diversity and inclusion in the More ❯
focus and attention to detail. • Knowledge of Spring MVC, Hibernate and Spring frameworks. • Understanding of HTML5, CSS3, JavaScript, AJAX-based programming and jQuery/Hibernate. • Experience with Hadoop, Cassandra, and big data technologies is a plus. SKILLS AND CERTIFICATIONS Java REST Spring Hibernate Additional Information All your information will be kept confidential according to EEO guidelines. Direct Staffing More ❯
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the More ❯
often (in days) to receive an alert: Create Alert Supermicro is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are among the fastest-growing of the Silicon Valley Top 50 technology firms. Our unprecedented global More ❯
in applied research PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience More ❯
CVS) Ability to build applications from source and troubleshoot compiling issues Experience with compilers such as GNU, Intel, and AOCC Storage installation and tuning experience (ZFS, XFS, GPFS, Lustre, Hadoop, Ceph, Object Storage) Shell scripting experience (Bash, Perl, Python) Virtualization experience (VMware, Xen, Hyper-V, KVM, etc.) Experience with the x86 bootstrap process (BIOS, RAID, Fibre Channel, etc.) Experience with More ❯
Analytic exposure is a big plus. Java is a must, but these will strengthen your case: Data analytic development experience Agile development experience Familiarity with/interest in Apache Hadoop MapReduce Python experience AWS Lambda experience Jira experience Confluence experience GitLab experience Exposure or experience with NiFi Willingness/desire to work on high-visibility tasking Willingness/ability More ❯
Cloud systems based upon documented requirements. Directly contributes to all stages of designing, implementing and testing Web-Based User Interfaces that expose data for Big-Data Cloud-Based infrastructure using the Hadoop Eco-System. Provides expertise in Cloud Technologies, Distributed Computing and how to best display data produced by these technologies. Cloud Designer implements Graphical Web-Based User Interface with usability … college or university is required. A Master's in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. The following Cloud-related experiences are required: • Two (2) years of Web-Based applications that retrieves/ More ❯
/MOD or Enhanced DV Clearance. WE NEED THE PYTHON/DATA ENGINEER TO HAVE. Current DV Security Clearance (Standard or Enhanced) Experience with big data tools such as Hadoop, Cloudera or Elasticsearch Python/PySpark experience Experience with Palantir Foundry is nice to have Experience working in an Agile Scrum environment with tools such as Confluence/Jira …/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTICSEARCH/LEAD BIG DATA ENGINEER/LEAD BIG DATA DEVELOPER More ❯
to take their products onto new platforms and into new customer bases. Currently exploring options including RAD Studio, Visual Studio, Delphi, C#, C++, Client/Server, n-tier, Hadoop and SaaS. They require a candidate with a strong computing background. You will be coding in Delphi and other languages. Any similar Object Oriented language (e.g. C++) will be More ❯
is open to all applicants, regardless of age. What you'll need to succeed Familiarity with UNIX Knowledge of CI toolsets Familiarity with SQL, Oracle DB, Postgres, ActiveMQ, Zabbix, Ambari, Hadoop, Jira, Confluence, BitBucket, ActiviBPM, Oracle SOA, Azure, SQL Server, Jenkins, Puppet and other cloud technologies. If successful, you will undergo BPSS clearance and must be eligible for SC Clearance.
Employment Type: Contract
Rate: £300.0 - £315.0 per day + c£315 per day (inside IR35)