System. • Agile Development: Previous exposure to Agile methodologies or similar technologies (training provided if necessary). • Advanced Technologies: Knowledge of big data technologies (HBase, Hadoop), machine learning frameworks (Spark), or orbit dynamics is of interest. Why CGI Secure Space Systems? Join a team that's at the forefront of
Alexandria, Virginia, United States Hybrid / WFH Options
Metronome LLC
changes as needed. Desired Skills Configuration management tools experience a plus (Puppet, Ansible, Chef, etc.) Experience with Big Data technologies is a huge plus (Hadoop, Kafka, Accumulo, Storm, Hortonworks, Cloudera). Experience with Cloud technologies a plus (AWS, Azure) Higher level clearance is desired, but not required. Can consider
in system engineering/architecture. Ten (10) years of experience working with products that support highly distributed, massively parallel computation needs such as HBase, Hadoop, Accumulo, Bigtable, Cassandra, Scality, et cetera. At least seven (7) years of experience writing software scripts using scripting languages such as Perl, Python
Desired Skills & Experience: Experience with Data Management, data warehousing, and large data concepts and tools Experience with Oracle, SQL Server, Alteryx, Teradata, DB2, Netezza, Hadoop, SAP eLedger, Rally, Visio, ETL tools, and Data Management tools Knowledge of Global Technology, Enterprise Data Management & Enterprise Data Architecture programs, strategies, and standards
Reston, Virginia, United States Hybrid / WFH Options
ICF
architecture for technology implementations leveraging different architectural views. Exposure to common industry platforms and programming languages - Appian BPM, IBM WebSphere, Mule, LAMP/JBoss, Hadoop, Java, Microsoft/.NET is preferred. 3+ years of experience in Enterprise Application integration (SOA, ESB) and n-tier client-server architectures preferred Experience
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc.
limited to: • Maintaining, enhancing, and refactoring infrastructure operations code in Python to help automate tasks that ensure systems are properly up and running. • Supporting Hadoop clusters running on Cloudera by applying critical security patches, remediating vulnerabilities, and proactively planning to ensure clusters have sufficient storage. • Familiarity with NiFi
issues. Looking for developers with experience in web development, middleware development, or database development and the following technologies (but not limited to): Java, J2EE, Hadoop, MapReduce, Pig, NoSQL, Ant, Axis, EJB, JavaScript, JAXP, JDBC, Oracle, PL/SQL, Unix scripts, XML, HTML, Struts, Servlets, Eclipse IDE, WebLogic, web services
virtualization • Experience with NAGIOS, Puppet, TENTAKEL, or IPMI • Experience with SECSCN, WASSP, NESSUS • Experience implementing automated patch management of Linux and Windows • Experience with Hadoop system administration • Experience with high performance Linux clusters • Experience with source code control systems • Experience with the AWS API command line • Experience with Puppet/
can thrive and make a difference. Key Responsibilities: Develop and maintain applications using distributed data storage and parallel computing technologies, including Oracle, Postgres, Cassandra, Hadoop, and Spark. Utilize back-end applications and data integration tools such as Java and Groovy. Create user-facing applications that support mission needs and
Demonstrated experience developing, implementing, and maintaining security support structures utilizing Linux and Windows technologies. • Demonstrated experience developing, implementing, and maintaining security support structures for Hadoop ecosystems. • Demonstrated experience assisting with the design, engineering, and development of highly secure systems built upon Amazon Web Services. • Demonstrated experience auditing and monitoring
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting
Knowledge of Spring MVC, Hibernate, and Spring frameworks. Understanding of HTML5, CSS3, JavaScript, AJAX-based programming, and jQuery/Hibernate. Experience in Hadoop, Cassandra, and Big Data technologies is a plus. SKILLS AND CERTIFICATIONS Java REST Spring Hibernate Additional Information All your information will be kept confidential according
machine learning. Familiarity with modeling tools like R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy. Experience with large-scale distributed systems such as Hadoop, Spark. Amazon is an equal opportunity employer. For accommodations during the application process, visit
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - PhD in math/statistics/engineering or other equivalent quantitative discipline - Experience with conducting research in a corporate setting - Experience in
issues. What We Value Experience with monitoring systems using tools like Prometheus and writing health checks Interest in learning and managing technologies like Spark, Hadoop, Elasticsearch, and Cassandra Familiarity with deploying GPUs Moderate experience with TCP/IP networking Ability to work independently with minimal supervision Ability to travel
the front-end buildout from scratch by coordinating across multiple business and technology groups o Experience building complex single-page applications using Ab Initio/Hadoop/Hive/Kafka/Oracle and modern MOM technologies o Experience with the Linux/Unix platform o Experience with SCMs like Git; and
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. PhD in math/statistics/engineering or other equivalent quantitative discipline Experience with conducting research in a corporate setting Experience in
from source and troubleshoot compiling issues Experience with compilers such as GNU, Intel, and AOCC Storage installation and tuning experience (ZFS, XFS, GPFS, Lustre, Hadoop, Ceph, Object Storage) Shell scripting experience (Bash, Perl, Python) Virtualization experience (VMware, Xen, Hyper-V, KVM, etc.) Experience with the x86 bootstrap process (BIOS, RAID
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown Asset Management Limited
in production-grade code, with a focus on scalability and reliability. Experience with large-scale data analysis, manipulation, and distributed computing platforms (e.g., Hive, Hadoop). Familiarity with advanced machine learning methods, including neural networks, reinforcement learning, and other cutting-edge Gen AI approaches. Skilled in API development and
Employment Type: Permanent, Part Time, Work From Home
Honeywell Connected Plant, Emerson Plantweb/AMS, GE/Meridium APM, Aveva, Bentley, and OSIsoft PI Familiarity with relevant technology, such as Big Data (Hadoop, Spark, Hive, BigQuery); Data Warehouses; Business Intelligence; and Machine Learning Savvy at helping customers create business cases with quantified ROI to justify new investments
Degree Required technical and professional expertise Design, develop, and maintain Java-based applications for processing and analyzing large datasets, utilizing frameworks such as Apache Hadoop, Spark, and Kafka. Collaborate with cross-functional teams to define, design, and ship data-intensive features and services. Optimize existing data processing pipelines for … Information Technology, or a related field, or equivalent experience. Experience in Big Data Java development. In-depth knowledge of Big Data frameworks, such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing
Joining the BlueHalo, an AV Company, team means immersing yourself in a dynamic environment, working alongside the brightest minds in technology through some of the toughest challenges facing our nation today. We are spearheading the future of global defense, with
The job you're considering: The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen