… protection directions. At least one successful cross-system, full-link data security hardening implementation. Familiarity with at least one language, such as Java, Go, or Python, and knowledge of the Hadoop/Spark ecosystem in the big data domain. Preference for candidates with multinational corporate experience in data security and over 3 years of experience in international data security and encryption …
… NN, Naive Bayes, Random Forests, etc. - Experience with common data science toolkits, such as Python - Proficiency in query languages such as SQL on a big data platform (e.g., Hadoop, Hive) - Good applied statistics skills, such as distributions, statistical testing, and regression - Good scripting and programming skills. It would be desirable for the successful candidate to come from a …
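As a concrete illustration of the toolkit familiarity this listing asks for, here is a minimal sketch in Python that fits two of the named classifier families (Naive Bayes and Random Forests) with scikit-learn; the dataset and parameters are illustrative choices, not anything specified in the listing:

    # Minimal sketch: fit two of the classifier families named above
    # (Naive Bayes, Random Forest) on a bundled toy dataset.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=42)

    for model in (GaussianNB(),
                  RandomForestClassifier(n_estimators=100, random_state=42)):
        model.fit(X_train, y_train)
        print(type(model).__name__, accuracy_score(y_test, model.predict(X_test)))

The held-out test split is what turns the accuracy printout into the kind of "applied statistics" check (out-of-sample evaluation) the listing alludes to.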
… Must Have; Hive/SQL - Must Have. Job Description: Scala/Spark. A strong Big Data candidate with the following skillset: Spark; Scala; Hive/HDFS/HQL; a Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.). Experience with Big Data technologies; real-time data processing platform (Spark Streaming) experience would be an advantage. Consistently demonstrates clear and concise written …
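To make the "real-time data processing (Spark Streaming)" requirement above concrete, here is a minimal, hypothetical word-count sketch using the classic DStream API; the socket source, host, and port are placeholders, and the same pattern translates directly to the Scala API the listing centres on:

    # Minimal sketch: streaming word count over 1-second micro-batches.
    # Assumes a text source on localhost:9999, e.g. started with `nc -lk 9999`.
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    sc = SparkContext("local[2]", "NetworkWordCount")
    ssc = StreamingContext(sc, 1)  # 1-second batch interval

    lines = ssc.socketTextStream("localhost", 9999)
    counts = (lines.flatMap(lambda line: line.split(" "))
                   .map(lambda word: (word, 1))
                   .reduceByKey(lambda a, b: a + b))
    counts.pprint()  # print each micro-batch's counts

    ssc.start()
    ssc.awaitTermination()

The micro-batch model is what distinguishes classic Spark Streaming from record-at-a-time engines: each 1-second batch is processed as an ordinary Spark job.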
… scalable Big Data store (NoSQL) such as HBase, CloudBase/Accumulo, BigTable, etc.; shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); shall have demonstrated work experience with serialization such as JSON and/or …
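For the MapReduce programming model this listing (and several others here) requires, a minimal sketch via Hadoop Streaming shows the shape of the model: a mapper emits key-value pairs, the framework shuffles and sorts them by key, and a reducer aggregates each key's values. The file names are illustrative:

    # mapper.py -- emits one "word<TAB>1" line per input token.
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

    # reducer.py -- Hadoop Streaming sorts mapper output by key, so each
    # word's counts arrive contiguously and can be summed in one pass.
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")

The sorted-by-key guarantee between the map and reduce phases is what makes the single-pass summation valid; both scripts would be submitted to a cluster via the hadoop-streaming jar, whose exact path and invocation vary by distribution.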
… a relevant discipline such as Computer Science, Statistics, Applied Mathematics, or Engineering - Strong experience with Python and R - A strong understanding of a number of tools across the Hadoop ecosystem, such as Spark, Hive, Impala, and Pig - Expertise in at least one specific data science area, such as text mining, recommender systems, pattern recognition, or regression models - Previous …
… excellent project management skills and technical experience to ensure successful delivery of complex data projects on time and within budget. Responsibilities: Lead and manage legacy data platform migrations (Teradata, Hadoop), data lake builds, and data analytics projects from initiation to completion. Develop comprehensive project plans, including scope, timelines, resource allocation, and budgets. Monitor project progress, identify risks, and implement …
Greater London, England, United Kingdom Hybrid / WFH Options
Quant Capital
… multi-threaded Java – if you haven't been siloed in a big firm then don't worry. Additional exposure to the following is desired: Tech stack you will learn: Hadoop and Flink; Rust, JavaScript, React, Redux, Flow; Linux, Jenkins; Kafka, Avro, Kubernetes, Puppet. Involvement in the Java community. My client is based in London. Home working is encouraged but you …
… degree or above in computer science, machine learning, engineering, or related fields - PhD in computer science, computer engineering, or a related field - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - Fundamentals across a broad set of ML approaches and techniques - Fundamentals of problem solving and algorithm design - Interest in learning, researching, and creating new technologies with commercial …
… modelling and techniques. Preferred Qualifications: Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience …
… programming languages such as SQL, Java, Python, and/or Ruby. Knowledge of virtual networks and general network management functions. Cloud database management skills and knowledge of MySQL and Hadoop. Technical Responsibilities: Support working with and integrating open-source frameworks/products; experience deploying various open-source packages and/or application stacks into the Customer's production landscape; multi …
… solutions to ensure design constraints are met by the software team. Ability to initiate and implement ideas to solve business problems. Preferred qualifications, capabilities, and skills: Knowledge of HDFS, Hadoop, Databricks; knowledge of Airflow, Control-M; familiarity with containers and container orchestration such as ECS, Kubernetes, and Docker; familiarity with troubleshooting common networking technologies and issues. About Us: J.P. …
… clearly and create trust with stakeholders. Preferred qualifications, capabilities, and skills: Experience designing/implementing pipelines using DAGs (e.g., Kubeflow, DVC, Ray); experience with big data technologies (e.g., Spark, Hadoop); has constructed batch and streaming microservices exposed as REST/gRPC endpoints (see the sketch below); familiarity with GraphQL. About Us: J.P. Morgan is a global leader in financial services, providing strategic advice …
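As an illustration of the "microservices exposed as REST endpoints" requirement above, here is a minimal, hypothetical sketch using Flask; the route, port, and placeholder scoring logic are all assumptions for illustration, not anything specified in the listing:

    # Minimal sketch: a scoring microservice exposed as a REST endpoint.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/score", methods=["POST"])
    def score():
        payload = request.get_json(force=True)
        features = payload.get("features", [])
        # Placeholder logic: a real service would load a trained model here.
        return jsonify({"score": sum(features)})

    if __name__ == "__main__":
        app.run(port=8080)

A gRPC variant would swap the HTTP route for a generated service stub, and the streaming case would typically consume from a broker rather than accept requests.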
… /Portals, Workflow, Enterprise Service Bus (ESB), Web Services, Message Brokers, Relational Databases. Exposure to common industry platforms and programming languages - Appian BPM, IBM WebSphere, Mule, LAMP/JBoss, Hadoop, Java, Microsoft/.NET - is preferred. Master's degree in Computer Science or another related field. Experience working with Appian Sites. A Java developer with Maven build experience is a strong …
… year of experience with implementing code that interacts with Cloud Distributed Coordination Frameworks. Two (2) years of experience with programs utilizing Big Data Cloud technologies. CERTIFICATIONS: A Cloudera Certified Hadoop Developer, CompTIA Cloud+ (Cloud Plus), or AWS or Microsoft Azure foundational/fundamental-level certification may be substituted for one (1) year of Cloud experience. AWS and Microsoft Azure Associate, Expert …
… an asset: testing performance with JMeter or similar tools; web services technologies such as REST, JSON, or Thrift; testing web applications with Selenium WebDriver; big data technologies such as Hadoop, MongoDB, Kafka, or SQL; network principles and protocols such as HTTP, TLS, and TCP; continuous integration systems such as Jenkins or Bamboo. What you can expect: At Global Relay …
… enhances complex and diverse Big Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing and the Hadoop ecosystem, including implementing Java applications, Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews and tests software components for … for a bachelor's degree. A Master's in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. A Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. The following Cloud-related experiences are required: • Two (2) years of Cloud and/or Distributed Computing …
You will play a key role within a small data science team. The client is looking for hands-on experience developing solutions for complex data science problems using Python, R, Hadoop, and Greenplum (or other Massively Parallel Processing solutions). REQUIRED SKILLS: Bachelor's degree in a quantitative or technical field of study, such as Statistics, Mathematics, Computer Science, or …
… statistical knowledge; excellent communication and decision-making skills; exposure to working with REST APIs. Any of the following skills would be an added bonus: has run code across Hadoop/MapReduce clusters; has code running in a production environment; has used SAS before (or at least can decipher SAS code); has worked with very large data sets before (billions of …
A Bachelor's degree in Engineering, Systems Engineering, Computer Science, or Mathematics is highly desired and may be substituted for two (2) years of experience. • Certifications: Must possess a Hadoop/Cloud System Administrator certification or an equivalent Cloud System/Service certification that is DoD 8570 compliant, such as: AWS Certified Developer - Associate; AWS Certified DevOps Engineer - Professional; Certified Kubernetes …
City of London, London, United Kingdom Hybrid / WFH Options
Tec Partners
… Ansible. Comfortable working with cloud platforms (e.g., AWS, Azure, GCP) and container orchestration tools like Kubernetes. Excellent troubleshooting skills and a collaborative approach. Bonus: experience with Cloudera, Hadoop, CDSW, or CML is a plus. What's on Offer: Flexible hybrid working arrangements; core benefits including private healthcare, dental, life assurance, and pension; optional benefits including health cash …
… with rich textual content. Experience of Java programming; can independently prototype solutions to problems. Experience with recommender systems, NLP, and machine learning libraries. Experience with big data technologies (e.g., Hadoop, MapReduce, Cascading, Scalding, Scala) is desirable but not required. Unix skills. Experience with start-up and R&D environments. Strong presentation skills in communicating with experts and novices. Language …
… Big Data projects developing in Kotlin or any other JVM language. Experience with any of the following: Oracle, Kubernetes/OpenShift, Redis, Memcached. Experience with Big Data technologies like Hadoop, Cassandra, Hive. We offer: the opportunity to work on cutting-edge projects; work with a highly motivated and dedicated team; a competitive daily rate; a benefits package including medical insurance and sports …
… Elasticsearch, MongoDB, Node, and Docker. Hands-on experience with the AWS cloud. US citizenship and an active TS/SCI with Polygraph security clearance required. Desired Qualifications: Experience with Hadoop, MapReduce, Spark, RabbitMQ, and NiFi a plus. Experience with Sequelize and Umzug. Familiarity with, or ability to learn, Git, Jenkins, FontAwesome, and LeftHook. Bachelor's and/ …