a relevant discipline such as Computer Science, Statistics, Applied Mathematics, or Engineering - Strong experience with Python and R - A strong understanding of a number of the tools across the Hadoop ecosystem such as Spark, Hive, Impala & Pig - Expertise in at least one specific data science area such as text mining, recommender systems, pattern recognition or regression models - Previous
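Regression models, one of the data science areas the listing above names, can be illustrated with a minimal sketch. This is a pure-Python ordinary least squares fit for simple linear regression on hypothetical data; in practice tools such as R or scikit-learn would be used.

```python
# Minimal simple linear regression via ordinary least squares (pure Python).
# Data and function name are illustrative, not from any specific listing.

def fit_simple_ols(xs, ys):
    """Return (slope, intercept) minimising squared error for y ~ slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form OLS solution for one predictor.
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = fit_simple_ols([1, 2, 3, 4], [3, 5, 7, 9])  # data follows y = 2x + 1
```

The closed-form solution works here because there is a single predictor; multiple regression would require solving the normal equations or using a library.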
degree or above in computer science, machine learning, engineering, or related fields - PhD in computer science, computer engineering, or related field - Experience with large-scale distributed systems such as Hadoop, Spark etc. - Fundamentals of a broad set of ML approaches and techniques - Fundamentals of problem solving and algorithm design - Interest in learning, researching, and creating new technologies with commercial
modelling and techniques Preferred Qualifications Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy etc. Experience with large-scale distributed systems such as Hadoop, Spark etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience
and breadth of knowledge in machine learning, data mining and statistics. Traffic quality systems process billions of ad impressions and clicks per day by leveraging leading open-source technologies like Hadoop, Spark, Redis and Amazon's cloud services like EC2, S3, EMR, DynamoDB and Redshift. The candidate should have reasonable programming and design skills to manipulate unstructured and big data
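The core of the traffic-quality work described above is rolling up raw impression and click events per ad. A pure-Python stand-in for that aggregation is sketched below with hypothetical event records; at billions of events per day the same logic would run as a Spark job on EMR.

```python
# Sketch of the per-ad roll-up a traffic-quality pipeline performs:
# count impressions and clicks per ad, then compute a click-through rate.
# Event data is hypothetical; real pipelines would stream this through Spark.
from collections import defaultdict

events = [  # (ad_id, event_type) pairs, illustrative only
    ("ad1", "impression"), ("ad1", "impression"), ("ad1", "click"),
    ("ad2", "impression"), ("ad2", "impression"),
]

counts = defaultdict(lambda: {"impression": 0, "click": 0})
for ad_id, kind in events:
    counts[ad_id][kind] += 1

# CTR = clicks / impressions per ad.
ctr = {ad: c["click"] / c["impression"] for ad, c in counts.items()}
```

The dictionary roll-up mirrors what a `groupBy`/aggregate would do in Spark, just without the distribution across a cluster.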
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy etc. - Experience with large-scale distributed systems such as Hadoop, Spark etc. - Master's degree in math/statistics/engineering or other equivalent quantitative discipline, or PhD
programming languages such as SQL, Java, Python, and/or Ruby Knowledge of virtual networks and general network management functions Cloud database management skills and knowledge of MySQL and Hadoop Technical Responsibilities: Support working with and integrating open-source frameworks/products, experience deploying various open-source packages and/or application stacks into the Customer's production landscape, multi
with Temporal or similar orchestration tools. • Messaging & Streaming: Exposure to RabbitMQ or similar message/streaming broker technologies. • Advanced Technologies: Interest or experience in big data technologies (e.g., HBase, Hadoop), machine learning frameworks (e.g., Spark), and orbit dynamics. Why Join Us? • Innovative Environment: Be part of projects at the cutting edge of space systems and security. • Agile Culture: Work
of influencing C-suite executives and driving organizational change • Bachelor's degree, or 7+ years of professional or military experience • Experience in technical design, architecture and databases (SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) • Experience implementing serverless distributed solutions • Software development experience with object-oriented languages and deep expertise in AI/ML PREFERRED QUALIFICATIONS • Proven ability to shape market
more of the following areas: Software Design or Development, Content Distribution/CDN, Scripting/Automation, Database Architecture, Cloud Architecture, Cloud Migrations, IP Networking, IT Security, Big Data/Hadoop/Spark, Operations Management, Service Oriented Architecture etc. - Experience in a 24x7 operational services or support environment. - Experience with AWS Cloud services and/or other Cloud offerings. Our
an asset: Testing performance with JMeter or similar tools Web services technology such as REST, JSON or Thrift Testing web applications with Selenium WebDriver Big data technology such as Hadoop, MongoDB, Kafka or SQL Network principles and protocols such as HTTP, TLS and TCP Continuous integration systems such as Jenkins or Bamboo What you can expect: At Global Relay
enhances complex and diverse Big-Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, the Hadoop ecosystem including implementing Java applications, Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews and tests software components for … for a bachelor's degree. Master's in Computer Science or related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. The following Cloud-related experiences are required: • Two (2) years of Cloud and/or Distributed Computing
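Information Retrieval, named in the listing above, centres on the inverted index: a map from each term to the documents containing it. A minimal sketch with hypothetical documents follows; production systems add tokenisation, relevance scoring, and sharding across a cluster.

```python
# Minimal inverted index, the core IR data structure.
# Documents are hypothetical; real indexes are built at much larger scale.
docs = {
    1: "cloud computing with hadoop",
    2: "distributed cloud systems",
}

index = {}
for doc_id, text in docs.items():
    for term in text.split():            # naive whitespace tokenisation
        index.setdefault(term, set()).add(doc_id)

def search(term):
    """Return the sorted list of document ids containing the term."""
    return sorted(index.get(term, set()))
```

Lookup is then a dictionary access rather than a scan over every document, which is what makes retrieval over large corpora feasible.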
You will play a key role within a small data science team. Client is looking for hands-on experience developing solutions for complex data science problems using Python, R, Hadoop, and Greenplum (or other Massively Parallel Processing solutions). REQUIRED SKILLS: Bachelor's Degree in a quantitative or technical field of study, such as Statistics, Mathematics, Computer Science, or
statistical knowledge Excellent communication and decision-making skills Exposure to working with REST APIs Any of the following skills would be an added bonus: Has run code across Hadoop/MapReduce clusters Has code running in a production environment Used SAS before (or at least can decipher SAS code) Worked with very large data sets before (billions of
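Running code across Hadoop/MapReduce clusters, mentioned above, follows a map-then-reduce shape. The sketch below is a Hadoop Streaming-style word count in pure Python, driven locally to show the shape; on a real cluster the mapper and reducer would each read lines from stdin, with the framework sorting between them.

```python
# Hadoop Streaming-style word count: mapper emits (word, 1) pairs,
# reducer sums counts per word. Input lines here are illustrative.
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    # Hadoop sorts mapper output by key before the reduce phase;
    # sorted() stands in for that shuffle step in this local run.
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

result = dict(reducer(mapper(["big data big clusters", "data"])))
```

The same two functions could be submitted unchanged to Hadoop Streaming (reading stdin, writing tab-separated pairs), which is the appeal of the model.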
City of London, London, United Kingdom Hybrid / WFH Options
Tec Partners
Ansible. Comfortable working with cloud platforms (e.g., AWS, Azure, GCP) and container orchestration tools like Kubernetes. Excellent troubleshooting skills and a collaborative approach. Bonus: Experience with Cloudera, Hadoop, CDSW, or CML is a plus. What's on Offer: Flexible hybrid working arrangements Core benefits including private healthcare, dental, life assurance and pension Optional benefits including health cash
with rich textual content Experience of Java programming; can independently prototype solutions to problems Experience with Recommender Systems, NLP and Machine Learning libraries Experience with big data technologies (e.g. Hadoop, MapReduce, Cascading, Scalding, Scala) is desirable but not required Unix skills Experience with start-up and R&D environments Strong presentation skills in communicating with experts and novices Language
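Recommender systems, listed above, can be reduced to a tiny co-occurrence sketch: suggest items liked by users whose tastes overlap with yours. The data and scoring below are hypothetical; libraries such as Spark MLlib provide scalable versions (e.g. ALS matrix factorisation).

```python
# Tiny user-overlap recommender: score unseen items by how many
# taste-sharing users like them. All data is illustrative.
from collections import Counter

likes = {  # user -> set of liked items (hypothetical)
    "u1": {"a", "b"},
    "u2": {"a", "c"},
    "u3": {"b", "c"},
}

def recommend(user):
    seen = likes[user]
    scores = Counter()
    for other, items in likes.items():
        if other != user and seen & items:  # taste overlaps with this user
            scores.update(items - seen)     # vote for items the user hasn't seen
    return [item for item, _ in scores.most_common()]
```

Real systems replace the raw overlap with similarity weights (cosine, Jaccard) and factorised latent features, but the collaborative-filtering intuition is the same.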
Big Data projects developing in Kotlin or any other JVM language. Experience with any of the following: Oracle, Kubernetes/OpenShift, Redis, Memcached. Experience with Big Data technologies like Hadoop, Cassandra, Hive. We offer Opportunity to work on cutting-edge projects. Work with a highly motivated and dedicated team. Competitive daily rate. Benefits package including medical insurance and sports
controls o Experience on starting the front-end buildout from scratch by coordinating across multiple business and technology groups o Experience building complex single-page applications using Ab Initio/Hadoop/Hive/Kafka/Oracle and modern MOM technologies o Experience with the Linux/Unix platform o Experience in SCMs like Git and tools like JIRA o Familiar
mathematics or equivalent quantitative field - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy etc. - Experience with large-scale distributed systems such as Hadoop, Spark etc. - Excellent technical publications and material contributions to the CV/ML/AI field as related to image and video processing Our inclusive culture empowers Amazonians to
as a Product Owner, ideally working within a role in a data-rich environment or similar. Solid understanding of data engineering concepts, including data pipelines, cloud platforms (e.g. GCP, Hadoop, Spark), and data governance. Proven ability to drive delivery in agile environments, handling competing priorities and tight deadlines. Excellent collaboration and leadership skills, with the ability to influence across technical
neural deep learning methods and machine learning PREFERRED QUALIFICATIONS - Experience with popular deep learning frameworks such as MXNet and TensorFlow - Experience with large-scale distributed systems such as Hadoop, Spark etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience
in applied research PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy etc. - Experience with large-scale distributed systems such as Hadoop, Spark etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience
and/or experience using analytical tools: Python, PHP, JavaScript, Java, Relational databases (Postgres, MS SQL, Oracle, MySQL, etc.), SAS PROC SQL, Hue Database Assistant, Teradata and non-relational Hadoop Some other highly valued skills may include: Advanced knowledge of malicious attack vectors used by cyber fraud adversaries Knowledge of Enterprise security frameworks such as NIST Cybersecurity Framework and