…programming languages such as SQL, Java, Python, and/or Ruby; Knowledge of virtual networks and general network management functions; Cloud database management skills and knowledge of MySQL and Hadoop. Technical Responsibilities: Support working with and integrating open-source frameworks/products; experience deploying various open-source packages and/or application stacks into the Customer's production landscape; multi…
…/Portals, Workflow, Enterprise Service Bus (ESB), Web Services, Message Brokers, Relational Databases. Exposure to common industry platforms and programming languages (Appian BPM, IBM WebSphere, Mule, LAMP/JBoss, Hadoop, Java, Microsoft/.NET) is preferred. Master's degree in Computer Science or other related field. Experience working with Appian Sites. Java developer with Maven build experience is a strong…
…year of experience with implementing code that interacts with Cloud Distributed Coordination Frameworks. Two (2) years of experience with programs utilizing Big Data Cloud technologies. CERTIFICATIONS: Cloudera Certified Hadoop Developer, CompTIA Cloud Plus (Cloud+), or AWS and Microsoft Azure foundational- or fundamental-level certifications may be substituted for one (1) year of Cloud experience. AWS and Microsoft Azure Associate, Expert…
…an asset: Testing performance with JMeter or similar tools; Web services technologies such as REST, JSON, or Thrift; Testing web applications with Selenium WebDriver; Big data technologies such as Hadoop, MongoDB, Kafka, or SQL; Network principles and protocols such as HTTP, TLS, and TCP; Continuous integration systems such as Jenkins or Bamboo. What you can expect: At Global Relay…
…enhances complex and diverse Big-Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, the Hadoop ecosystem (including implementing Java applications), Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews and tests software components for … for a bachelor's degree. A Master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. A Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. The following Cloud-related experiences are required: • Two (2) years of Cloud and/or Distributed Computing…
You will play a key role within a small data science team. The client is looking for hands-on experience developing solutions for complex data science problems using Python, R, Hadoop, and Greenplum (or other Massively Parallel Processing solutions). REQUIRED SKILLS: Bachelor's degree in a quantitative or technical field of study, such as Statistics, Mathematics, Computer Science, or…
…statistical knowledge; Excellent communication and decision-making skills; Exposure to working with REST APIs. Any of the following skills would be an added bonus: Has run code across Hadoop/MapReduce clusters; Has code running in a production environment; Has used SAS before (or at least can decipher SAS code); Has worked with very large data sets before (billions of…
A Bachelor's degree in Engineering, Systems Engineering, Computer Science, or Mathematics is highly desired and may be substituted for two (2) years of experience. • Certifications: Must possess a Hadoop/Cloud System Administrator certification or an equivalent Cloud System/Service certification that is DoD 8570 compliant, such as: AWS Certified Developer - Associate; AWS DevOps Engineer - Professional; Certified Kubernetes…
Tec Partners (City of London, London, United Kingdom; Hybrid/WFH options):
…Ansible. Comfortable working with cloud platforms (e.g., AWS, Azure, GCP) and container orchestration tools like Kubernetes. Excellent troubleshooting skills and a collaborative approach. Bonus: Experience with Cloudera, Hadoop, CDSW, or CML is a plus. What's on Offer: Flexible hybrid working arrangements; Core benefits including private healthcare, dental, life assurance and pension; Optional benefits including health cash…
…with rich textual content; Experience of Java programming and can independently prototype solutions to problems; Experience with Recommender Systems, NLP, and Machine Learning libraries; Experience with big data technologies (e.g., Hadoop, MapReduce, Cascading, Scalding, Scala) is desirable but not required; Unix skills; Experience with start-up and R&D environments; Strong presentation skills in communicating with experts and novices; Language…
…Big Data projects developing in Kotlin or any other JVM language. Experience with any of the following: Oracle, Kubernetes/OpenShift, Redis, Memcached. Experience with Big Data technologies like Hadoop, Cassandra, Hive. We offer: Opportunity to work on cutting-edge projects; Work with a highly motivated and dedicated team; Competitive daily rate; Benefits package including medical insurance and sports…
…Elasticsearch, MongoDB, Node, and Docker; Hands-on experience with the AWS cloud; US Citizenship and an active TS/SCI with Polygraph security clearance required. Desired Qualifications: Experience with Hadoop, MapReduce, Spark, RabbitMQ, and NiFi a plus; Experience with Sequelize and Umzug; Familiarity with or ability to learn Git, Jenkins, FontAwesome, and LeftHook; Bachelor's and/or…
…to collaborate with the team, do POCs, and troubleshoot technical issues. Expert in QA processes and tools; able to write test plans and test strategies. Essential Skills: Databricks; Hadoop; Tableau is a big plus; Proficiency in Python; Experience in product development. Additional Skills & Qualifications: Ability to work effectively within a team environment; Strong analytical and problem-solving skills.
…mathematics or equivalent quantitative field - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - Excellent technical publications and material contributions to the CV/ML/AI field as related to image and video processing. Our inclusive culture empowers Amazonians to…
…analysis to support commercial activities of a product business • Strong experience with analytical solutions designed for payments or related financial services sectors preferred • Well-versed in Excel, Python, Hadoop, Tableau, and related analytics tools, with experience running analysis on large datasets • Strategic mindset to solve complex and ambiguous problems • Highly organized and able to deal with multiple and…
…risk assessments and remediation efforts. Desired Skills & Experience: Experience with data management, data warehousing, and large-data concepts and tools; Experience with Oracle, SQL Server, Alteryx, Teradata, DB2, Netezza, Hadoop, SAP eLedger, Rally, Visio, ETL tools, and data management tools; Knowledge of Global Technology, Enterprise Data Management & Enterprise Data Architecture programs, strategies, and standards; Finance, Regulatory Reporting, and/or…
…nature of the work, you must hold enhanced DV Clearance. WE NEED THE DATA ENGINEER TO HAVE: Current enhanced DV Security Clearance; Experience with big data tools such as Hadoop, Cloudera, or Elasticsearch; Experience with Palantir Foundry; Experience working in an Agile Scrum environment with tools such as Confluence/Jira; Experience in design, development, test, and integration of …
…demonstrated work experience with: o Distributed scalable Big Data Stores (NoSQL) such as HBase, CloudBase/Accumulo, BigTable, etc. o The MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. o The Hadoop Distributed File System (HDFS) o Serialization formats such as JSON and/or BSON • 4 years of SWE experience may be substituted for a…
…responsibilities include, but are not limited to: • Maintaining, enhancing, and refactoring infrastructure operations code in Python to help automate tasks that ensure systems are properly up and running. • Supporting Hadoop clusters running on Cloudera by applying critical security patches, remediating vulnerabilities, and proactively planning to ensure clusters have sufficient storage. • Familiarity with NiFi, and the ability to partner…
…enterprise-level systems; Excellent object-oriented design skills, including OOA/OOD; Experience with multi-tier architectures and service-oriented architecture; Exposure to and understanding of RDBMS, NoSQL, and Hadoop is desirable; Knowledge of the software development lifecycle and agile practices, including TDD/BDD; Strategic thinking, collaboration, and consensus-building skills. Please note: Familiarity with DevOps is important…
…sensitive nature of the data we handle. Required: - Proficiency in AWS services and tools related to data storage, processing, and analytics. - Strong experience with big data technologies such as Hadoop, Spark, or similar frameworks. - Active SC clearance is essential for this role.
…fostering an environment where employees can thrive and make a difference. Key Responsibilities: Develop and maintain applications using distributed data storage and parallel computing technologies, including Oracle, Postgres, Cassandra, Hadoop, and Spark. Utilize back-end applications and data integration tools such as Java and Groovy. Create user-facing applications that support mission needs and enhance user experience. Work with…
…Collaborate within cross-functional Integrated Product Teams (IPTs) to drive system integration and ensure mission success • Research and implement distributed storage, routing, and querying algorithms, leveraging technologies like HDFS, Hadoop, HBase, and Accumulo (BigTable). Desired Qualifications: • Strong background in network systems engineering, with a clear understanding of data routing, security, and optimization • Experience in the integration of COTS and…