with rich textual content. Experience with Java programming; able to independently prototype solutions to problems. Experience with recommender systems, NLP, and machine learning libraries. Experience with big data technologies (e.g. Hadoop, MapReduce, Cascading, Scalding, Scala) is desirable but not required. Unix skills. Experience with start-up and R&D environments. Strong presentation skills in communicating with experts and novices.
of prior relevant experience, or an MS degree and 10+ years of prior relevant experience.
Preferred Qualifications:
• AWS certifications
• Experience working with large-scale data technologies (e.g. AWS Aurora, Hadoop, Elasticsearch, etc.)
• Experience working within a low-to-high development/deployment environment
Benefits and Perks:
• Competitive salary and comprehensive benefits package
• Commitment to diversity and inclusion in the
techniques for LLMs
PREFERRED QUALIFICATIONS
- Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc.
- Experience with large-scale distributed systems such as Hadoop, Spark, etc.
- PhD in math/statistics/engineering or another equivalent quantitative discipline
- Experience conducting research in a corporate setting
- Experience in patents or publications at top
Should ideally have knowledge of modeling techniques (logit, GLM, time series, decision trees, random forests, clustering), statistical programming languages (SAS, R, Python, MATLAB), and big data tools and platforms (Hadoop, Hive, etc.). Solid academic record. Strong computer skills. Knowledge of other languages is desirable. Get-up-and-go attitude, maturity, responsibility, and a strong work ethic. Strong ability to
resources here to help you develop into a better-rounded professional.
BASIC QUALIFICATIONS
- 7+ years of technical specialist, design, and architecture experience
- 5+ years of database (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience
- 7+ years of experience consulting on, designing, and implementing serverless distributed solutions
- 5+ years of software development experience with an object-oriented language
- 3+ years of
CVS). Ability to build applications from source and troubleshoot compilation issues. Experience with compilers such as GNU, Intel, and AOCC. Storage installation and tuning experience (ZFS, XFS, GPFS, Lustre, Hadoop, Ceph, object storage). Shell scripting experience (Bash, Perl, Python). Virtualization experience (VMware, Xen, Hyper-V, KVM, etc.). Experience with the x86 bootstrap process (BIOS, RAID, Fibre Channel, etc.). Experience with
such as weakly supervised learning, reinforcement learning, semantic search, knowledge-graph construction
- Authored research publications, participation in ML competitions, working demos/repos
- Experience with distributed computational frameworks (Spark, Hadoop, Kubernetes, Databricks)
- Curiosity to learn more about the legal, tax, or government domain (prior experience is not required)
- Familiarity with traditional statistical methods, modern deep learning frameworks, prompting LLMs
Analytic exposure is a big plus. Java is a must, but these will strengthen your case:
- Data analytic development experience
- Agile development experience
- Familiarity with/interest in Apache Hadoop MapReduce
- Python experience
- AWS Lambda experience
- Jira, Confluence, and GitLab experience
- Exposure to or experience with NiFi
- Willingness/desire to work on high-visibility tasking
- Willingness/ability
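Several listings above ask for Apache Hadoop MapReduce familiarity. The programming model behind it can be sketched in plain Python; this is an illustrative word-count example only (the `map_phase`/`shuffle`/`reduce_phase` names are chosen here for clarity, not taken from Hadoop's API):

```python
from collections import defaultdict

def map_phase(document):
    """Map step: emit (word, 1) pairs, as a Hadoop Mapper would."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    """Shuffle step: group values by key across all mapper outputs."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: aggregate per key, as a Hadoop Reducer would."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big ideas", "data pipelines"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"], counts["data"])  # prints: 2 2
```

In a real Hadoop cluster the map and reduce steps run in parallel across nodes and the shuffle moves data over the network; the in-memory version above only illustrates the data flow.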
configuration management/deployment tools such as Ansible and Red Hat Satellite. Experience with firewall and switch configuration, virtualization technologies like VMware, and software technologies such as Apache, Docker, Hadoop, MySQL, and network services (DHCP, DNS, LDAP) is essential. Experience working within governance frameworks like the National Cyber Security Centre guidance and the Government Digital Service Technology Code of
maintaining, troubleshooting, tuning) of web architecture and related applications such as the following: Apache, Nginx, Python, MySQL, Postgres, MongoDB, Postfix, CDN integrations. Experience managing data warehouse platforms and tooling, e.g. Hadoop, Kafka, Cassandra. Advanced knowledge of and experience creating, maintaining, and debugging shell scripts. The ideal candidate will be comfortable in "non-siloed" environments and have an appetite to research, test
fostering an environment where employees can thrive and make a difference. Key Responsibilities: Develop and maintain applications using distributed data storage and parallel computing technologies, including Oracle, Postgres, Cassandra, Hadoop, and Spark. Utilize back-end applications and data integration tools such as Java and Groovy. Create user-facing applications that support mission needs and enhance user experience. Work with
focus and attention to detail. • Knowledge of Spring MVC, Hibernate, and Spring frameworks. • Understanding of HTML5, CSS3, JavaScript, AJAX-based programming, and jQuery/Hibernate. • Experience in Hadoop, Cassandra, and big data technologies is a plus. SKILLS AND CERTIFICATIONS: Java, REST, Spring, Hibernate. Additional Information: All your information will be kept confidential according to EEO guidelines. Direct Staffing
available now and seeking an exciting new role where they can take ownership and responsibility. The role involves designing, implementing, and managing data and data analytics systems. Experience with Hadoop, Splunk, BI (Business Intelligence), NoSQL, infrastructure, architecture, and the design and implementation of previous projects is essential. We need someone who can hit the ground running. If you
tools, cloud computing, machine learning, and data visualization as applicable. The ability to use/code in a language applicable to the project or task order, such as Apache Hadoop or Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs; research and provide solutions to meet future growth or to eliminate occurring or … Experience in building and maintaining an enterprise data model. Experience in implementing data pipelines using ETL and ELT technologies such as
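The ETL pipelines this listing mentions follow an extract-transform-load pattern. A minimal sketch in plain Python, using an in-memory CSV as a stand-in for a real source and SQLite as a stand-in for a warehouse (the `sales` table and its columns are illustrative assumptions, not from any listing):

```python
import csv
import io
import sqlite3

# Extract: read raw records from a CSV source (in-memory here for the sketch).
raw = io.StringIO("id,amount\n1,10.5\n2,bad\n3,7.0\n")
rows = list(csv.DictReader(raw))

# Transform: validate and convert types, dropping malformed records.
clean = []
for r in rows:
    try:
        clean.append((int(r["id"]), float(r["amount"])))
    except ValueError:
        continue  # skip rows that fail type conversion, e.g. amount "bad"

# Load: write the cleaned records into the warehouse (SQLite stands in).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?)", clean)
total = db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # prints: 17.5
```

ELT differs only in ordering: the raw rows are loaded first and the validation/conversion step runs inside the warehouse (typically as SQL), which is why the listing names the two as distinct technologies.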
and Python.
• Support the deployment and management of AWS services including EC2, S3, and IAM.
• Work with the team to implement and optimize big data processing frameworks such as Hadoop and Spark.
• Help with the integration and use of various compute instances for specific data processing needs.
• Contribute to the development of tools and algorithms for data analysis and … experience in data engineering
• Bachelor's degree in Data Engineering, Data Science, Computer Science, Information Technology, or a related field, OR equivalent practical experience.
• Basic knowledge of the Spark and Hadoop distributed processing frameworks.
• Familiarity with AWS services, particularly EC2, S3, and IAM.
• Some experience with programming languages such as Scala, PySpark, Python, and SQL.
• Understanding of data pipeline development
Nassau Bahamas, Singer Island Florida, Paradise Island Bahamas, or the Cambridge Hyatt Resort. Desired Skills: • Proficient in Java • Comfortable working in a Linux environment • Experience with open-source Apache Hadoop, Accumulo, and NiFi • Familiarity with context chaining and graph theory • Experience with containerization (Docker, Kubernetes) • Experience with enabling tools: Git, Maven, Jira • Experience with
display, video, mobile, programmatic, social, native), considering viewability, interaction, and engagement metrics. Create dashboards and deliver usable insights to help steer product roadmaps. Utilize tools such as SQL, R, Hadoop, and Excel to hypothesize and perform statistical analysis, A/B tests, and experiments to measure the impact of product initiatives on revenue, technical performance, and advertiser & reader engagement. Candidates should have analysis
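The A/B testing this listing describes typically reduces to a two-proportion z-test on conversion rates. A stdlib-only Python sketch (the arm sizes and conversion counts below are made-up illustrative numbers):

```python
from statistics import NormalDist

def ab_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    conv_*: conversions in each arm; n_*: users exposed in each arm.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both arms convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical experiment: 2.0% vs 2.6% conversion over 10k users per arm.
z, p = ab_ztest(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(round(z, 2), round(p, 4))
```

With these numbers the lift is significant at the usual 5% level; in practice the same test is run over metrics pulled via SQL or Hadoop jobs from the raw event logs.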
distributed systems, data structures, and consistency algorithms. Java JDK 17+. Knowledge of the following is desirable: data serialization and transport (gRPC, shared memory, Protobuf); distributed data stores (MongoDB, Elasticsearch, Hadoop, CockroachDB); designing APIs (well-crafted, supporting backwards compatibility); in-memory data stores (SQLite, RocksDB); popular Java frameworks (Spring, Hibernate); performance benchmarking. Bachelor's degree in Computer Science or related
the latest tools and technologies to design, develop, and implement solutions that transform businesses and drive innovation. What will your job look like: 4+ years of relevant experience in Hadoop with Scala development. It is mandatory that the candidate has handled more than 2 projects in the above framework using Scala. Should have 4+ years of relevant experience in … handling end-to-end big data technology. Meeting with the development team to assess the company's big data infrastructure. Designing and coding Hadoop applications to analyze data collections. Creating data-processing frameworks. Extracting data and isolating data clusters. Testing scripts and analyzing results. Troubleshooting application bugs. Maintaining the security of company data. Training staff on application use. Good … platform and data development roles. 5+ years of experience in big data technology, with experience ranging from platform architecture to data management, data architecture, and application architecture. High proficiency working with the Hadoop platform, including Spark/Scala, Kafka, Spark SQL, HBase, Impala, Hive, and HDFS in multi-tenant environments. Solid base in data technologies like warehousing, ETL, MDM, DQ, BI, and analytical
enhances complex and diverse big-data cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in cloud computing and the Hadoop ecosystem, including implementing Java applications, distributed computing, information retrieval (IR), and object-oriented design. Works individually or as part of a team. Reviews and tests software components for … substituted for a bachelor's degree. A master's in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. A Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. 2. The following Cloud-related experiences are required: a. Two (2) years of Cloud and/