analytic tools like R and Python, and visualization tools like Tableau and Power BI. Exposure to cloud platforms and big data systems such as Hadoop, HDFS, and Hive is a plus. Ability to work with IT and Data Engineering teams to help embed analytic outputs in business processes. Graduate in Business Analytics more »
quickly. Ability to work independently and be self-directed. Bachelor's degree in Computer Science or a related field. Experience with big data analytics: Splunk, ELK, Hive, Redshift, etc. (nice to have). In-depth knowledge of streaming back-ends and formats (nice to have). Experience working with Smart/Digital TV more »
Months Location: Milton Keynes Key responsibilities: Experience as a Kafka Developer (ideally Kafka Streams). Hands-on experience in big data technologies (Hadoop, Hue, Hive, Impala, Spark), with particular strength in Kafka. Experience using key-value databases. Experience developing microservices using Spring, and developing cloud-based data … data services for different applications, usually 24/7 applications with demanding performance requirements. Key skills/knowledge/experience: Solid experience and advanced knowledge of Hadoop, Hive, and Spark/Scala. Able to test changes and issues properly, replicating the code functionality into SQL with Code more »
required) Experience with distributed message brokers using Kafka (required). Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required). Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm and more »
and availability of the company's software products. Data Processing Pipelines: You'll design and implement data processing pipelines using technologies like Kafka, Hadoop, Hive, Storm, or Zookeeper, enabling real-time and batch processing of data from the blockchain. Hands-on Team Leadership: As a hands-on leader, you more »
Elasticsearch and understanding of the Hadoop ecosystem. Experience working with large data sets; experience working with distributed computing tools like MapReduce, Hadoop, Hive, Pig, etc. Advanced use of Excel spreadsheets for analytical purposes. An MSc or PhD in Data Science or an analytical subject (Physics, Mathematics more »
Elasticsearch and understanding of the Hadoop ecosystem. Experience working with large data sets; experience working with distributed computing tools like MapReduce, Hadoop, Hive, Pig, etc. Advanced use of Excel spreadsheets for analytical purposes. MSc or PhD in Data Science or an analytical subject (Physics, Mathematics, Computing more »