developer with an innovative mindset. Design and develop code that consistently adheres to good programming practices. Design, develop, and maintain high-volume Java- or Scala-based data processing jobs using industry-standard tools and frameworks in the Hadoop ecosystem, such as Spark, Kafka, Hive, Impala, Avro, Flume, Oozie, and Sqoop … innovative technologies, languages, and techniques in the rapidly evolving world of high-volume data processing. Technologies We Use: Development languages/frameworks: Java/Scala, Apache Spark, Kafka, Vertica, JavaScript (React/Redux), MicroStrategy Amazon: EMR, Step Functions, SQS, Lambda, and AWS cloud-native architectures DevOps Tools: Terraform or Cloud … and written communication skills Commitment to working in an Agile environment and upholding its principles. Passionate technologists with an innovative mindset Strong Java or Scala skills and UI skills Experience with designing and implementing high-volume data processing jobs is required. Working knowledge of Spark on EMR is a …
Airflow and dbt Expert SQL knowledge Solid understanding of Dimensional Data Modelling. Experience with one or more of these programming languages: Python, Scala/Java Experience with distributed data and computing tools, mainly Apache Spark & Kafka Understanding of critical-path approaches and how to iterate to build value, engaging …
City of London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard Ltd
of ours... Tech stack - Kotlin, AWS, Postgres, Terraform, Ansible Ideally looking for commercial Kotlin experience, but can also consider candidates from a Java or Scala background. Base salary - £50k-110k + bonus and generous package 3 days a week in the office in London, 2 days WFH You must …
are mission-led to use AI for the greater good. Skills: • Experience with FE: JavaScript, TypeScript, and React • Experience with BE: Java, Scala, or Python • Strong stakeholder management • Ability to mentor other members of the team • Experience with testing libraries Benefits: • Private healthcare • Remote working • Mental health support …