Spark Scala Engineer
Leeds, Yorkshire, United Kingdom
Pyramid Consulting Europe Ltd
- Data technologies: You'll likely work with various Big Data technologies alongside Spark, including the Hadoop Distributed File System (HDFS) for storing large datasets, Apache Kafka for real-time data streaming, Apache Hive for data warehousing on top of HDFS, and cloud platforms like AWS, Azure, or GCP for deploying and …
- Programming languages: Proficiency in Scala and Spark is essential; familiarity with Python and SQL is often a plus (a brief illustrative sketch of this stack follows below).
- Big Data technologies: Understanding of HDFS, Kafka, Hive, and cloud platforms is valuable.
- Data engineering concepts: Knowledge of data warehousing, data pipelines, data modelling, and data cleansing techniques is crucial.
- Problem-solving
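To illustrate the kind of work this stack implies, below is a minimal, hypothetical Spark Structured Streaming sketch in Scala: it reads events from Kafka, applies a simple cleansing step, and appends the result to a Hive-backed Parquet table. The broker address, topic name, and storage paths are placeholders, and the job assumes the spark-sql-kafka connector is on the classpath; this is not code from the hiring organisation.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KafkaToHivePipeline {
  def main(args: Array[String]): Unit = {
    // Spark session with Hive support enabled (assumes a configured Hive metastore)
    val spark = SparkSession.builder()
      .appName("kafka-to-hive-sketch")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._

    // Read a stream of events from Kafka (broker and topic are placeholder values)
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(value AS STRING) AS raw", "timestamp")

    // Basic cleansing: drop empty messages and derive an ingestion-date partition column
    val cleaned = events
      .filter(length(trim($"raw")) > 0)
      .withColumn("ingest_date", to_date($"timestamp"))

    // Append micro-batches to a Parquet location partitioned by date (paths are placeholders)
    val query = cleaned.writeStream
      .outputMode("append")
      .format("parquet")
      .option("path", "/warehouse/events_clean")
      .option("checkpointLocation", "/checkpoints/events_clean")
      .partitionBy("ingest_date")
      .start()

    query.awaitTermination()
  }
}
```

Such a job would typically be submitted with spark-submit, passing the Kafka connector package (for example via --packages) and deployed on an on-premises Hadoop cluster or a managed Spark service on AWS, Azure, or GCP.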
Employment Type: Contract
Rate: GBP Annual
Posted: