Data technologies: You'll likely work with various Big Data technologies alongside Spark, including: Hadoop Distributed File System (HDFS) for storing large datasets; Apache Kafka for real-time data streaming; Apache Hive for data warehousing on top of HDFS; cloud platforms like AWS, Azure, or GCP for deploying and … Programming languages: Proficiency in Scala and Spark is essential. Familiarity with Python and SQL is often a plus. Big Data technologies: Understanding of HDFS, Kafka, Hive, and cloud platforms is valuable. Data engineering concepts: Knowledge of data warehousing, data pipelines, data modelling, and data cleansing techniques is crucial. Problem-solving more »
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
LSA Recruit
Location: Leeds. Hybrid working, 1-2 days in office. Contract, Inside IR35 (as per market standards). Position Overview: We are seeking a highly skilled and experienced Palantir Foundry Specialist/Architect to join our team. The ideal candidate will possess more »
and React on the front end. Good cloud exposure, ideally in AWS, but Azure or GCP is also fine. Knowledge of Java, Spring Boot, Kafka and MySQL. Strong experience in microservices and REST APIs. If this role sounds of interest, please apply and someone will be in touch regarding more »
Leeds, England, United Kingdom Hybrid / WFH Options
BJSS
get involved in a variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion, etc. About You: You're an engineer at heart and enjoy the challenge of building reliable more »