Hadoop Developer (Spark/Scala)

Hiring Organisation: Capgemini
Location: City of London, London, United Kingdom
Responsibilities:
- Build, operate, monitor, and troubleshoot Hadoop clusters.
- Develop robust ETL workflows using Spark, Hive, and Pig (a minimal Spark/Scala ETL sketch follows the requirements list below).
- Create and manage data ingestion pipelines using Sqoop, Flume, or Kafka.
- Optimize MapReduce jobs and efficiently manage HDFS storage.
- Collaborate with data scientists and analysts to meet data requirements.
- Ensure data security, compliance … have).

Requirements:
- Solid understanding of the Hadoop ecosystem: HDFS, MapReduce, YARN.
- Experience with Hive, Pig, and HBase.
- Knowledge of data ingestion tools such as Sqoop, Flume, and Kafka.
- Ability to optimize distributed data processing jobs.
- Exposure to cloud platforms such as AWS, GCP, or Azure is a plus.
...
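To give a concrete flavour of the ETL work this role describes, here is a minimal Spark/Scala sketch: it reads raw files landed on HDFS (for example by a Sqoop or Flume ingestion job), applies a simple transformation, and writes the result to a Hive table. The paths, table names, and column names are illustrative assumptions, not details from the listing.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal Spark/Scala ETL sketch. All paths and table/column
// names below are hypothetical, used only for illustration.
object SalesEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sales-etl")
      .enableHiveSupport() // allow writing results into Hive
      .getOrCreate()

    // Extract: raw CSV landed on HDFS by an upstream ingestion job.
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/raw/sales") // assumed landing path

    // Transform: drop malformed rows, aggregate revenue per day.
    val daily = raw
      .filter(col("amount").isNotNull)
      .groupBy(to_date(col("order_ts")).as("order_date"))
      .agg(sum("amount").as("revenue"))

    // Load: overwrite a Hive table for downstream analysts.
    daily.write
      .mode("overwrite")
      .saveAsTable("analytics.daily_revenue") // assumed Hive table

    spark.stop()
  }
}
```

In practice a job like this would be packaged as a jar and launched with spark-submit on YARN, with Sqoop, Flume, or Kafka feeding the upstream landing path as the responsibilities above describe.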