and understanding of current cyber security threats, actors, and their techniques. Experience with data science, big data analytics technology stacks, analytic development for endpoint and network security, and streaming technologies (e.g., Kafka, Spark Streaming, and Kinesis). Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our More ❯
recovery process/tools. Experience in troubleshooting and problem resolution. Experience in system integration. Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming. Experience of ETL tools incorporating Big Data. Shell scripting, Python. Beneficial skills: understanding of LAN, WAN, VPN and SD networks; hardware and cabling set-up experience More ❯
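The Hadoop/MapReduce stack named above follows a map, shuffle, and reduce pattern. As a rough illustration only (this is a plain-Python sketch of the pattern, not Hadoop code, and the function names are our own), a word count — the canonical MapReduce job — can be modelled like this:

```python
from collections import defaultdict

def map_phase(records):
    # Map: each input line emits (key, 1) pairs, as a Hadoop mapper
    # would for every record in its input split.
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key and sum the counts,
    # which Hadoop does across the cluster; here it is in-memory.
    grouped = defaultdict(int)
    for key, value in pairs:
        grouped[key] += value
    return dict(grouped)

counts = reduce_phase(map_phase(["big data", "big pipelines"]))
# → {"big": 2, "data": 1, "pipelines": 1}
```

In a real cluster the map and reduce phases run in parallel across nodes, with HDFS providing the input splits; the sketch only shows the data flow.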
West Midlands, United Kingdom Hybrid / WFH Options
Experis
grade on-prem systems. Key Responsibilities: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support operational data … enterprise security and data governance standards. Required Skills & Experience: Minimum 5 years of experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience working with Linux systems More ❯
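The real-time processing responsibility above typically means windowed aggregation over a stream. As a minimal sketch of the tumbling-window pattern that Spark Streaming provides (plain Python standing in for the engine; the function name and event shape are illustrative, not a Spark API):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping
    windows and count occurrences per key within each window —
    the tumbling-window aggregation a streaming engine performs
    continuously over an incoming source such as Kafka."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_seconds)
        windows[window_start][key] += 1
    return {start: dict(counts) for start, counts in sorted(windows.items())}

result = tumbling_window_counts([(0, "a"), (5, "a"), (12, "b")], 10)
# → {0: {"a": 2}, 10: {"b": 1}}
```

In production the engine handles the unbounded input, late data, and state checkpointing; the sketch only shows the window-assignment and aggregation logic.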
prem systems. Job Responsibilities/Objectives: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimize workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support operational data … standards. Required Skills/Experience: The ideal candidate will have the following: Strong experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience working with Linux systems More ❯