CD tooling; scripting experience (Python, Perl, Bash, etc.); ELK (Elastic Stack); JavaScript; Cypress; Linux experience; search-engine technology (e.g., Elasticsearch); big data technology experience (Hadoop, Spark, Kafka, etc.); microservice and cloud-native architecture. Desirable Skills: able to demonstrate experience of troubleshooting and diagnosing technical issues …
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Experis
of security principles and best practices for cloud-based solutions. Preferred Skills: certification in cloud platforms; experience with big data technologies such as Apache Hadoop, Spark, or Kafka; knowledge of data governance and compliance frameworks; familiarity with DevOps practices and tools (e.g., Git, Jenkins, Terraform). All profiles will …
integrity can be maintained as part of business improvement plans which affect the organisation. Review, manage, and lead the development of data frameworks (e.g. Hadoop) and the analysis of data to ensure accuracy of sources and data resilience. Communications and engagement: identify and understand the business needs, prioritise, and design … recognising entities in free text. Experience creating and developing SQL Server queries and/or stored procedures. Experience using and developing data frameworks (e.g. Hadoop). Experience developing and scripting dashboards and data visualisations using tools such as QlikView, Qlik Sense, and Tableau. Experience interpreting and analysing complex data sets from …
to translate business requirements into high- and low-level designs. You'll also define architecture and technical designs, create data flows and integrations using Hadoop, and work closely with product teams throughout testing. Key Responsibilities: lead Java and Python project development; design and develop API integrations using Spark; collaborate … client teams; stay updated with the latest trends and best practices. Qualifications: expertise in Java and Python development (essential); experience with Spark or Hadoop (essential); knowledge of Trino or Airflow (desirable); proven ability to design and implement scalable and secure solutions; excellent communication and collaboration skills. …
As a Hadoop Administrator, you will play a crucial role in managing and maintaining our Hadoop ecosystem to ensure optimal performance, reliability, and security. You will collaborate closely with our data engineers, analysts, and other stakeholders to understand their requirements and provide efficient solutions. Your responsibilities will include … but are not limited to: installing, configuring, and maintaining Hadoop clusters, including HDFS, YARN, Hive, HBase, Kafka, Spark, and other related technologies; monitoring cluster health and performance, diagnosing and troubleshooting issues, and implementing solutions to minimize downtime; capacity planning and scaling the Hadoop infrastructure to accommodate growing data …
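To illustrate the configuration and capacity-planning side of the role described above, here is a minimal sketch of the YARN memory settings a Hadoop administrator typically tunes when sizing worker nodes. The property names are standard YARN configuration keys; the values are hypothetical examples for a notional 64 GB node, not recommendations for any particular cluster:

```xml
<!-- yarn-site.xml: illustrative memory limits for a worker node with ~64 GB RAM,
     leaving headroom for the OS and Hadoop daemons. Values are examples only. -->
<configuration>
  <!-- Total memory YARN may hand out to containers on this NodeManager -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>57344</value>
  </property>
  <!-- Smallest container allocation a single request may receive -->
  <property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>1024</value>
  </property>
  <!-- Largest container allocation a single request may receive -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>16384</value>
  </property>
</configuration>
```

Capacity planning in practice means revisiting such limits as data volumes grow, so that container sizes match workload memory profiles without oversubscribing the node.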