… to translate business requirements into high- and low-level designs. You'll also define architecture and technical designs, create data flows and integrations using Hadoop, and work closely with product teams throughout testing.

Key Responsibilities:
- Lead Java and Python project development.
- Design and develop API integrations using Spark.
- Collaborate … client teams.
- Stay updated with the latest trends and best practices.

Qualifications:
- Expertise in Java and Python development (Essential).
- Experience with Spark or Hadoop (Essential).
- Knowledge of Trino or Airflow (Desirable).
- Proven ability to design and implement scalable and secure solutions.
- Excellent communication and collaboration skills.
… recovery strategies to protect against data loss and ensure business continuity. Planning and forecasting resource requirements to ensure scalability and optimal performance of the Hadoop infrastructure. Installing and configuring Hadoop clusters, including components such as HDFS, YARN, MapReduce, Hive, HBase, etc. Ensuring the health and performance of Hadoop …
As a Hadoop Administrator, you will play a crucial role in managing and maintaining our Hadoop ecosystem to ensure optimal performance, reliability, and security. You will collaborate closely with our data engineers, analysts, and other stakeholders to understand their requirements and provide efficient solutions. Your responsibilities will include, but are not limited to:
- Installing, configuring, and maintaining Hadoop clusters, including HDFS, YARN, Hive, HBase, Kafka, Spark, and other related technologies.
- Monitoring cluster health and performance, diagnosing and troubleshooting issues, and implementing solutions to minimize downtime.
- Capacity planning and scaling the Hadoop infrastructure to accommodate growing data …
… CD tooling
- Scripting experience (Python, Perl, Bash, etc.)
- ELK (Elastic Stack)
- JavaScript
- Cypress
- Linux experience
- Search engine technology (e.g., Elasticsearch)
- Big data technology experience (Hadoop, Spark, Kafka, etc.)
- Microservice and cloud-native architecture

Desirable Skills:
- Able to demonstrate experience of troubleshooting and diagnosing technical issues.
- Able to demonstrate …
Scala Data Engineer (Cloudera, Hadoop and CI/CD) - Banking Client - Brussels
Duration: 1-year freelance contract
Rate: €500 - €800 per day
Hybrid working, inside IR35

You will join the AIR (Analytics, Insight and Reporting) tribe in the GDC division. You will join our dedicated in-house team … and frameworks. You ideally have knowledge of English; knowledge of other European languages is a plus.

Nice to haves: You know that Cloudera, Hadoop and CI/CD aren't popular video games. You have heard of tools like Apache Spark, Impala and/or Kafka. You have …
Senior Data Engineer - Python/Hadoop/Spark - sought by a leading investment bank based in London - Hybrid - contract (inside IR35, umbrella)

Key Responsibilities: Design and implement scalable data pipelines that extract, transform and load data from various sources into the data lakehouse. Help teams push the boundaries of analytical … ETL processes, and data warehousing. Significant exposure to, and hands-on experience with, at least two of these programming languages: Python, Java, Scala, Go. Significant experience with Hadoop, Spark and other distributed processing platforms and frameworks. Experience working with open table/storage formats such as Delta Lake, Apache Iceberg or Apache Hudi. …
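For candidates unfamiliar with the extract-transform-load pattern this listing centres on, here is a minimal, stdlib-only Python sketch of the three stages. It deliberately uses an in-memory CSV string and SQLite in place of real sources and a Spark/lakehouse target; all names (`extract`, `transform`, `load`, the `txns` table) are illustrative, not from the listing.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows from a source (here an in-memory string)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalise types and drop records with a missing amount."""
    cleaned = []
    for row in rows:
        if row.get("amount"):
            cleaned.append({"id": int(row["id"]), "amount": float(row["amount"])})
    return cleaned

def load(rows, conn):
    """Load: write the cleaned rows into a target table and report the count."""
    conn.execute("CREATE TABLE IF NOT EXISTS txns (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO txns VALUES (:id, :amount)", rows)
    return conn.execute("SELECT COUNT(*) FROM txns").fetchone()[0]

raw = "id,amount\n1,10.5\n2,\n3,7.25\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
print(loaded)  # the row with a missing amount is dropped, so 2 rows load
```

In a production pipeline of the kind described above, each stage would typically be a distributed Spark job and the target an open table format rather than SQLite, but the stage boundaries stay the same.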
Sheffield, South Yorkshire, Yorkshire, United Kingdom (Hybrid/WFH options) | Experis
… of security principles and best practices for cloud-based solutions.

Preferred Skills:
- Certification in cloud platforms.
- Experience with big data technologies such as Apache Hadoop, Spark, or Kafka.
- Knowledge of data governance and compliance frameworks.
- Familiarity with DevOps practices and tools (e.g., Git, Jenkins, Terraform).

All profiles will …