Min. 7 years with Python, Big Data and data lake solutions; PostgreSQL, ClickHouse, Snowflake, etc. Cloud infrastructure (AWS services). Data processing pipelines using Kafka, Hadoop, Hive, Storm, or ZooKeeper. Hands-on team leadership. The Reward: joining a fast-growth, successful blockchain business. The role offers fully remote work, a great more »
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Damia Group Ltd
Spark Scala Developer - Scala/Apache Spark - Hybrid/Leeds - £450-£550. Spark Scala Developer to join our client, one of the biggest financial services organisations in the world, with operations in more than 38 countries. It has an IT infrastructure of 200,000+ servers, 20,000+ database instances … Engineer, you will be working for the GDT (Global Data Technology) team and will be responsible for: designing, building, and maintaining data pipelines using Apache Spark and Scala; working on enterprise-scale Cloud infrastructure and Cloud services in one of the Clouds (GCP). Mandatory skills: At least … IT experience with designing, building, and maintaining data pipelines. At least 4 years of experience designing, building, and maintaining data pipelines using Apache Spark and Scala. Programming languages: proficiency in Scala and Spark is essential; familiarity with Python and SQL is often a plus. Big Data technologies more »
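The core responsibility above (data pipelines in Apache Spark and Scala) can be sketched without a Spark cluster: plain Scala collections expose the same filter/map/aggregate shape as Spark's RDD and Dataset APIs. The record layout and values below are illustrative, not from any listed employer.

```scala
// Minimal sketch of a Spark-style transformation pipeline, expressed with
// plain Scala collections so it runs without a Spark dependency. The same
// shape in Spark would be rdd.filter(...).map(...).reduceByKey(_ + _).
object PipelineSketch {
  // Hypothetical input records: (accountId, amountPence)
  val transactions = List(("acc1", 250), ("acc2", -40), ("acc1", 100))

  // Drop reversals (negative amounts), then aggregate totals per account.
  val totals: Map[String, Int] =
    transactions
      .filter { case (_, amount) => amount > 0 }
      .groupMapReduce(_._1)(_._2)(_ + _)

  def main(args: Array[String]): Unit =
    totals.toSeq.sorted.foreach { case (acc, t) => println(s"$acc -> $t") } // acc1 -> 350
}
```

In Spark itself the collection would be a distributed `Dataset[(String, Int)]`, but the transformation chain reads identically, which is why collection-level Scala fluency transfers directly to the role.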
delivering moderate-to-complex data flows as part of a development team in collaboration with others. You’ll be confident using technologies such as Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data … and recognised standards to build solutions using various traditional or big data languages such as SQL, PL/SQL, SAS Macro Language, Python, Scala, Apache Spark, Java, JavaScript, etc., using various tools including SAS, Hue (Hive/Impala), Kibana (Elasticsearch). Knowledge of data management on Cloud more »
and availability of the company's software products. Data Processing Pipelines: You'll design and implement data processing pipelines using technologies like Kafka, Hadoop, Hive, Storm, or ZooKeeper, enabling real-time and batch processing of data from the blockchain. Hands-on Team Leadership: As a hands-on leader, you more »
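The real-time-versus-batch distinction in the role above often comes down to micro-batching: slicing an unbounded event stream into fixed-size chunks for processing. A minimal sketch of that pattern in plain Scala, with an iterator standing in for a Kafka topic (this is the pattern only, not the Kafka consumer API):

```scala
// Sketch of micro-batching: turn an unbounded stream of events into
// fixed-size batches, the pattern Kafka consumers and Spark Streaming
// jobs apply when bridging real-time input and batch-style processing.
object MicroBatchSketch {
  def batches[A](stream: Iterator[A], size: Int): Iterator[List[A]] =
    stream.grouped(size).map(_.toList)

  def main(args: Array[String]): Unit = {
    val events = Iterator.from(1).take(7) // stand-in for a topic's event stream
    batches(events, 3).foreach(b => println(b.mkString(","))) // 1,2,3 / 4,5,6 / 7
  }
}
```

A real consumer would also handle offsets, retries, and partial final batches under a time bound, but the grouping step is the shared core.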