pipelines to serve the easyJet analyst and data science community. Highly competent, hands-on experience with relevant data engineering technologies such as Databricks, Spark, the Spark API, Python, SQL Server, and Scala. Work with data scientists, machine learning engineers, and DevOps engineers to develop and deploy machine … experience with Terraform or CloudFormation. Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse … privacy, handling of sensitive data (e.g. GDPR). Experience in event-driven architecture, ingesting data in real time in a commercial production environment with Spark Streaming, Kafka, DLT, or Beam. Understanding of the challenges faced in the design and development of a streaming data pipeline.
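The real-time ingestion described above centres on windowed aggregation over an event stream. A minimal sketch of that core logic in plain Python (the route keys, timestamps, and 60-second window size are illustrative, not from the posting; a Spark Structured Streaming job would express the same thing with `groupBy(window(...)).count()`):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed-size tumbling windows
    and count occurrences of each key per window -- the aggregation
    pattern at the heart of a streaming data pipeline."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_seconds)
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical booking events keyed by route, in 60-second windows.
events = [
    (0, "LGW-CDG"), (15, "LGW-CDG"), (30, "LTN-AMS"),
    (65, "LGW-CDG"), (70, "LTN-AMS"), (125, "LTN-AMS"),
]
print(tumbling_window_counts(events, 60))
# → {0: {'LGW-CDG': 2, 'LTN-AMS': 1}, 60: {'LGW-CDG': 1, 'LTN-AMS': 1}, 120: {'LTN-AMS': 1}}
```

In a production pipeline the same grouping runs incrementally over unbounded input, which is where the challenges the posting mentions (late data, state management, exactly-once delivery) come in.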
Leeds, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
scalability, including understanding and explaining features like KRaft. Integrating Kafka with other data processing tools and platforms such as Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Collaborating with cross-functional teams to understand data requirements and design solutions that meet business needs. … source Kafka. Focus on disaster recovery aspects. Knowledge of Kafka resiliency and new features like KRaft. Experience with real-time technologies such as Spark. Required Skills & Experience: Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks. Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Experience with cloud platforms such as GCP Pub/Sub. Excellent problem-solving skills. Knowledge & Experience/Qualifications: Knowledge of Kafka data pipelines and messaging solutions to support critical business operations and enable real-time data processing. Monitoring Kafka performance …
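The event-driven frameworks this role calls for route records from topics through per-topic processing logic. A minimal sketch of that routing pattern in plain Python (the topic names and handlers are hypothetical; a real deployment would build this as a Kafka Streams topology or a set of consumers):

```python
def build_topology(handlers):
    """Map each topic name to a processing function, mimicking how an
    event-driven topology routes records from source topics through
    their transformations."""
    def process(record):
        topic, payload = record
        if topic not in handlers:
            raise KeyError(f"no handler registered for topic {topic!r}")
        return handlers[topic](payload)
    return process

# Hypothetical topics: tag each payload with the topic it arrived on.
process = build_topology({
    "flight-events": lambda p: {**p, "source": "flight-events"},
    "booking-events": lambda p: {**p, "source": "booking-events"},
})
print(process(("flight-events", {"flight": "EZY123"})))
# → {'flight': 'EZY123', 'source': 'flight-events'}
```

The explicit handler lookup is the simplest form of the contract an event-driven system enforces: every topic a producer writes to must have a consumer-side handler, or the record is rejected rather than silently dropped.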
Kafka clusters to ensure high availability and scalability. Integrating Kafka with other data processing tools and platforms such as Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Collaborating with cross-functional teams to understand data requirements and design solutions that meet business needs. … Required Skills & Experience: Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks. Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Experience with cloud platforms such as GCP Pub/Sub. Excellent problem-solving skills. Knowledge & Experience …
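The Schema Registry named above enforces a data contract between producers and consumers: a payload is validated against a registered schema before it enters the pipeline. A minimal sketch of that idea in plain Python (the topic, version, and field names are illustrative; a real Schema Registry works with Avro, Protobuf, or JSON Schema over HTTP):

```python
# Hypothetical registry: (topic, version) -> expected field types.
SCHEMAS = {
    ("booking-events", 1): {"booking_id": int, "route": str},
}

def validate(topic, version, payload):
    """Check a payload against the registered schema for its topic,
    rejecting malformed records before they reach downstream consumers."""
    schema = SCHEMAS.get((topic, version))
    if schema is None:
        raise KeyError(f"no schema registered for {topic} v{version}")
    for field, ftype in schema.items():
        if not isinstance(payload.get(field), ftype):
            raise TypeError(f"field {field!r} must be {ftype.__name__}")
    return True

print(validate("booking-events", 1, {"booking_id": 42, "route": "LGW-CDG"}))
# → True
```

Versioning the schema key is what lets producers evolve a topic's format without breaking consumers still reading the old version.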
in troubleshooting and problem resolution. Experience in system integration. Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming. Experience of ETL tools incorporating Big Data. Shell scripting, Python. Beneficial skills: Understanding of LAN, WAN, VPN, and SD networks. Hardware …
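Of the Hadoop-ecosystem tools this listing names, MapReduce is the one with a programming model compact enough to sketch. A minimal word count in plain Python (the classic MapReduce example; the input lines are illustrative, and a real job would distribute the map, shuffle, and reduce phases across a cluster):

```python
from collections import defaultdict
from itertools import chain

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    return chain.from_iterable(
        ((word, 1) for word in line.split()) for line in lines
    )

def shuffle_and_reduce(pairs):
    # Shuffle: group pairs by key; Reduce: sum each key's counts.
    grouped = defaultdict(int)
    for word, count in pairs:
        grouped[word] += count
    return dict(grouped)

lines = ["spark kafka spark", "hive kafka"]
print(shuffle_and_reduce(map_phase(lines)))
# → {'spark': 2, 'kafka': 2, 'hive': 1}
```

Hadoop's value is not this logic but running it fault-tolerantly over data too large for one machine, which is why the listing pairs it with ingestion tools like Flume and Sqoop.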