We are seeking an experienced Kafka Real-Time Architect to design and implement scalable, high-performance real-time data processing systems leveraging Apache Kafka. In this role, you will be responsible for architecting and managing Kafka clusters, ensuring system scalability and availability, and integrating Kafka with various data processing … approach to addressing business data needs and ensuring optimal system performance. Key Responsibilities: Design & Architecture: Architect and design scalable, real-time streaming systems using Apache Kafka, ensuring they are robust, highly available, and meet business requirements for data ingestion, processing, and real-time analytics. Kafka Cluster Management: Configure, deploy … and troubleshoot issues to maintain smooth operations. Integration & Data Processing: Integrate Kafka with key data processing tools and platforms, including Kafka Streams, Kafka Connect, Apache Spark Streaming, Apache Flink, Apache Beam, and Schema Registry. This integration will facilitate data stream processing, event-driven architectures, and …
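The listing above centres on real-time stream processing. As a stdlib-only illustration (not the employer's codebase, and the event names and window size are invented), the tumbling-window aggregation that tools like Kafka Streams or Apache Beam perform over a keyed event stream can be sketched as:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size_s):
    """Count events per key within fixed-size, non-overlapping time windows.

    `events` is an iterable of (epoch_seconds, key) pairs. This mimics the
    windowed aggregation a Kafka Streams / Beam job would run at scale.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align the timestamp to the start of its window.
        window_start = ts - (ts % window_size_s)
        counts[(window_start, key)] += 1
    return dict(counts)

# Illustrative page-view events, counted over 60-second windows.
events = [(0, "home"), (30, "home"), (65, "home"), (70, "cart")]
print(tumbling_window_counts(events, 60))
# {(0, 'home'): 2, (60, 'home'): 1, (60, 'cart'): 1}
```

Real streaming engines add the hard parts this sketch omits: out-of-order events, watermarks, and state that survives restarts.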
Big Data, and Cloud Technologies. Hands-on expertise in at least 2 Cloud platforms (Azure, AWS, GCP, Snowflake, Databricks) and Big Data processing (e.g., Apache Spark, Beam). Proficiency in key technologies like BigQuery, Redshift, Synapse, Pub/Sub, Kinesis, Event Hubs, Kafka, Dataflow, Airflow, and ADF. Strong … with the ability to mentor architects. Mandatory expertise in at least 2 Hyperscalers (GCP/AWS/Azure) and Big Data tools (e.g., Spark, Beam). Desirable: Experience designing Databricks solutions and familiarity with DevOps tools. Coforge is an equal opportunities employer and welcomes applications from all sections of …
London, South East England, United Kingdom Hybrid / WFH Options
Focus on SAP
years hands-on with Google Cloud Platform. Strong experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow. Proficient in SQL, Python, and Apache Beam. Familiarity with DevOps and CI/CD pipelines in cloud environments. Experience with Terraform, Cloud Build, or similar tools for infrastructure automation. Understanding of … available) Responsibilities: Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP) Develop ETL processes using tools like Cloud Dataflow, Apache Beam, BigQuery, and Cloud Composer Collaborate with data analysts, scientists, and business stakeholders to understand data requirements Optimize performance and cost-efficiency of …
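The ETL responsibilities described above follow a common pattern of composable stages, which frameworks like Apache Beam formalise as transforms. A stdlib-only sketch (the record layout and field names are invented for illustration):

```python
def extract(rows):
    # Parse raw comma-separated records; the field layout is a made-up example.
    return [line.split(",") for line in rows]

def transform(records):
    # Normalise (user_id, amount_in_pennies) -> (user_id, pounds),
    # dropping malformed rows as a Beam pipeline might route them
    # to a dead-letter sink instead.
    out = []
    for user_id, amount in records:
        try:
            out.append((user_id, int(amount) / 100))
        except ValueError:
            continue
    return out

def load(records, sink):
    # Stand-in for writing to BigQuery or another warehouse.
    sink.extend(records)
    return sink

sink = []
load(transform(extract(["u1,250", "u2,abc", "u3,1000"])), sink)
print(sink)  # [('u1', 2.5), ('u3', 10.0)]
```

In a real Dataflow job each stage would be a `PTransform` running in parallel over a distributed `PCollection` rather than an in-memory list.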
includes a firm-wide data catalogue detailing key data elements. Skills Required Java and/or Scala Backends Distributed compute concepts: Spark, Databricks, Apache Beam, etc. Full Stack experience Azure Cloud Technologies Angular/Similar front end Cloud DB/Relational DB: Snowflake/Sybase Unix experience …
guidance and support to development operations teams. Staying updated with the latest Kafka features, updates and industry practices. Required Skills Experience: Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks. Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink and Beam. Experience …
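The event-driven frameworks this listing asks for all build on the publish/subscribe pattern. A toy in-memory sketch of the idea (Kafka topics, partitions, and consumer groups are far richer than this, and the topic and event names are invented):

```python
from collections import defaultdict

class EventBus:
    """Minimal synchronous publish/subscribe bus for illustration only."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a callback for every event published to `topic`.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to each subscriber of that topic, in order.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("orders", received.append)
bus.publish("orders", {"id": 1, "total": 99})
bus.publish("payments", {"id": 2})  # no subscriber, silently dropped
print(received)  # [{'id': 1, 'total': 99}]
```

Kafka decouples the two sides further: producers append to a durable, partitioned log, and consumers read at their own pace tracked by offsets, which is what enables replay and independent scaling.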
AWS, or Azure. Experience with CI/CD pipelines for machine learning (e.g., Vertex AI). Experience with data processing frameworks and tools, particularly Apache Beam/Dataflow, is highly desirable. Knowledge of monitoring and maintaining models in production. Proficiency in employing containerization tools, including Docker, to streamline …
and testing. Contribute to cloud migration for the Treasury platform. Requirements: 3-5 years software development experience. Hands-on experience with Java, Scala, Spark, Beam and SQL (Sybase) Strong understanding of multi-threading and high-volume server-side development. Familiarity with Snowflake, PowerBI, and cloud platforms (plus). Good …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown Asset Management Limited
engineer with experience working on high-volume, low-latency data pipelines, with the following skills. Data Engineering Skills Modelling Orchestration using Apache Airflow Cloud-native streaming pipelines using Flink, Beam etc. DBT Snowflake Infrastructure Skills Terraform DevOps Skills Experienced in developing CI/CD pipelines …
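Orchestration with Apache Airflow, mentioned above, amounts to running tasks in an order that respects their dependencies. The scheduling core is a topological sort of the DAG, which the standard library can demonstrate directly (the task names form a hypothetical pipeline, not a real Airflow DAG definition):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on,
# in the spirit of an Airflow DAG (task names are invented).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "dbt_run": {"transform"},
    "publish": {"dbt_run"},
}

# static_order() yields tasks so every dependency runs first.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'dbt_run', 'publish']
```

Airflow layers scheduling intervals, retries, and parallel execution of independent branches on top of this ordering; `graphlib.TopologicalSorter` (Python 3.9+) captures only the dependency-resolution step.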