computing engineers, and systems engineers, to ensure efficient data processing, secure storage, and insightful analysis. This position requires hands-on experience and expertise with Kafka, AWS, and MongoDB.
Responsibilities:
• Upgrade and/or maintain Kafka clusters in an AWS/Kubernetes cloud environment
• Design, develop, and deploy Kafka clusters in an AWS/Kubernetes cloud environment
• Manage schema evolution and security using Kafka Schema Registry and Kafka Security Manager (KSM)
• Oversee the assessment, design, building, and maintenance of scalable platforms
Requirements:
• Kafka experience: Kafka Connect, Kafka Streams, and other Kafka ecosystem … tools
• Kafka knowledge: Kafka Security Manager (KSM), Kafka Schema Registry, and Kafka architecture and internals
• Experience with Apache NiFi
• Experience with AWS
• Experience with MongoDB and Redis
• Experience with agile development methodologies
• Data pipeline experience: you will be involved with acquiring and prepping data …
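Managing schema evolution with a schema registry hinges on compatibility rules: a backward-compatible change, for instance, adds a field with a default so that consumers on the new schema can still read records written under the old one. As a library-free Python sketch of that idea (the schemas and function names here are illustrative, not the Schema Registry API):

```python
# Illustrative sketch of backward-compatible schema evolution:
# a reader on schema v2 fills in defaults for fields that records
# written under schema v1 do not carry. (Hypothetical schemas.)

SCHEMA_V1 = {"fields": {"id": None, "name": None}}
# v2 adds "region" with a default -- a backward-compatible change.
SCHEMA_V2 = {"fields": {"id": None, "name": None, "region": "EMEA"}}

def read_with_schema(record, schema):
    """Project a record onto a schema, applying defaults for newer fields."""
    out = {}
    for field, default in schema["fields"].items():
        if field in record:
            out[field] = record[field]
        elif default is not None:
            out[field] = default  # field added in a later schema version
        else:
            raise ValueError(f"record missing required field {field!r}")
    return out

old_record = {"id": 1, "name": "orders"}  # written under v1
print(read_with_schema(old_record, SCHEMA_V2))
# {'id': 1, 'name': 'orders', 'region': 'EMEA'}
```

A forward-incompatible change, by contrast, would add a required field with no default, breaking readers of old data.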
Job ID Number: R5291. Employment Type: Full time. Worksite Flexibility: Remote.
Job Summary: As the Lead Confluent/Kafka Engineer, you will be responsible for architecting and designing scalable, fault-tolerant, and high-performance Kafka-based data streaming solutions.
Job Description: We are looking for a Lead Confluent/Kafka Engineer for a contract-to-hire opportunity that is full-time and fully remote.
What You'll Do:
• Architect and design scalable, fault-tolerant, and high-performance Kafka-based data streaming solutions
• Lead technical design sessions and provide architectural guidance to junior engineers
• Develop and maintain Confluent Platform components, including Kafka Connect, Kafka Streams, Flink, TableFlow, and ksqlDB
• Implement and manage monitoring and alerting systems for the Kafka cluster
• Lead troubleshooting and resolution of complex issues within the data streaming platform
• Perform root cause analysis and implement corrective actions to prevent future …
s business problems. We're looking for a Software Engineer with a passion for streaming data and a strong interest in working with Apache Kafka and modern data pipelines. This role is hands-on, focused on building stream transformations and data integrations, not on maintaining infrastructure. You'll help deliver business-critical data flows using Kafka Connect, Kafka Streams, and Snowflake, enabling real-time analytics and platform integration across enterprise systems.
What You'll Do:
• Build and maintain data pipelines using Kafka Connect, Kafka Streams, and KSQL.
• Implement real-time transformations, enrichments, and routing logic for streaming data.
• Integrate Kafka with databases and cloud data platforms like Snowflake.
• Collaborate with leads and stakeholders to turn data requirements into running code.
• Ensure reliable data delivery across producers, Kafka topics, and downstream consumers.
Must-Have Requirements:
• Experience as a software engineer, working with Java …
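In Kafka Streams this kind of pipeline is expressed as a Java topology (filter, mapValues, branch). As a library-free illustration of the transform, enrich, and route pattern described above, a minimal Python sketch (topic names, fields, and reference data are all hypothetical):

```python
# Library-free sketch of a streaming transform -> enrich -> route stage.
# In a real deployment this logic would live in a Kafka Streams topology
# (filter / mapValues / split); everything named here is hypothetical.

REFERENCE = {"cust-1": {"tier": "gold"}, "cust-2": {"tier": "silver"}}

def transform(event):
    """Drop malformed events and normalise field names (like filter + mapValues)."""
    if "customer_id" not in event or "amount" not in event:
        return None  # filtered out of the stream
    return {"customer": event["customer_id"], "amount": float(event["amount"])}

def enrich(event):
    """Join against reference data, like a stream-table (KTable) lookup."""
    tier = REFERENCE.get(event["customer"], {}).get("tier", "unknown")
    return {**event, "tier": tier}

def route(event):
    """Pick a destination topic, like stream.split()/branch()."""
    return "priority-orders" if event["tier"] == "gold" else "standard-orders"

raw = [{"customer_id": "cust-1", "amount": "42.0"}, {"bad": "record"}]
for e in raw:
    t = transform(e)
    if t is not None:
        print(route(enrich(t)))
```

The same shape carries over to the real API: each function corresponds to one DSL operator, with state (the reference table) supplied by a changelog-backed store rather than an in-memory dict.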
advisor, you will collaborate closely with product management and engineering, serving as a key advocate for Confluent's platform. This role demands expertise in Kafka, distributed systems, and pre-sales engineering, while engaging with cross-functional teams to drive product success and customer outcomes. Ideal candidates may have backgrounds …
…Time Architectures: Understand customer challenges with traditional Data Warehouses, Data Lakes, and Batch Analytics workflows, and guide them toward real-time, distributed architectures using Kafka, Flink, Kafka Streams, and modern ETL/ELT frameworks. Help customers optimize their data platforms by focusing on early-stage data enrichment and filtering … better performance and cost efficiency.
Provide Technical Expertise: Assist customers and sales teams in designing, deploying, and optimizing real-time data streaming platforms, integrating Kafka with distributed processing, and ensuring alignment with business goals. Architect solutions to unify operational and analytical workloads, enabling a data mesh or streaming-first …
to manage feature delivery through sprints and scrums. Desirable: Familiarity with cloud platforms (AWS, Azure) and containerisation (Docker, Kubernetes); exposure to messaging systems like Kafka or Solace; financial services background or experience working on large-scale enterprise applications; understanding of SDLC and scalable web application architecture. About Barclays: Barclays …
React, Angular, or Vue.js for full-stack development is a plus. Event-Driven Architecture: Experience with event-driven architectures or message queuing systems (e.g., Kafka, RabbitMQ) is beneficial. Education: A degree in Computer Science, Engineering, or a related field is preferred but not required.
Have Previous Experience: 6+ years of professional Java (11+) development experience; experience with Spring Boot, JPA/Hibernate, and RESTful API development; messaging systems (Kafka, ActiveMQ, or RabbitMQ); hands-on experience with CI/CD pipelines (Jenkins, GitLab CI, or Azure DevOps); experience with SQL and NoSQL databases (PostgreSQL, MongoDB …
to translate business and research needs into engineering roadmaps. Collaborative mindset, capable of interfacing with multi-disciplinary teams. Nice to Have: Experience with Redis, Kafka, or similar real-time streaming/data platforms. Familiarity with financial APIs, FIX protocol, or trading exchange integrations. Education: Bachelor’s or Master’s …
including modules such as Spring Data, Security, and Cloud), Microservice Architecture, and SQL/NoSQL databases. Proficiency with ORM frameworks (e.g., Hibernate), messaging tools (Kafka, Kinesis, Redis), and cloud infrastructure technologies (AWS, Docker, Kubernetes, Terraform). Strong understanding of CI/CD pipelines, observability tools (e.g., Datadog), and Agile …
development processes, collaborate with cross-functional teams, and contribute to continuous improvement initiatives. Desirable: Exposure to cloud platforms (AWS, Azure, GCP) and messaging tools (Kafka, Solace); experience with containerisation (e.g., Docker, Kubernetes); background in financial services or enterprise-grade applications; understanding of SDLC and web application architecture.
AI/ML: TensorFlow or Databricks for predictive analytics. Integration Technologies: API Management: Apigee, AWS API Gateway, or MuleSoft. Middleware: Red Hat Fuse or Kafka for asynchronous communication. Cloud Technologies: Providers: AWS (EC2, Lambda), Azure (AKS), or Google Cloud (BigQuery). Containers: Docker, Kubernetes for scalable deployments. Serverless: AWS …
little supervision. • Proficiency and/or knowledge of the following: o Angular, JavaScript, CSS, HTML, Material UI, Java, etc. o AWS Cloud, EKS, Kubernetes, Kafka, Jenkins, Ansible, Docker, Git, Red Hat Enterprise Linux (RHEL) o Oracle/MySQL/Postgres, microservices, Spring Boot, Java. • Kibana or Dynatrace dashboards for monitoring …
The full stack engineers for Commercial Technology will be proficient in C# (.NET Core), Visual Studio, Apache Web Services, XML, and RESTful API/Kafka & event-driven integration capabilities. Must have development experience in Azure Integration Services (Logic Apps, Azure Functions, Azure Event Hub, Azure Redis, Cosmos DB, Blob …
of tiered web application architectures. Comfortable in both Unix and Windows environments. Development experience with Spring Boot to build … Experience with messaging frameworks like Kafka and RabbitMQ. Experience with software version control (Git) and issue/project tracking (JIRA, etc.). Experience with logging frameworks like log4j, Splunk, and Stackdriver. Front End …
is a backend-focused role, leveraging your strong background in Java, Python, workflow automation solutions (e.g., Workato), web services (REST/OpenAPI), messaging frameworks (Kafka), data platforms (Snowflake), cloud technologies (AWS/GCP), and containerization (Kubernetes/Docker) to build innovative solutions. You will use your strong knowledge of …
Working in headless RHEL environments. Desired qualifications: Clearance & Certification: currently holds an active Secret clearance; CompTIA Security+ certified. Tools: data pipeline tools (e.g., Apache NiFi, Kafka, Databricks, MinIO); cyber security tools (e.g., OpenSCAP, Snyk, JFrog Xray, Anchore); secret management tools such as HashiCorp Vault. Experience: cloud computing infrastructure or development …
JPA. Experience working with Object Relational Mapping (ORM) or NoSQL databases, such as MySQL, PostgreSQL, and Elasticsearch. Familiarity with message queue implementations, such as Kafka. Good understanding of Git, GitLab, CI/CD pipelines, and CM best practices. Experience developing ETL/Big Data applications. Experience designing RESTful web …
tools, such as Jira and Confluence. Preferred Qualifications: - Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, and NiFi - Working knowledge of public keys and digital certificates - Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit …
domain expertise and the ability to drive the team, integrations, and migrations, and to create the approach. Good to have experience in Angular and front-end technologies. Understanding of Kafka, PCF, integration patterns, security standards, concurrency and multi-threading, Collections, PostgreSQL, Azure, Docker, and Kubernetes. Hands-on, high-energy, detail-oriented, proactive, and able to …
tools, such as Jira and Confluence. Desired Qualifications: Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, and NiFi. Working knowledge of public keys and digital certificates. Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit …
Scripts. What we'd like you to have: Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, and NiFi. Working knowledge of public keys and digital certificates. Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit …