on test automation, identifying test cases to be automated, writing scripts and integrating them into the development lifecycle. Engaging with the wider DevOps lifecycle toolchain, such as Kubernetes and Kafka, opens up future DevOps opportunities for you. Collaborate with developers to improve code and streamline testing. Work with HIL and hardware interactions (no experience required). Mid-level role, around …
Engineering Manager - team of 6 💻 Node.js, TypeScript, React, AWS, Kafka (backend-leaning) - can come from any tech background 🏠 Central London - 3 days a week in office 💵 £130/£135k base + bonus + equity - total comp c. £190k per annum. Evoke is working with a publicly traded technology unicorn business in the B2B SaaS space that is looking to …
Science or closely related, and 24 months of experience as a Technical Consultant or related. Special Requirements: 24 months of prior experience working with (1) integration technologies (e.g. REST APIs, Kafka, Space API, file-based transfers, ETL pipelines, and API gateways), (2) Single Sign-On (SSO) solutions (e.g., Okta, Azure AD), (3) SQL, PowerShell, Python, and batch scripting for automation …
and database monitoring alerts. Required Skills & Experience: Strong hands-on knowledge of Snowflake and SQL Server (including SSIS). Experience in ETL, data migration, and performance optimization. Familiarity with Oracle, Kafka, ODI, and Azure SQL DB. Proven ability in incident, problem, and change management. Good understanding of data governance, security, and auditing frameworks. Exposure to monitoring and observability tools (e.g. …
business goals. Minimum Qualifications: Position requires a Bachelor's degree in Computer Science, Engineering, or a related field followed by 4+ years of experience in a modern development stack: Golang, Kafka, REST API …
blue-chip corporate. Demonstrable experience leading AI value delivery, CoE mobilisation, or multi-disciplinary product teams. Technical literacy across data and AI ecosystems (Azure, GCP, Databricks, Snowflake, Power BI, Kafka, LLMs). Exceptional stakeholder management up to CIO/CDO level with a track record of influence and measurable delivery. Strong grasp of responsible-AI, governance frameworks, and value …
vector search, RAG, and feature engineering. Implement secure access and governance controls including RBAC, SSO, token policies, and pseudonymisation frameworks. Support batch and streaming data flows using technologies like Kafka, Airflow, and Terraform. Monitor and optimise cloud resource usage to ensure performance and cost efficiency. Collaborate with cross-functional teams on architecture decisions, technical designs, and data governance standards …
City of London, London, United Kingdom Hybrid / WFH Options
Paritas Recruitment
enterprise data architecture, data strategy, and cloud architecture and roadmap development of data transformation programme. Expertise in operational data use cases: real-time data processing, APIs, messaging/streaming (Kafka, Pub/Sub, MQ), MDM, and transactional data flows. Strong background in data modeling (conceptual, logical, physical) with a focus on domain-driven design (DDD). Proven ability to …
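As a rough illustration of the real-time messaging work listings like this describe, here is a minimal Kafka producer sketch using the plain Java client. The broker address, topic name, key, and payload shape are assumptions for illustration only, not details taken from the advert.

```java
// Minimal sketch: publishing an operational event to Kafka with the plain Java client.
// Broker address, topic name, and payload are illustrative assumptions.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OperationalEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for full broker acknowledgement

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by a business identifier keeps all events for one entity on the same partition,
            // which is a common way to preserve per-entity ordering in operational flows.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("customer-events", "customer-123", "{\"type\":\"ADDRESS_UPDATED\"}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Sent to %s-%d@%d%n", metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```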
City of London, London, United Kingdom Hybrid / WFH Options
Bounce Digital
re Looking For ✅ 3+ years’ experience leading high-performing engineering teams (ideally with senior or staff-level engineers). ✅ Strong full-stack or backend background – ideally with Node.js, React, Kafka or similar. ✅ A balance of strategic thinking and bias for action – someone who can move things forward, not overthink them. ✅ Excellent stakeholder management – you know how to align technical …
to create and optimize data models, indexes, stored procedures, access, and scalability. Non-Relational Data Systems: Experience working with NoSQL, streaming, graph, time-series, or edge databases including MongoDB, Kafka, JanusGraph, and InfluxDB. Understanding of binary encoding and compression approaches for complex data structure storage and retrieval. Data Strategy: Production experience delivering aligned data models that enable resource isolation …
migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data). Proficiency with Parquet/Delta Lake or other modern data storage formats. Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing. Knowledge of data architectures supporting AI/ML workloads, including vector databases, feature stores and MLOps pipelines. Experience processing and …
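For context on the streaming-architecture requirement above, the sketch below shows a minimal Kafka Streams topology that filters a raw event topic into a clean topic, which a downstream connector could then land in Parquet/Delta. Topic names, the application id, and the filtering rule are illustrative assumptions, not details from the advert.

```java
// Minimal Kafka Streams sketch of a real-time filtering step in an ingestion pipeline.
// Topic names, application id, and the emptiness check are illustrative assumptions.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class IngestTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "raw-events-cleaner");  // assumed app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("raw-events");
        raw.filter((key, value) -> value != null && !value.isBlank()) // drop empty payloads before they reach the lake
           .to("clean-events");                                       // downstream sink topic for a storage connector

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```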
to build the financial ecosystem of the future. We're evolving towards: A Cloud-Native, DevOps-First Culture - Moving towards a fully cloud-hosted, automated platform built with Kubernetes, Kafka, and Infrastructure as Code (IaC). A Real-Time Financial Ecosystem - Shifting from data at rest to data in motion, embracing event-driven architecture to power the real-time … rewrite - our goal is to transition to a more scalable, cloud-native architecture while maintaining stability. As a Senior Software Engineer, you'll be working with .NET, Azure, and Kafka, ensuring that applications are optimised for cloud deployment and aligned with modern development practices. You'll assess the right .NET versioning strategy, manage dependency transitions (e.g., moving from MSMQ … to Kafka), and ensure smooth migration of libraries and frameworks. Alongside modernisation, you'll also work on rolling out new features in the existing system and on new projects. What You'll Be Doing: Coaching and mentoring a team of 2-4 engineers. Modernising and containerising existing applications for Azure deployment. Building new cloud-native, event-driven applications. Ensuring …
Wimbledon, England, United Kingdom Hybrid / WFH Options
Morson Edge (Financial Services)
you’ll have the chance to influence technical decisions and shape how new features are delivered. Tech environment: Java 17, Spring, Spring Boot; Reactive programming with Akka; Microservices architecture; Kafka, JMS, RabbitMQ, ActiveMQ; Docker/Kubernetes, AWS, MongoDB. What we’re looking for: Solid hands-on experience with Java 11+ (ideally Java 21); Good understanding of Reactive Programming concepts … with Akka, WebFlux, RxJava etc.; Experience working with microservices and distributed systems; Background with Kafka and JMS-related messaging tools; Knowledge of multithreading, concurrency, and performance tuning; Familiarity with AWS and MongoDB; Previous experience in financial services, ideally within payments. Why join: Work on systems used by major global banks; Be part of a highly collaborative, engineering-driven culture …
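To make the Kafka side of this stack concrete, here is a minimal Spring Boot listener sketch using spring-kafka's @KafkaListener. The topic, group id, and logging-only handler are assumptions for illustration; a reactive variant (e.g. reactor-kafka alongside WebFlux) would follow the same consume-and-process shape.

```java
// Minimal sketch of a Spring Boot Kafka consumer using spring-kafka.
// Topic name, group id, and the handler body are illustrative assumptions.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class PaymentsConsumerApplication {
    public static void main(String[] args) {
        SpringApplication.run(PaymentsConsumerApplication.class, args);
    }
}

@Component
class PaymentEventListener {
    // spring-kafka builds the underlying consumer from spring.kafka.* properties
    // (bootstrap servers, deserializers, etc.) supplied in application configuration.
    @KafkaListener(topics = "payment-events", groupId = "payments-service")
    public void onMessage(String payload) {
        // A real service would deserialise the payload and hand it to domain logic.
        System.out.println("Received payment event: " + payload);
    }
}
```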
Locations: San Francisco, San Jose. At EY, we’re all in to shape your future with confidence. We’ll help you succeed …
end to end. Your Profile - Technical Skills (proven experience of delivering the responsibilities against the following tech stack): Working experience on Linux and Windows operating systems. Working knowledge of IBM MQ, Kafka, Oracle, MS SQL Server. Configuration, management and support experience of Apache web servers and application servers – JBoss, WebSphere. Implementation knowledge of Chef, Ansible, firewall, PKI, Cyber wallet and …
Qualifications: Strong solution architect with AI experience and hands-on Java experience; should have strong Java knowledge and be able to write backend code for the application. Knowledge of Kafka, Azure cloud and API management is required as well. Experience with data modelling at a conceptual, logical and physical level, preferably with Enterprise Architecture, ARIS and Erwin. Strong data …
San Francisco, California, United States Hybrid / WFH Options
esrhealthcare
seamless care experience. What Will You Need? 5+ years of backend engineering experience using Node.js, TypeScript, and PostgreSQL. Deep experience with event-driven systems and real-time data processing (Kafka, RabbitMQ, webhooks). Strong grasp of distributed system architecture and messaging infrastructure. Bonus points if you've worked with LLMs or in AI-integrated environments. Security-forward mindset experience …
profiling, journey maps, and human-centric design. Strong knowledge and understanding of different types of business and operating models (existing, new, emerging and hybrid). Experience with Databricks, Confluent Kafka, Power BI, Tableau, Business Objects and SAP ERP, supporting data needs for highly available e-commerce applications, is preferred. Knowledge of supply chain, logistics and e-commerce. Enabling streamlined development …
consume from Kafka. Service decommissions. BAU work. Job Requirements - Must-Have Skillsets: Java 17, 21; JUnit 5, Cucumber; Spring Boot 3, 4; Tooling (Gradle, GitHub, Nexus, Artifactory, Sonar, JaCoCo); Kafka; Kubernetes, Helm; Databases (preferably HBase, PostgreSQL); Jenkins and CI/CD. Nice-to-Have Skills: Kafka Streams, data sinks; OCP; RabbitMQ; Observability (Datadog, Grafana, logging concepts and traces, Logback); AWS …
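As a sketch of the "consume from Kafka" responsibility in this listing, here is a plain consumer poll loop using the standard Java client (no Spring). The broker address, topic, and consumer group are assumptions for illustration only.

```java
// Minimal sketch: consuming records from Kafka with the plain Java client.
// Broker address, topic, and group id are illustrative assumptions.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderEventsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-events-consumer");    // assumed consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");        // start from the beginning if no committed offset

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("order-events"));                       // assumed topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Real processing (deserialisation, persistence, downstream calls) would go here.
                    System.out.printf("offset=%d key=%s value=%s%n", record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```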
driven architecture. Collaborate with engineering, product, and DevOps teams to ensure seamless integration and deployment of backend services. Own architectural decisions for systems involving SQL/NoSQL databases, Redis, Kafka, Solace, and other backend technologies. Lead initiatives for backend modernization, microservices migration, and platform scalability. Ensure security, compliance, and maintainability in backend systems. Evaluate and recommend new technologies, frameworks … or Node.js. Deep expertise in cloud architecture, especially with Microsoft Azure (App Services, AKS, Functions, Storage, Service Bus, etc.). Proven experience in event-driven architectures, using technologies like Kafka, Solace, or Azure Event Hubs. Solid understanding of microservices, API design, REST/gRPC, and service meshes. Strong experience with both SQL (e.g., SQL Server, PostgreSQL) and NoSQL databases …
The candidate should have implementation experience with ELK • If the candidate has only monitored an existing ELK setup, that will not be sufficient • Should have exposure to cloud and Kafka • Should also have exposure to the security domain • Strong understanding of the Elastic stack - Elasticsearch, Kibana, Logstash, Fleet and other integrations • Data engineering skill set to design and develop pipelines to …