organizations. BS/MS in Computer Science (or equivalent industry experience). Expert in Java, Spring Boot, RESTful microservices, and event-driven architectures (Kafka, Avro). Strong frontend acumen with React, JavaScript/TypeScript, and API-first integration (OpenAPI). Deep familiarity with Kubernetes orchestration, container security, and infrastructure as code.
Technology Stack
Backend: Java, Spring Boot, Entity Framework, REST
Frontend: React, Angular, JavaScript/TypeScript
Data & Messaging: Oracle, PostgreSQL, Kafka, Avro
Infrastructure & DevOps: Kubernetes (PKS), Docker, Helm, Terraform
APIs & Modeling: API-first (OpenAPI), UML
Languages: English C1
What can we offer? 23 days of Annual Leave plus More ❯
Duration: 6+ Months
Job Summary: We are looking for an experienced Kafka Architect to design and implement scalable, high-throughput data streaming solutions using Apache Kafka on Cloudera (on-premises). The ideal candidate will have a strong background in distributed systems, data pipelines, and real-time data processing. … using tools like Ansible, Terraform, or Cloudera Manager.
-Provide technical leadership and mentorship to junior team members.
Required Skills:
-Strong hands-on experience with Apache Kafka (including Kafka Connect, Kafka Streams).
-Experience with the Cloudera distribution of Kafka in on-premises environments.
-Proficiency in designing high-volume, low-latency data pipelines.
-Solid knowledge of Kafka internals – topics, partitions, consumer groups, offset management, etc.
-Experience with data serialization formats such as Avro, JSON, and Protobuf.
-Proficient in Java, Scala, or Python for Kafka-based development.
-Familiarity with monitoring tools (Prometheus, Grafana, Confluent Control Center, etc.).
-Understanding of networking, security (SSL More ❯
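The Kafka internals the listing above calls out (topics, partitions, consumer groups, offset management) can be pictured with a small sketch. This is an illustrative simulation, not the real Kafka client API: it shows only the bookkeeping idea that a group commits a per-(topic, partition) offset, so a restarted consumer resumes where the group left off. All names here (`OffsetStore`, `consume`) are hypothetical.

```python
# Illustrative sketch of consumer-group offset management (NOT the real
# Kafka client API): a group commits the next-offset-to-read per
# (topic, partition), so a restarted consumer resumes from there.

class OffsetStore:
    """Simulates the committed-offset bookkeeping for one consumer group."""
    def __init__(self):
        self.committed = {}  # (topic, partition) -> next offset to read

    def commit(self, topic, partition, offset):
        self.committed[(topic, partition)] = offset

    def position(self, topic, partition):
        # No committed offset yet -> start at 0 here (real Kafka applies
        # the auto.offset.reset policy: earliest/latest instead).
        return self.committed.get((topic, partition), 0)

def consume(log, store, topic, partition, max_records):
    """Read up to max_records from one partition's log, then commit."""
    start = store.position(topic, partition)
    records = log[start:start + max_records]
    store.commit(topic, partition, start + len(records))
    return records

log = ["m0", "m1", "m2", "m3", "m4"]   # one partition's append-only log
store = OffsetStore()
first = consume(log, store, "orders", 0, 3)   # -> ["m0", "m1", "m2"]
second = consume(log, store, "orders", 0, 3)  # -> ["m3", "m4"]
```

The second call picks up at offset 3 without re-reading earlier records, which is exactly what committed offsets buy a consumer group across restarts.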
Manchester, England, United Kingdom Hybrid / WFH Options
QinetiQ
Worked On Recently That Might Give You a Better Sense Of What You’ll Be Doing Day To Day
Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking
Implementing AWS Security Control Policies to manage global access privileges.
Validating and converting data into a common data format using Avro Schemas and JOLT
Designing a proxy in front of our Kubernetes cluster egress to whitelist traffic and mitigate any security risks
Implementing Access/Role Based Access Control in ElasticSearch
Writing a React UI using an ATDD approach with Cypress
Right Now More ❯
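The "common data format using Avro Schemas and JOLT" task above can be pictured as a JOLT-style shift transform: a spec maps paths in the source document to paths in the target document. This is a minimal sketch with hypothetical field names; real JOLT specs are JSON documents supporting wildcards and arrays, which this toy version omits.

```python
# Minimal JOLT-style "shift" transform: the spec maps a dotted input
# path to a dotted output path. Real JOLT specs are JSON and support
# wildcards/arrays; this sketch handles only flat dotted paths.

def shift(record, spec):
    out = {}
    for src_path, dst_path in spec.items():
        # Walk the source path down into the nested input record.
        node = record
        for key in src_path.split("."):
            node = node[key]
        # Build the destination path in the output record.
        dst_keys = dst_path.split(".")
        target = out
        for key in dst_keys[:-1]:
            target = target.setdefault(key, {})
        target[dst_keys[-1]] = node
    return out

# Hypothetical input and spec, for illustration only.
raw = {"sensor": {"id": "s-17", "reading": {"value": 21.5}}}
spec = {"sensor.id": "device.id", "sensor.reading.value": "device.temperature"}
common = shift(raw, spec)
# common == {"device": {"id": "s-17", "temperature": 21.5}}
```

In a real pipeline the shifted output would then be validated against (and serialized with) an Avro schema describing the common format.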
Salford, England, United Kingdom Hybrid / WFH Options
QinetiQ Limited
worked on recently that might give you a better sense of what you’ll be doing day to day:
Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking
Implementing AWS Security Control Policies to manage global access privileges.
Validating and converting data into a common data format using Avro Schemas and JOLT
Designing a proxy in front of our Kubernetes cluster egress to whitelist traffic and mitigate any security risks
Implementing Access/Role Based Access Control in ElasticSearch
Writing a React UI using an ATDD approach with Cypress
Right now More ❯
MS in Computer Science (or equivalent industry experience).
• Technical Mastery
o Expert in Java, Spring Boot, RESTful microservices, and event-driven architectures (Kafka, Avro).
o Strong frontend acumen with React, JavaScript/TypeScript, and API-first integration (OpenAPI).
o Deep familiarity with Kubernetes orchestration, container security … world-class engineering culture.
Technology Stack
• Backend: Java, Spring Boot, Entity Framework, REST
• Frontend: React, Angular, JavaScript/TypeScript
• Data & Messaging: Oracle, PostgreSQL, Kafka, Avro
• Infrastructure & DevOps: Kubernetes (PKS), Docker, Helm, Terraform
• APIs & Modeling: API-first (OpenAPI), UML
Languages:
• English C1
• Spanish C2
What can we offer? 23 days More ❯
datasets. Implement and manage Lake Formation and AWS Security Lake, ensuring data governance, access control, and security compliance.
Optimise file formats (e.g., Parquet, ORC, Avro) for S3 storage, ensuring efficient querying and cost-effectiveness.
Automate infrastructure deployment using Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation. …
Data Engineering – Proficiency in building and optimising data pipelines and working with large-scale datasets.
File Formats & Storage – Hands-on experience with Parquet, ORC, Avro, and efficient S3 storage solutions.
DevOps & Automation – Experience with Terraform, CloudFormation, or CDK to automate infrastructure deployment.
Security & Compliance – Familiarity with AWS Security Lake More ❯
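Why the listing above pairs columnar formats (Parquet, ORC) with "efficient querying" on S3: a columnar file lays each column out contiguously, so a query that needs one column reads only that column's bytes, while a row-oriented layout (Avro-like) must scan every field of every row. A toy sketch with in-memory "files" and hypothetical field names:

```python
# Column pruning in miniature: row-oriented vs columnar layout of the
# same records. Real Parquet/ORC add row groups, encodings, and
# compression; this sketch shows only the access-pattern difference.

rows = [
    {"user": "a", "country": "UK", "amount": 10.0},
    {"user": "b", "country": "FR", "amount": 7.5},
    {"user": "c", "country": "UK", "amount": 3.0},
]

# Row layout: one whole record after another (Avro-like).
row_file = rows

# Columnar layout: one contiguous array per column (Parquet-like).
col_file = {k: [r[k] for r in rows] for k in rows[0]}

def sum_amount_row_oriented(f):
    # Must walk every record, touching all three fields per row.
    return sum(r["amount"] for r in f)

def sum_amount_columnar(f):
    # Touches only the "amount" column; "user"/"country" stay unread.
    return sum(f["amount"])
```

On S3 this difference is what cuts both scan time and cost: engines like Athena bill by bytes scanned, and a columnar file lets them skip the columns a query never mentions.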
Join us as an Automation Tester in Private Bank & Wealth Management (PBWM) to design and implement test cases, execute testing strategies to validate functionality, and drive innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings More ❯
knowledge of warehousing and ETLs. Extensive knowledge of popular database providers such as SQL Server, PostgreSQL, Teradata and others.
• Proficiency in technologies in the Apache Hadoop ecosystem, especially Hive, Impala and Ranger
• Experience working with open file and table formats such as Parquet, AVRO, ORC, Iceberg and Delta Lake More ❯