React, Angular, or Vue.js for full-stack development is a plus. Event-Driven Architecture: Experience with event-driven architectures or message queuing systems (e.g., Kafka, RabbitMQ) is beneficial. Education: A degree in Computer Science, Engineering, or a related field is preferred but not required.
London, South East England, United Kingdom (Hybrid / WFH Options)
Trireme
to translate business and research needs into engineering roadmaps. Collaborative mindset, capable of interfacing with multi-disciplinary teams. Nice to Have: Experience with Redis, Kafka, or similar real-time streaming/data platforms. Familiarity with financial APIs, FIX protocol, or trading exchange integrations. Education: Bachelor’s or Master’s …
Docker, Kubernetes), RESTful APIs, microservices architecture, and big data technologies (Hadoop, Spark, Flink). Knowledge of NoSQL databases (MongoDB, Cassandra, DynamoDB), message queueing systems (Kafka, RabbitMQ), and version control systems (Git). Preferred Skills: Experience with natural language processing libraries such as NLTK, spaCy, or Hugging Face Transformers. Familiarity …
continuous learning and staying up-to-date with emerging technologies and industry best practices. Nice-to-Have Qualifications: Familiarity with message brokers such as Kafka or AWS SQS for handling asynchronous communication. Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation for managing cloud infrastructure. Knowledge of …
London, South East England, United Kingdom (Hybrid / WFH Options)
Anson McCade
in any of the following technologies is a plus:
Languages & Frameworks: Java, Spring Boot, TypeScript, JavaScript, React
Databases: PostgreSQL, SQL Server, MongoDB
Messaging & Integration: Kafka, ActiveMQ, REST, API Gateways, WSO2
Cloud & DevOps: AWS, Docker, Ansible, Jenkins
Search & Analytics: Elasticsearch
Culture & Team: Engineers are supported by a collaborative, inclusive team …
Required Expertise: Extensive expertise in application architecture using modern technologies such as cloud-native development, 12-Factor Apps, microservices, serverless, API management, Kafka, etc. Deep knowledge of microservices, containers, REST API development, API management tools (e.g. MuleSoft, Apigee), and Kafka. Solution architect with broad expertise in a wide …
Skills: Experience in commodities markets or broader financial markets. Knowledge of quantitative modeling, risk management, or algorithmic trading. Familiarity with big data technologies like Kafka, Hadoop, Spark, or similar. Why Work With Us? Impactful Work: Directly influence the profitability of the business by building technology that drives trading decisions.
familiar with, and able to install, configure and manage various persistence technologies, including database technologies (NoSQL/SQL) and broker/queuing systems (e.g. Kafka, SQS), with knowledge of HA/clustering. You are comfortable with various logging, monitoring and alerting platforms and have expertise in the usage (and …
creating thread-safe concurrent code in Java or another JVM-based language
Hands-on experience with event-driven architecture and distributed messaging technologies (e.g. Kafka)
Experience with Docker and running production workloads on Kubernetes
Experience using and designing schemas/data structures in SQL and NoSQL databases (e.g. Oracle …
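For illustration of the kind of work this listing describes (not code from the posting itself): the official Kafka Java client's KafkaProducer is documented as thread-safe, so a single instance can be shared across worker threads. A minimal sketch, with an assumed broker address and a hypothetical "orders" topic:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SharedProducerExample {
    public static void main(String[] args) throws InterruptedException {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // KafkaProducer is thread-safe, so one instance is shared by all workers.
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);

        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 100; i++) {
            final int id = i;
            // "orders" is a hypothetical topic name used only for this sketch.
            pool.submit(() ->
                producer.send(new ProducerRecord<>("orders", Integer.toString(id), "payload-" + id)));
        }

        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
        producer.close(); // flushes any buffered records before exiting
    }
}
```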
with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS …
London, South East England, United Kingdom (Hybrid / WFH Options)
Understanding Recruitment
DevOps Engineers. As a Senior Java Developer, you will:
Have 5+ years' experience as a Software Engineer
Experience developing with: Java, Spring Boot, Microservices, Kafka (or other message queues, e.g. RabbitMQ), AWS, Docker, Kubernetes
A desire to be part of an important mission
Be adaptable to working in the …
Spring Boot, Spring Cloud)
Expertise in cloud platforms (Google Cloud preferred, AWS or Azure also relevant)
Experience with microservices architecture and event-driven systems (Kafka, RabbitMQ)
Proficiency in containerisation (Docker, Kubernetes)
Familiarity with CI/CD pipelines and infrastructure automation
This is an exciting role for a proactive developer …
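As an illustrative sketch of the event-driven, Spring Boot style of service this role mentions (assuming the spring-kafka dependency on the classpath, a hypothetical "payments" topic, and a hypothetical consumer group; none of these come from the listing):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class EventConsumerApplication {
    public static void main(String[] args) {
        SpringApplication.run(EventConsumerApplication.class, args);
    }
}

@Component
class PaymentEventListener {

    // "payments" and "billing-service" are hypothetical names for this sketch;
    // broker addresses would come from spring.kafka.bootstrap-servers in application.yml.
    @KafkaListener(topics = "payments", groupId = "billing-service")
    public void onPaymentEvent(String message) {
        // In a real service this would deserialize the event and trigger domain logic.
        System.out.println("Received payment event: " + message);
    }
}
```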
a production environment. Some experience with relational and non-relational databases. Qualifications/Nice to have: Experience with a messaging middleware platform like Solace, Kafka or RabbitMQ. Experience with Snowflake and distributed processing technologies (e.g., Hadoop, Flink, Spark …
development experience
Experience of designing distributed systems, microservices, micro-frontend UIs
Experience of using cloud services such as AWS and distributed systems such as Kafka, Kubernetes, S3, DynamoDB, MongoDB or any other NoSQL database
Experience of following TDD and a passion for clean code principles
Proficiency in programming in Java …
with DevOps principles and tooling such as Infrastructure as Code (Terraform) and CI/CD (GitHub Actions, Jenkins)
Knowledge of stream processing technologies like Kafka would be useful
Experience working with ITSM systems like JSM, Zendesk or ServiceNow
Experience building/maintaining automated incident management workflows
Experience developing with …
or other advanced analytics infrastructure. Familiarity with infrastructure-as-code (IaC) tools such as Terraform or CloudFormation. Experience with modern data engineering technologies (e.g., Kafka, Spark, Flink). Why join YouLend? Award-Winning Workplace: YouLend has been recognised as one of the "Best Places to Work 2024" by …
team dedicated to making a meaningful impact within the retail industry while staying at the forefront of technological advancements. We support and run multiple Kafka clusters, Confluent-managed and self-managed, in AWS. We provide self-service pipelines that allow engineers to use the Kafka clusters via GitHub Actions. We have a number of services running in Kubernetes to manage the clusters: metrics, Kafka Connect, Schema Registry, and GUI tools. We keep track of everything using Git, Jira & Confluence, with daily stand-ups and empowered teamwork!
About the role: Managing and running a Kafka platform and Confluent products. Assist in architecting systems and products. Visualise your product's behaviour in production.
What we're looking for: Required: In-depth knowledge of Kafka, the Kafka ecosystem, and administration of a Kafka platform. Enthusiastic and motivated team player. Be open to learning other languages and platforms …
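This listing describes self-service pipelines that provision resources on the Kafka clusters. As a hedged sketch of the kind of task such tooling automates (the posting does not show its actual pipelines), the snippet below creates a topic with Kafka's AdminClient; the topic name, partition count, replication factor, and broker address are all assumptions:

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.List;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class TopicProvisioner {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        // Assumed broker address; a managed Confluent cluster would also need security settings.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Hypothetical topic: 6 partitions, replication factor 3.
            NewTopic topic = new NewTopic("retail.orders.v1", 6, (short) 3);
            admin.createTopics(List.of(topic)).all().get(); // block until the broker confirms
            System.out.println("Created topic " + topic.name());
        }
    }
}
```

In a self-service setup, a pipeline (for example, a GitHub Actions workflow) would typically run a tool like this against a configuration file that engineers submit via pull request.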
skills to convey information and results clearly
Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc.
Experience with event messaging frameworks like Apache Kafka
The hiring range for this position in Santa Monica, California is $136,038 to $182,490 per year, in Glendale, California is …
governance (GDPR, privacy regulations)
Build and automate CI/CD pipelines for efficient data operations
Work with real-time data processing tools such as Kafka, AWS Kinesis, or Azure Stream Analytics
Partner with data analysts, product teams, and engineers to ensure seamless data accessibility
Research emerging technologies to improve …
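As an illustrative sketch of real-time processing in the Kafka ecosystem referenced above (not taken from the listing), the example below counts events per key with Kafka Streams; the "page-views" and "page-view-counts" topics, the application id, and the broker address are assumptions:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class PageViewCounter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "page-view-counter"); // assumed app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw page-view events keyed by page id and count them per key.
        KStream<String, String> views = builder.stream("page-views");
        views.groupByKey()
             .count()
             .toStream()
             .to("page-view-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```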