Data Management At the core of VAULT is big data at scale. Our systems handle massive ingestion pipelines, long-term storage, and high-performance querying. We leverage distributed technologies (Kafka, Spark, Flink, Cassandra, Airflow, etc.) to deliver resilient, low-latency access to trillions of records, while continuously optimizing for scalability, efficiency, and reliability. We'll trust you to: Build … high-performance, distributed data pipelines and services Lead technical direction and ensure alignment with business needs Collaborate across teams using modern open-source tech (Kafka, FastAPI, Airflow, Spark, etc.) Mentor junior engineers and contribute to an inclusive team culture You will need to have: Bachelor's in Computer Science, Engineering, or related field (or equivalent experience) 4+ years of … an object-oriented programming language Deep background in distributed, high-volume, high-availability systems Fluency in AI development tools We would love to see: Experience with big data ecosystems (Kafka, Spark, Flink, Cassandra, Redis, Airflow) Familiarity with cloud platforms (AWS, Azure, GCP) and S3-compatible storage SaaS/PaaS development experience Container technologies (Docker, Kubernetes) Salary Range More ❯
developers and career growth Proficient in SDLC including design, coding, review, source control, deployment, and operations Awesome If You: Have experience with Rust/Java/Kotlin Know AWS, Kafka, Spark, Flink, or Beam, especially deployment and monitoring Have productized ML research projects Are familiar with Airflow, Kubernetes, data lake systems, and file formats and query tools like Parquet, ORC, and Athena Possess … relevant AWS or Kafka certifications Enjoy working in a diverse, innovative environment Are motivated by collaboration, learning, and exploring new tools Value company ownership and responsible business practices Appreciate flexible working policies and employee benefits like health insurance and volunteering More ❯
graph queries. Skills Requirements: • Expert with Java and Spring Boot; proficient using them to build enterprise-scale applications. • Experience building real-time data processing applications using streaming libraries like Kafka Streams. • Understanding of common Enterprise Integration Patterns (EIP) and how to apply them. • Experience with service containerization and deployment using Docker and/or Kubernetes. • Experience with Extract, Transform … Familiarity with Git and GitLab CI/CD. Nice to Haves: • Experience with graph databases such as Neo4j. • Experience building real-time data processing applications using streaming libraries like Kafka Streams. • Experience modeling data and relationships in graph databases. • Experience with networking concepts, protocols, and analysis (routers, switches, etc.). • Knowledge of SIGINT collection and analysis systems. Benefits Overview More ❯
of functional programming so you can expect to join a team that is applying principles from FP, Reactive Programming and Distributed Computing to build these services, using Scala, Akka, Kafka, Play and Cats, as well as a wide range of cloud-native technologies including AWS (Kinesis, DynamoDB, Lambda), Docker and Serverless. We have a mature DevOps culture in place, where … the team is responsible for the infrastructure and deployment of those applications - "You build it, you run it." What you will do: You will be using Scala, Akka, Kafka, Kinesis, and Dynamo to build and innovate our software that is distributed, reactive, and scalable. You will: Contribute to or lead a significant part of the implementation and deployment of More ❯
telecommuting from a home-based office in a hybrid work model. Primary Responsibilities: Design & develop log collection solutions with a key focus on Google SecOps, BindPlane, Beats, Logstash and Kafka Build, design and develop new log collection systems for on-prem and cloud environments (AWS, Azure & GCP) Build and support pipeline monitoring and alerting tools like GCP Monitoring and … experience Proven experience working within log collection setup and development Proven experience with RHEL Linux Server OS Experience working with Google SecOps basic search queries Experience with BindPlane, Logstash, Kafka, and GitHub Operations Proficiency in scripting/programming with Python and Go Preferred Qualifications: Proven ability to work on high- and low-level design Experience with cloud solution designs More ❯
expanding automation opportunities across the SDLC. Our tech stack includes C++ and Python, Comdb2 and Postgres for databases, Cassandra for NoSQL large-volume storage, Redis for caching, RabbitMQ and Kafka for messaging, and the FIX protocol for electronic trading communication. We'll trust you to: Help drive the architecture design and strategy for the Enterprise Reporting engineering organization Develop and … and automated testing. We'd love to see: Prior contributions to system design and architecture, and to scaling fault-tolerant, distributed systems. Familiarity with distributed messaging technologies such as RabbitMQ, Kafka, IBM MQ. Familiarity with XML, JSON, FIX, SWIFT, SFTP. Financial knowledge. Salary Range = 160,000 - 240,000 USD Annually + Benefits + Bonus The referenced salary range is based on the More ❯
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Fusion People
Senior Java Developer Job type: Contract - 12 months Rate: £700 Umbrella Location: Leeds (hybrid) x5 vacancies available We are looking for an outstanding Java Developer who can make a positive impact and wants to contribute to the most demanding and More ❯
Software Engineer II (PHP/Golang) - Remote Locations: Poland (Remote), United Kingdom (Remote), Spain (Remote) Full time Posted 11 days ago More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
and distributed system design for one of the most ambitious financial technology projects in the world. Why apply? Greenfield engineering - modern tech stack: Java, Spring Boot, Kubernetes (EKS & GKE), Kafka, Apache Ignite, MongoDB, RabbitMQ, Solace Massive scale - billions of risk calculations per day across trading desks globally Impact & visibility - shape the technical direction while mentoring global engineering teams Hybrid … Proven background in distributed systems, event-driven architecture, and high-performance platforms Strong leadership skills - able to influence, mentor, and set technical strategy Experience with messaging and data technologies (Kafka, RabbitMQ, MongoDB, Apache Ignite) If you're interested in shaping a truly global, cloud-native platform while remaining close to the engineering, we'd love to hear from you. More ❯
of open-source technology and multi-cloud solutions. Our vision is to become the trusted Data & AI Platform for everyone, leveraging the most popular open-source technologies like Apache Kafka, Aiven for PostgreSQL, Aiven for ClickHouse, and Aiven for OpenSearch, to help companies accelerate time-to-market, drive efficiency, and build innovative solutions across any cloud. Right now, we … and post-sales capacity. Suitable candidates must have proven experience in one or more of the following areas that will be required in their day-to-day job: Apache Kafka, Apache Cassandra, ClickHouse, PostgreSQL, MySQL, OpenSearch, or Redis. The position is full-time and permanent, located in London. What You'll Do: Help customers succeed, building successful relationships from … various Linux utilities and tools (installation, securing system, storage, etc.) and a good grasp of networking basics like DNS A relatively deep conceptual and hands-on understanding of Apache Kafka, Apache Cassandra, ClickHouse, PostgreSQL, MySQL, OpenSearch, or Redis. Act as a subject matter expert on Aiven service offerings Attention to detail and good problem-solving and project management skills More ❯
including data prep and labeling to enable data analytics. • Familiarity with various log formats such as JSON, XML, and others. • Experience with data flow, management, and storage solutions (e.g. Kafka, NiFi, and AWS S3 and SQS solutions) • Ability to decompose technical problems and troubleshoot both system and dataflow issues. • Must be certified DoD IAT II or higher (CompTIA Security+ … Experience with Java, including unit and integration testing. • Python: Experience with Python is desired. • SQL: Familiarity with SQL schemas and statements. Tools and Technologies: • Data Flow Solutions: Experience with Kafka, NiFi, AWS S3, and SQS. • Version Control and Build Tools: Proficiency with Maven and GitLab. • Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. • Configuration Files: Experience More ❯
Title: Java Developer with Angular Location: McLean, VA (Onsite) Skills: Java 17, Kafka, Spring Boot, Microservices, AWS, Angular JD details: 4+ years of Core Java development experience, familiar with Java 17 or higher. Able to deliver projects end to end. Experience with Java development and awareness of Java patterns. Experience using Spring Boot and microservices for development. Experience … of development projects and developed functionality using Java, J2EE. Must have AWS experience. Must have Kafka experience. AWS services (S3, SQS, SNS, ECS, Lambda, AWS CloudWatch, etc.) Experience with AWS-deployed applications. Agile/scrum development experience. Good collaboration and communication skills. Ready to learn new technologies and perform POCs on them. Ability to diagnose prod More ❯
and features Use big data technologies (e.g. Spark, Hadoop, HBase, Cassandra) to build large-scale machine learning pipelines Develop new systems on top of real-time streaming technologies (e.g. Kafka, Flink) 5+ years of software development experience 5+ years of experience in Java, Shell, and Python development Excellent knowledge of relational databases, SQL and ORM technologies (JPA2, Hibernate) is a plus Experience … in Cassandra, HBase, Flink, Spark or Kafka is a plus. Experience in the Spring Framework is a plus Experience with test-driven development is a plus Must be located in Ireland More ❯
give back to the open source community. Work as part of a distributed team of remote workers across timezones. What You'll Use: Go (Golang) Python AWS Postgres Elasticsearch Kafka Kubernetes Many external and internal APIs What You'll Need: Degree in Computer Science (or commensurate experience). Experience with Golang or another language for developing web backends and … web-services with data processing pipelines and the concepts required. Experience with relational and NoSQL databases (Redis, Postgres, Cassandra; Elasticsearch a plus). Understanding of messaging or queueing software, Kafka experience helpful but not required. Linux skills and experience with large-scale, business-critical Linux environments. Understanding of distributed systems and scalability challenges, particularly in cloud environments such as … architecture. Bonus Points: Authored and led successful open source libraries and projects. Contributions to the open source community (GitHub, Stack Overflow, blogging). Existing exposure to Go, AWS, Cassandra, Kafka, Elasticsearch Prior experience in the cybersecurity or intelligence fields. Bring your experience in distributed technologies and algorithms, your great API and systems design sensibilities, and your passion for writing More ❯
backbone of high-impact, secure, and scalable systems. Key Responsibilities: - Design modern cloud-native systems using AWS, microservices, and event-driven architecture. - Lead containerised deployments using Kubernetes and manage Kafka-based real-time data systems. - Ensure cloud environments follow best practices in security, compliance, and performance. - Mentor technical teams and drive strategic architectural decisions. - Optimise infrastructure through CI/… CD, Infrastructure as Code, and cloud cost management. Key Skills & Experience: - Strong AWS background with cloud security expertise (IAM, VPC, encryption, etc.) - Deep knowledge of Kubernetes, Docker, Kafka, and Java backend systems. - Experience with CI/CD, Terraform/CloudFormation, and automated deployments. - Familiarity with industry standards such as GDPR, HIPAA, and ISO 27001 is a plus. - Eligible for More ❯
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Ripjar
Ripjar specialises in the development of software and data products that help governments and organisations combat serious financial crime. Our technology is used to identify criminal activity such as money laundering and terrorist financing, and enables organisations to enforce sanctions More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Ripjar
Ripjar specialises in the development of software and data products that help governments and organisations combat serious financial crime. Our technology is used to identify criminal activity such as money laundering and terrorist financing, and enables organisations to enforce sanctions More ❯
Cambridge, Impington, Cambridgeshire, United Kingdom
SoCode Limited
Job Title: Head of Engineering Location: 2-3 days on site in Cambridge Salary: £130k + benefits (salary depending on experience, may be some flex for the right person). This is a rare opportunity to help shape the foundation More ❯
Who We Are: Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series C funding round brought our total investment to over More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Searchability NS&D
lead engineer, you’ll be hands-on in both build and support, working across security-first infrastructure and enabling robust DevSecOps practices. Essential Skills: Strong experience with AWS MSK (Kafka) and secure multi-account AWS environments Proven production experience with Kubernetes, Docker, Helm Proficient in Terraform, CI/CD Pipelines (Drone/GitLab) Excellent understanding of Kafka internals … stream processing, and secure Kafka deployments Strong experience across monitoring (Prometheus, Grafana, CloudWatch) Knowledge of security hardening, IAM, WAF, Shield, Vault Working knowledge of Agile, Infrastructure-as-Code, and DevSecOps practices UK*C or Enhanced DV (eDV) Clearance is a must To Be Considered: Please either apply through this advert or email me directly at: luke.parry@searchability.com. For further More ❯