of open-source technology and multi-cloud solutions. Our vision is to become the trusted Data & AI Platform for everyone, leveraging the most popular open-source technologies like Apache Kafka, Aiven for PostgreSQL, Aiven for ClickHouse, and Aiven for OpenSearch to help companies accelerate time-to-market, drive efficiency, and build innovative solutions across any cloud. About this Team
across trading support systems, infrastructure, and execution tools, with strong autonomy and end-to-end ownership. What They're Looking For Key Skills: Strong Java experience (Java 17 preferred) Kafka - Power User level only (critical requirement) Microservices experience is a nice-to-have Background & Experience: Deep, production-grade Kafka experience is non-negotiable 6-8+ years of … On-site technical deep dive (2-2.5 hours) with the wider team Feedback and offer turnaround expected within days If you're a seasoned Java developer with serious Kafka experience and want to work on a genuinely interesting greenfield project at a high-performance trading firm, this is well worth a conversation. Apply here or reach out directly.
up to date (state of the art), whether we're in the cloud deploying over 1,000 microservices into AWS, Azure & GCP or streaming billions of messages on Kafka and building event-based solutions. This is where you can broaden your technical knowledge and help solve complex problems while using your Agile skills to develop our long-lived …
teams Expert-level proficiency in Python and hands-on experience with at least one LLM-based framework (LangChain, LangGraph, LangSmith, LlamaIndex, Qdrant, etc.) Strong experience with asynchronous queues (e.g., Kafka, RabbitMQ) and asynchronous APIs Deep understanding of cloud infrastructure (AWS, GCP) and experience deploying and managing applications at scale. Strong understanding of data lake architectures, including experience with data …
/Kubernetes). Experience working in environments with AI/ML components or interest in learning data workflows for ML applications. Bonus if you have exposure to Kafka, Spark, or Flink. Experience with data compliance regulations (GDPR). What you can expect from us: Opportunity for annual bonuses Medical Insurance Cycle to work scheme Work from …
with Data visualisation techniques and tools Familiarity with structured and unstructured storage of Data Use of Code repositories, familiarity with branching strategies, pull requests and merge processes. MongoDB, Apache Kafka, Infrastructure as Code (e.g. Terraform, Ansible) Responsibilities Design and build software using industry best practice Collaborate with stakeholders and other engineers Contribute to the completion of milestones associated with …
how to create scalable multi-region applications Experience with the following technologies: AWS Serverless and serverless-supported languages; databases (e.g. PostgreSQL, CockroachDB); streaming technologies (Amazon Kinesis, Kafka); message queues (SQS, RabbitMQ, ActiveMQ) Experience working within an agile development environment Experience with containerization (Docker, Kubernetes, etc.) Experience with microservice and service-oriented architectures Experience with Core …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Caesars Entertainment
teams to deliver customer-centric products that are easy to use and customers love. What You Will Need: 4+ years of development experience with micro-service architectures (Java, Spring, Kafka, AWS, RabbitMQ, SQL, Docker, Python) Experience as a software engineer working within a high-growth product tech environment Proficiency in Java and familiarity with the Spring Framework (Spring Boot, specifically) … Experience utilizing AWS (Amazon Web Services) and building/maintaining APIs Experience with OLTP systems, Kafka and NRT messaging would be beneficial Willingness to collaborate and mentor junior team members. Familiarity with Agile software development methodologies. Strong problem-solving mindset with the ability to work in a fast-paced environment Enthusiasm for learning new technologies and developments in technology
ETL, data warehousing, and database platforms (e.g. SQL Server, PostgreSQL). Strong communicator with experience working across technical and business teams. Desirable: Experience with big data tools (e.g. Spark, Kafka) and BI tools (e.g. Power BI). Relevant certifications (e.g. Azure Data Engineer, CDMP). If you are interested, please email your CV to for immediate consideration
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java To be Considered: Please either apply by clicking online or emailing me … SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE
Central London, London, United Kingdom Hybrid / WFH Options
AWD Online
CD pipelines across Azure API architecture Strong experience designing, implementing, and securing RESTful and/or gRPC APIs Real-time data systems Skilled in message streaming with platforms like Kafka, Azure Event Hubs, or RabbitMQ Leadership in fast-paced environments Proven track record as a tech lead in high-growth or startup environments, with the ability to build, mentor …
may have a bit of flexibility on this and very flexible hours. Tech Stack: Java 17 Spring Boot AWS Microservices ML/Big Data (desirable) Docker/Kubernetes Messaging - Kafka Responsibilities: Design and development of low-latency software components Contribute across the entire SDLC Write and enhance well-structured, testable, scalable and efficient code Set up and maintain development process …
providing input in defining test strategy, and participating in sprint refinement sessions. Ideal Experience Experience with Microservice Architecture (Docker, Kubernetes, Helm, Spring Boot, Java, Python). Experience working with Kafka and Domain-Driven Design. Experience with designing low-latency, reliable (fault-tolerant), scalable, secure, and cost-effective solutions on a cloud-native platform. Experience with Relational and non-relational databases … services. Ability to read and review code, paired programming, and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue Jobs, AWS Lambda, and Step Functions. Experience with designing bespoke & tailored front-end solutions (GUI-based) using open-source technology …
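For illustration only (not part of the listing above): the Kafka-backed microservices these roles describe revolve around a consume-and-process loop. A minimal sketch of that loop is shown below in TypeScript using the kafkajs client; the roles themselves are Java/Spring, but the pattern is the same, and the broker address, topic and consumer group names here are hypothetical placeholders.

```typescript
import { Kafka } from "kafkajs";

// Hypothetical broker, topic and consumer group, purely for illustration.
const kafka = new Kafka({
  clientId: "order-service",
  brokers: ["localhost:9092"],
});

const consumer = kafka.consumer({ groupId: "order-processors" });

async function main(): Promise<void> {
  await consumer.connect();
  await consumer.subscribe({ topic: "orders", fromBeginning: false });

  // Long-running consume loop: a real service would deserialise each message
  // and hand it to domain logic; here we simply log key and value.
  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(
        `${topic}[${partition}] key=${message.key?.toString()} value=${message.value?.toString()}`
      );
    },
  });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```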
of open-source technology and multi-cloud solutions. Our vision is to become the trusted Data & AI Platform for everyone, leveraging the most popular open-source technologies like Apache Kafka, Aiven for PostgreSQL, Aiven for ClickHouse, and Aiven for OpenSearch to help companies accelerate time-to-market, drive efficiency, and build innovative solutions across any cloud. Who we are
junior developers and analysts. Key Skills and Experience Required Extensive Core Java experience with strong knowledge of data structures, design patterns, and SOLID principles. Experience with messaging systems like Kafka and Solace. Proficiency in the Spring framework and cloud technologies (Docker/Kubernetes/OpenShift). Familiarity with Jira, Bitbucket, and Gradle. Experience with document databases (MongoDB). Experience developing …
serverless architectures (Lambda, API Gateway, etc.) Kubernetes and container orchestration PostgreSQL and other relational databases Spark and distributed data processing TypeScript and Java programming languages Infrastructure as Code (Terraform) Kafka and message queuing systems Git/GitHub, including GitHub Actions for CI/CD About Chainalysis Blockchain technology is powering a growing wave of innovation. Businesses and governments around …
/CD pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new …
in developing production applications. Data Engineering & Streaming: Familiarity with data streaming tools and Elastic to ensure high performance in data-driven environments. Nice to Have: Neo4j Data Streaming (e.g. Kafka) NVIDIA Triton Octopus Deploy LangSmith Elastic About Definely Named in the top 25 of the prestigious Deloitte UK Technology Fast 50 in 2023, as well as the EMEA Technology Fast …
/Kubernetes). Experience working in environments with AI/ML components or interest in learning data workflows for ML applications. Bonus if you have exposure to Kafka, Spark, or Flink. Experience with data compliance regulations (GDPR). What you can expect from us: Salary 65-75k Opportunity for annual bonuses Medical Insurance Cycle to …
Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting skills in Python. Knowledge of Event-Driven Architectures (Kafka). Collaborate with DevOps teams to implement CI/CD pipelines and infrastructure as code using tools like Terraform, CloudFormation, and Ansible. Implement and manage monitoring and observability tools …
Looking For Core Skills Containers & Orchestration: Strong expertise in container security and Kubernetes (multi-cluster/global deployment is a plus). Distributed Systems & Messaging: Knowledge of clusters, storage, Kafka, Aeron, and experience with multicast or HPC. Automation & IaC: Proficiency in Python, Golang, or Rust with experience in IaC tools and immutable infrastructure. Continuous Delivery & Config Management: Familiarity with …
teamwork, high-quality software that drives impact, we run our own services. Our current technical stack includes: backend services are containerised and provisioned into ECS clusters, service communication via Kafka (Confluent Cloud), infrastructure automation with Pulumi (TypeScript), our infrastructure is hosted at AWS (most used: ECS, S3, DynamoDB, Aurora, OpenSearch), GitHub Actions for builds and workflow automation, DataDog for …
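As a rough illustration of what infrastructure automation with Pulumi (TypeScript) on AWS can look like for a stack like the one described above: the sketch below is not taken from the listing, the resource names are made up, and a real stack would also define networking, IAM, and ECS task/service definitions.

```typescript
import * as aws from "@pulumi/aws";

// Hypothetical resource names for illustration; a production stack would also
// declare VPC networking, IAM roles, ECS task definitions and services.
const cluster = new aws.ecs.Cluster("backend-cluster");

const artifactBucket = new aws.s3.Bucket("artifact-bucket", {
  versioning: { enabled: true },
});

const eventsTable = new aws.dynamodb.Table("events", {
  attributes: [{ name: "id", type: "S" }],
  hashKey: "id",
  billingMode: "PAY_PER_REQUEST",
});

// Stack outputs that CI (e.g. GitHub Actions) or other stacks can consume.
export const clusterArn = cluster.arn;
export const bucketName = artifactBucket.id;
export const tableName = eventsTable.name;
```

Running `pulumi up` against a program like this provisions or updates the declared resources, which is what keeps an ECS/S3/DynamoDB estate reproducible from code rather than hand-built consoles.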