with minimal supervision Desirable Requirements: Experience with: CI/CD pipelines setup and config Document-based/NoSQL databases The ELK stack Apache Kafka Spring Security and OAuth2 flows React React Native Keycloak Camunda OpenAPI/Swagger Nginx setup and config The company offers fantastic benefits, including: Hybrid more »
Java components using JUnit or similar testing tools. Collaborate with the existing development team to ensure seamless integration with our current technology stack, including Kafka or RabbitMQ for messaging. Implement and maintain database solutions with PostgreSQL or other relational databases, and Redis. Utilize Docker Compose for defining and running more »
Senior Java/Kafka Engineer – Java 11/17, Kafka, Spring Boot, Microservices, Messaging, Hedge Fund - 12 month Fixed Term Contract A Senior Java Developer with strong knowledge of the Kafka platform and in-depth experience of real-time data streaming is sought after by a leading … a critical technology modernisation programme. You will join a small team whose focus will be on the implementation of both modern microservice architecture and Kafka for a new trade feed system. As a Senior Engineer you will help drive the design and setup of this new system, acting as … an SME within the team for Kafka (working across the full product suite covering KTables, ksqlDB, Kafka Connect, Kafka Streams etc.). You will also be involved in helping them migrate to their greenfield Docker/Kubernetes container platform, mentoring and upskilling junior team members as well as liaising with the more »
datasets, and transitioning towards event-driven architecture Utilizing Java 17+ with Spring/Spring Boot 3+, SQL/Oracle Effective experience with Redis, Apache Kafka, Flink or similar Striving for serverless solutions utilizing Linux, virtualization, containers, Docker, Kubernetes, potentially in production environments, but at least for testing purposes Production more »
with minimal supervision Desirable Requirements: Experience with: CI/CD pipelines setup and config Document-based/NoSQL databases The ELK stack Apache Kafka Spring Security and OAuth2 flows React React Native Keycloak Camunda OpenAPI/Swagger Nginx setup and config If this role would be of interest more »
Django, Docker Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands-on coding experience, such more »
the ELK stack. Set up and manage the CI/CD pipeline using BitBucket, Maven, Terraform, Jenkins, Ansible/Packer, and Kustomize. Work with Kafka, SQS for queuing solutions and implement scheduling using Jenkins/Ansible. Use a combination of Cucumber, JUnit, Selenium, and Postman for comprehensive testing. Qualifications more »
Cloud Platform. Strong problem-solving skills and ability to think critically. Excellent communication and leadership skills. Deep knowledge and use of integration technologies, e.g. Kafka, RabbitMQ, ZeroMQ. Accountability and the ability to work effectively both independently and collaboratively in a fast-paced environment. Understanding of Unix-family systems more »
Your Profile Key skills/knowledge/experience: in building complex integrations using Node.js, .NET, MS SQL, Java, Camel, Spring Boot/Spring Cloud, IIS, Kafka, Elastic Stack, OCP and K8s, and skills in server-level programming around APIs/microservices/memory management/messaging/event sourcing. on more »
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Distributed Processing: Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. more »
with Python (2+ years) Experience working with REST microservices Strong SQL Experience working with very large data sets Knowledge of big data tools (Spark, Kafka etc.) Experience working in finance (preferred) Strong formal education - ideally in Computer Science If this sounds of interest, then please do not hesitate to more »
central London office - the 2 days you work are totally up to you! Essential Skills: Java (ideally 8 or 11) Experience within Finance Blockchain Kafka Kubernetes CI/CD and relevant tooling (Jenkins) TDD Docker Desirable: Kotlin Blockchain technologies Benefits: Salary up to £130,000 pa Private medical Discretionary more »
on a hybrid basis (twice a week), this can be flexible and negotiable if required. Tech skills needed: Java 17+ (this is essential) AWS Kafka Spring Boot Docker/Kubernetes DevOps & CI/CD pipelines Microservices TDD approach API gateways/Kong The role would be perfect for someone who more »
a focus on Spring Boot Expertise in cloud development (ideally AWS) Knowledge and ideally hands-on experience with data streaming, event-based architectures and Kafka Strong communication and interpersonal skills Experience with Apache Spark or Apache Flink would be ideal, but not essential Please note, this role is unable more »
formats like Delta Lake, Apache Iceberg or Apache Hudi. - Experience of developing and managing real-time data streaming pipelines using change data capture (CDC), Kafka and Apache Spark. - Experience with SQL and database management systems such as Oracle, MySQL or PostgreSQL. - Strong understanding of data governance, data quality, data more »
moderate-to-complex data flows as part of a development team in collaboration with others. You’ll be confident using technologies such as: Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data, document formats more »
their London office on a hybrid basis, twice a week ideally but this can be flexible. Tech skills needed: Java 17+ Spring Boot AWS (Lambda) Kafka Microservices Sequence diagrams Docker/Kubernetes Architectural design experience Interviews for this role will be taking place ASAP so if you like the sound more »
sees payments. They're seeking a skilled Java Engineer to help create meaningful solutions using modern Java-related technologies such as Spring Boot, Microservices, Kafka, Kubernetes, Docker, all while harnessing the power of AWS. Key Responsibilities: Develop and maintain Java applications with Spring Boot and microservices. Collaborate with cross more »
City of London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard Ltd
for you. They're seeking a skilled Java Engineer to help create meaningful solutions using modern Java-related technologies such as Spring Boot, Microservices, Kafka, Kubernetes, Docker, all while harnessing the power of AWS. Key Responsibilities: Develop and maintain Java applications with Spring Boot and microservices. Collaborate with cross more »
key technologies are C# (.NET Core) applications hosted using Kubernetes on AWS. We also use a number of other key technologies including Python, RabbitMQ, Kafka, Elasticsearch and Concourse for CI/CD. As we work in an informal, agile environment we are also looking for candidates who are genuinely more »
and software design Travel up to 10% What Will Help You On The Job Familiarity with running software services at scale AWS Infrastructure, Airflow, Kafka and data streaming using Spark/Scala Understanding of networking fundamentals (OSI layers 2-7) Technical and software engineering background in the areas of more »
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
Scala and Spark. Experience with data visualization tools. Familiarity with cloud platforms (GCP preferred, but AWS or Azure also acceptable). Knowledge of Kubernetes, Kafka, and other relevant technologies. Previous work on data-driven projects. Why this is a great contract: Opportunity to work for one of the UK's more »
. In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs/Kafka, Dataflow/Airflow/ADF etc. Excellent ETL skills, data modeling skills Ability to define the monitoring, alerting, deployment strategies for various services. Good more »