following: Delivering integration microservice patterns using Java Spring Boot; Applying SOLID principles and creating clean code; Experience working with streaming data (e.g. Apache Kafka) and AWS native messaging/streaming features; API specification/design and documentation; Working with SQL and NoSQL data sources such as Postgres and more »
Kafka Specialist working with a Financial Services Company for an initial 6-month contract; this will be paying up to £1,000 per day (inside IR35). This is a hybrid position, 2-3 days per week in the London office. Job Summary We are looking for a Kafka Specialist with good experience of the Confluent Kafka platform. This position will be responsible for building and maintaining a high-performance Confluent Kafka platform both on-prem (Linux) and in the cloud (AWS). Main Responsibilities Maintain the critical Confluent Kafka platform - e.g. infrastructure upgrades, vulnerability fixes/patches. Build and maintain … with DevOps teams to implement and manage Kafka deployments using Harness or other similar deployment tools. Required Experience In-depth experience with Apache Kafka and the Confluent Platform. In-depth experience with Kafka hosted on-prem - Linux. In-depth experience with Kafka hosted off-prem - AWS cloud. more »
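To give a flavour of the platform-maintenance work this role describes, a minimal broker configuration fragment for an on-prem (KRaft-mode) Kafka node might look like the following. This is a hypothetical illustration only; the hostnames, paths and values are invented examples, not settings from the role or a recommended production setup:

```properties
# Hypothetical on-prem broker settings (KRaft mode); example values only
process.roles=broker
node.id=1
controller.quorum.voters=1@controller-1:9093
listeners=PLAINTEXT://broker-1:9092
log.dirs=/var/lib/kafka/data

# Durability settings of the kind typical for a business-critical platform
default.replication.factor=3
min.insync.replicas=2
log.retention.hours=168
```

Upgrades and vulnerability patches on such a platform are typically applied as rolling restarts, one broker at a time, so that replication (per the factors above) keeps partitions available throughout.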
with minimal supervision Desirable Requirements: Experience with: CI/CD pipelines setup and config Document-based/No-SQL databases The ELK stack Apache Kafka Spring Security and OAuth2 flows React React Native Keycloak Camunda OpenAPI/Swagger Nginx setup and config The company offers fantastic benefits including: Hybrid more »
Java components using JUnit or similar testing tools. Collaborate with the existing development team to ensure seamless integration with our current technology stack, including Kafka or RabbitMQ for messaging. Implement and maintain database solutions with PostgreSQL or other relational databases, and Redis. Utilize Docker Compose for defining and running more »
desirable skills include: Experience deploying, securing and supporting cloud infrastructure platforms Understanding of security frameworks/standards Understanding of data streaming and messaging frameworks (Kafka, Spark, etc.) and modern database technologies (CockroachDB, etc.) Understanding of distributed tracing and monitoring (Zipkin, OpenTracing, Prometheus, ELK stack, Micrometer metrics, etc.) Experience with more »
/Neo4j/Elastic, Google Cloud Datastore. * BigQuery and Data Studio/Looker. * Snowflake Data Warehouse/Platform * Streaming technologies and processing engines, Kinesis, Kafka, Pub/Sub and Spark Streaming. * Experience of working with CI/CD technologies, Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc. * Experience and knowledge more »
Senior Java/Kafka Engineer – Java 11/17, Kafka, Spring Boot, Microservices, Messaging, Hedge Fund - 12 month Fixed Term Contract A Senior Java Developer with strong knowledge of the Kafka platform and in-depth experience of real-time data streaming is sought after by a leading … a critical technology modernisation programme. You will join a small team whose focus will be on the implementation of both modern microservice architecture and Kafka for a new trade feed system. As a Senior Engineer you will help drive the design and setup of this new system, acting as … an SME within the team for Kafka (working across the full product suite covering KTables, KSQL, Kafka Connect, KStreams, etc.). You will also be involved in helping them migrate to their greenfield Docker/Kubernetes container platform, mentoring and upskilling Junior team members as well as liaising with the more »
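The KTable concept this role centres on can be illustrated without a broker: a changelog stream of keyed events materialises into a table holding the latest value per key, which is exactly how a trade feed's current state is derived from its update stream. The sketch below is a hypothetical, dependency-free illustration of that idea (the class, record and field names are invented, not from the role description, and real Kafka Streams code would use `StreamsBuilder` instead):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Broker-free sketch of the KTable idea: a changelog of keyed trade
// updates materialises into a table of the latest status per trade.
public class TradeFeedSketch {
    record TradeEvent(String tradeId, String status) {}

    static Map<String, String> materialise(List<TradeEvent> changelog) {
        Map<String, String> table = new LinkedHashMap<>();
        for (TradeEvent e : changelog) {
            table.put(e.tradeId(), e.status()); // later events overwrite earlier ones
        }
        return table;
    }

    public static void main(String[] args) {
        List<TradeEvent> feed = List.of(
            new TradeEvent("T1", "NEW"),
            new TradeEvent("T2", "NEW"),
            new TradeEvent("T1", "FILLED"));
        System.out.println(materialise(feed)); // prints {T1=FILLED, T2=NEW}
    }
}
```

In Kafka Streams proper, the same latest-value-per-key semantics is what you get when consuming a topic as a KTable rather than a KStream, backed by a compacted changelog topic.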
datasets, and transitioning towards event-driven architecture Utilizing Java 17+ with Spring/Spring Boot 3+, SQL/Oracle Effective experience with Redis, Apache Kafka, Flink or similar Striving for serverless solutions utilizing Linux, virtualization, containers, Docker, Kubernetes, potentially in production environments, but at least for testing purposes Production more »
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, BigQuery, Azure ML is a plus. Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB Experience with Agile, DevOps methodologies Awareness and more »
with minimal supervision Desirable Requirements: Experience with: CI/CD pipelines setup and config Document-based/No-SQL databases The ELK stack Apache Kafka Spring Security and OAuth2 flows React React Native Keycloak Camunda OpenAPI/Swagger Nginx setup and config If this role would be of interest more »
Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands on coding experience, such more »
the ELK stack. Set up and manage the CI/CD pipeline using BitBucket, Maven, Terraform, Jenkins, Ansible/Packer, and Kustomize. Work with Kafka and SQS for queuing solutions and implement scheduling using Jenkins/Ansible. Use a combination of Cucumber, JUnit, Selenium, and Postman for comprehensive testing. Qualifications more »
Cloud Platform. Strong problem-solving skills and ability to think critically. Excellent communication and leadership skills. Deep knowledge and use of integration technologies e.g. Kafka, RabbitMQ, ZeroMQ, etc. Accountability and the ability to work effectively both independently and collaboratively in a fast-paced environment. Understanding of Unix-family systems more »
Your Profile Key skills/knowledge/experience: in building complex integrations using Node.js, .NET, MS SQL, Java, Camel, Spring Boot/Spring Cloud, IIS, Kafka, the Elastic Stack, OCP and K8s. and skills in server-level programming around APIs/microservices/memory management/messaging/event sourcing. on more »
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Distributed Processing: Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. more »
with Python (2+ Years) Experience working with REST Microservices Strong SQL Experience working with very large data sets Knowledge of big data tools (Spark, Kafka, etc.) Experience working in finance (Preferred) Strong formal education - ideally in Computer Science If this sounds of interest, then please do not hesitate to more »
central London office - the 2 days you work are totally up to you! Essential Skills: Java (ideally 8 or 11) Experience within Finance Blockchain Kafka Kubernetes CI/CD and relevant tooling (Jenkins) TDD Docker Desirable: Kotlin Blockchain technologies Benefits: Salary up to £130,000 pa Private medical Discretionary more »
on a hybrid basis (twice a week), this can be flexible and negotiable if required. Tech skills needed: Java 17+ (this is essential) AWS Kafka Spring Boot Docker/Kubernetes DevOps & CI/CD pipelines Microservices TDD approach API gateways/Kong The role would be perfect for someone who more »
Scripting experience (Python, Perl, Bash, etc.) ELK (Elastic stack) JavaScript Cypress Linux experience Search engine technology (e.g., Elasticsearch) Big Data Technology experience (Hadoop, Spark, Kafka, etc.) Microservice and cloud native architecture Desirable Skills Able to demonstrate experience of troubleshooting and diagnosis of technical issues. Able to demonstrate excellent team more »
a focus on Spring Boot Expertise in cloud development (ideally AWS) Knowledge and ideally hands-on experience with data streaming, event-based architectures and Kafka Strong communication and interpersonal skills Experience with Apache Spark or Apache Flink would be ideal, but not essential Please note, this role is unable more »
formats like Delta Lake, Apache Iceberg or Apache Hudi. - Experience of developing and managing real-time data streaming pipelines using change data capture (CDC), Kafka and Apache Spark. - Experience with SQL and database management systems such as Oracle, MySQL or PostgreSQL. - Strong understanding of data governance, data quality, data more »
their London office on a hybrid basis, twice a week ideally but this can be flexible. Tech skills needed: Java 17+ Spring Boot AWS (Lambda) Kafka Microservices Sequence diagrams Docker/Kubernetes Architectural design experience Interviews for this role will be taking place ASAP so if you like the sound more »