practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal and presentation skills. Desired Skills: Experience with containerization technologies (e.g. Docker, Kubernetes). Experience with data streaming platforms (e.g. Kafka, Kinesis). Experience with data visualization and business intelligence tools. Experience with Agile development methodologies. AWS Certifications (e.g. AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect). About Capgemini …
Preferred Qualifications: AWS Certification (Solutions Architect – Associate/Professional) is a plus. Experience with Infrastructure as Code (IaC) using Terraform or AWS CDK. Exposure to data streaming platforms like Kafka is a bonus. Background in performance optimization, scalability, and security best practices. Desired Skills: Knowledge of AI and emerging technologies (good to have). Understanding of how AI capabilities can …
Now Hiring: Senior Java Developer (Kafka) – 12-Month FTC (rolling, with years of work available) – up to £120k base. Location: London (on-site 4 days per week). Initial 12-month fixed-term contract, rolling with years of work available. A leading global trading firm is hiring a Senior Java Engineer with strong Kafka expertise to work on a … live with its first electronic venue and is now scaling fast across multiple markets and asset classes. What You’ll Be Working With: Java 17 & Spring Boot 3, Apache Kafka for high-volume data streams, Vertica & MSSQL, Kubernetes (on-prem orchestration), microservices architecture, containerised environments, Agile delivery with strong autonomy and ownership. Your Impact: Build and scale real-time … classes with complex financial products. Contribute to a system designed to run with minimal dev involvement post go-live. What We’re Looking For: Proven commercial experience with Apache Kafka in high-throughput systems. Deep Java expertise (Java 17 preferred). Strong understanding of microservices, distributed systems, and containers. Background in financial services or trading platforms is a major plus. …
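To give a concrete flavour of the high-throughput Kafka consumption this role centres on, below is a minimal sketch using the plain Apache Kafka Java client. The broker address, consumer group and topic name are hypothetical placeholders, not details taken from the listing.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class MarketDataConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder connection details, not taken from the listing.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "market-data-consumers");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Manual commits and a bounded batch size are typical choices for
        // high-volume streams where processing must keep pace with ingestion.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "500");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("trades"));  // hypothetical topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(250));
                for (ConsumerRecord<String, String> record : records) {
                    // Real processing would deserialize and route each event;
                    // here we only log the offset and key.
                    System.out.printf("offset=%d key=%s%n", record.offset(), record.key());
                }
                consumer.commitSync();  // commit only after the batch is processed
            }
        }
    }
}
```

Committing offsets only after a batch has been processed trades a little latency for at-least-once delivery, a common preference in trading and market-data pipelines.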
Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. Excellent consulting experience and ability to design and build solutions, actively contribute to RfP responses. Ability to be a SPOC for all technical … Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred Skills: Designing Databricks-based solutions for Azure/AWS, Jenkins, Terraform, Stackdriver or any other DevOps tools. Note: This is a senior-level …
South East London, England, United Kingdom Hybrid / WFH Options
Adalta
that satisfy the functional and non-functional requirements. Coordinate deployments with the Release Manager and monitor for successful integration. Key Skills: Python - Django - React - Docker - Microservices - Kubernetes - RabbitMQ - Apache Kafka - SQL - PostgreSQL - REST API - AWS - Performance Tuning. Apply directly for consideration and if chosen for the shortlist, a member of the team will be in touch to discuss.
Reading, England, United Kingdom Hybrid / WFH Options
Mastek
Eligible or active SC candidates may apply. Hybrid work model: Reading, UK base location with some travel to the client location. AWS services - IAM, WAF, SQS, BYOK, Helm, Kubernetes, Docker, Ingress, Kafka, Elasticsearch; Pega experience is a MUST. The DevOps engineer will be required to contribute to version-controlled configuration assets within a DevOps team’s existing fully automated continuous … tooling (Git/GitLab, Jenkins, Ansible, Terraform, Linux, AWS EC2, S3 and EKS are essential) with Squid Proxy, NGINX, and AWS services - IAM, WAF, SQS, BYOK, Helm, Kubernetes, Docker, Ingress, Kafka, Elasticsearch; Pega experience is a must. Key responsibilities include: Design and implement automated build and deployment solutions for Java-based microservice applications utilizing Atlassian Jira/GitLab/Jenkins/… Experience with Enterprise Jenkins to create reusable pipelines across projects. DevOps Tools: GitLab, Jenkins, Ansible, Terraform, JMeter, Squid Proxy, NGINX, AWS services - IAM, WAF, SQS, BYOK, Helm, Kubernetes, Docker, Ingress, Kafka, Elasticsearch; Pega experience is a MUST. Understanding of the Linux Operating System, standard network protocols and security hardening. Proven experience using AWS Cloud Solutions and services such as …
with GraphQL or gRPC. Exposure to monitoring/logging tools (e.g., CloudWatch, ELK, Prometheus). Knowledge of security best practices in API and cloud development. Familiarity with data streaming using Kafka or Kinesis.
Experience with Microservice Architecture (Docker, Kubernetes, Helm, Spring Boot, Java, Python). Strong proficiency in AWS cloud services, including EC2, Lambda, S3, RDS, API Gateway, and IAM. Experience working with Kafka for event-driven and streaming architectures. Hands-on experience with API development, RESTful principles, and API management platforms. Proven track record of delivering projects using agile methodologies (Scrum, Kanban …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Experience with Microservice Architecture (Docker, Kubernetes, Helm, Spring Boot, Java, Python). Strong proficiency in AWS cloud services, including EC2, Lambda, S3, RDS, API Gateway, and IAM. Experience working with Kafka. Hands-on experience with API development, RESTful principles, and API management platforms. Proven track record of delivering projects using agile methodologies (Scrum, Kanban, etc.). Benefits: Annual Bonus …
South East London, England, United Kingdom Hybrid / WFH Options
Halian
Terraform, CloudFormation), and CI/CD pipelines. Experience with relational and NoSQL databases (e.g., SQL Server, PostgreSQL, DynamoDB). Familiarity with event-driven architecture and messaging systems (e.g., SNS, SQS, Kafka). Proven track record working in FinTech or financial services environments. Nice to Have: Frontend experience with modern frameworks (e.g., React, Angular). Familiarity with containerization (Docker) and orchestration (Kubernetes, ECS …
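As an illustration of the event-driven messaging pattern several of these roles mention, here is a minimal Kafka producer sketch in Java. The topic, key and payload are hypothetical; the same publish-then-wait-for-acknowledgement shape also applies conceptually when the broker is SNS or SQS rather than Kafka.

```java
import java.util.Properties;
import java.util.concurrent.Future;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

public class PaymentEventPublisher {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder broker address; real deployments read this from configuration.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all plus idempotence trades a little latency for the delivery
        // guarantees usually expected of financial event streams.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical topic, key and JSON payload, purely for illustration.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("payment-events", "payment-123", "{\"status\":\"SETTLED\"}");
            Future<RecordMetadata> ack = producer.send(record);
            RecordMetadata meta = ack.get();  // block until the broker acknowledges the write
            System.out.printf("written to %s-%d at offset %d%n",
                meta.topic(), meta.partition(), meta.offset());
        }
    }
}
```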
Software development practices such as TDD, OOP, SOLID principles, Git workflows and build pipelines. Microservices, event-based systems and distributed architectures. Tools like Kubernetes, Docker and message queues (e.g., Kafka, RabbitMQ). Databases such as MongoDB, SQL, Cassandra and Elasticsearch. Excited? Curious about us and ready to introduce yourself? Click the apply button! Do you have any questions? Feel …
productivity, with proven experience in elevating the same at scale. Demonstrated recent hands-on engineering experience with core GL&B platform technologies such as Go, Java, JavaScript, Docker, K8s, Kafka, Elastic, Couchbase, and/or Postgres, and RESTful, event-driven, and microservices technologies for large-scale environments. Ability to roll up sleeves and be hands-on; while at the …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Rightmove PLC
systems and third-party applications using modern integration platforms (e.g., MuleSoft, Boomi, Azure Logic Apps). Developing features and improvements to our internal applications using Java, Spring Boot, Elasticsearch, Kafka, Gradle, Hibernate, Couchbase, SQL, Docker. Collaborating with internal contacts and the team to find solutions to complex problems. Contributing to key decisions, including which features to include during team …
In-depth experience with container orchestration and runtime environments, including Kubernetes and Docker; strong understanding of microservices architecture, relational/non-relational databases, and middleware components such as Apache Kafka and NGINX. Demonstrated expertise in deploying and integrating diverse open-source and cloud-native technologies across distributed systems. Qualifications: Bachelor's degree or higher in Computer Science, Software Engineering …
growing business. Requirements: Strong experience with Python (2+ years). Experience working with REST microservices. Strong SQL skills. Experience handling very large data sets. Knowledge of big data tools (Spark, Kafka, etc.). Experience in finance (preferred). Strong formal education, ideally in Computer Science. If interested, please apply ASAP! Interviews are happening immediately!
Collaborate with Technical Leads, Architects and business stakeholders. Guide delivery of software engineering and platform capability. Experience: Experience of leading enterprise-level data platforms using Azure, AWS, Snowflake and Kafka. Experience of end-to-end data integration patterns and enterprise-grade data modelling. Strong understanding of SDLC principles. Current or previous coding experience in Python, Java, R or similar.