Microservices. Ability to architect and design scalable and reliable Java microservices to meet business requirements. Desired Skills/Experience: IBM Digital Core Solution; experience with Kafka and Confluent; experience with Java Message Service, Core Java, Spring Core, Spring MVC, Spring Batch, Spring Boot, microservices, and AWS services. Security Clearance: Active
Docker, Kubernetes, or related container platforms. Experience with cloud-native software deployment, ideally on AWS or Kubernetes. Experience with message bus technologies such as Kafka or AMPS. Experience developing open-source or internal libraries integrated into applications by other teams. Experience working on high-throughput, mission-critical, high-performance
create scalable multi-region applications. Experience with the following technologies: AWS Serverless and serverless-supported languages; databases (e.g. PostgreSQL, CockroachDB); streaming technologies (Amazon Kinesis, Kafka); message queues (SQS, RabbitMQ, ActiveMQ). Experience working with agile development. Experience with containerization (Docker, Kubernetes, etc.). Experience with microservice and service-oriented architectures
NICE-TO-HAVE: Spring Batch. Strong programming skills in Core and Advanced Java and object-oriented design principles. Experience with message queues welcome (MQ; Kafka nice to have). Strong knowledge of designing and developing highly scalable, distributed data solutions and integrating with other middleware and UI applications/solutions
threaded programming, cloud computing, scientific computing and object-oriented design. Frameworks and libraries include Agile, Amazon Web Services (AWS), AWS Lambda, AWS SQS/Kafka, traditional web services, ActiveMQ, and the Microsoft Office suite. Operating systems include Linux and Windows/NT. Microservice architecture implementation. Familiarity with the complete SW engineering
time. Travel is usually to our San Antonio, TX customers. Preferred Requirements: A current Secret or TS/SCI is preferred. Experience with NiFi, Kafka, AWS infrastructure, and K8s. Experience in cloud-based technologies (AWS, Azure). Experience in distributed databases, NoSQL databases, full-text search engines (e.g.
including data analysis, extraction, transformation, and loading; data intelligence; data security; and proven experience in their technologies (e.g. Spark, cloud-based ETL services, Python, Kafka, SQL, Airflow). You have experience in assessing the relevant data quality issues based on data sources and use cases, and can integrate the relevant data
Java is the core skillset we look for, but you will also gain exposure to the following: Kotlin (Java), TypeScript, Angular, WebSockets, GCP, AWS, Kubernetes, Kafka, PostgreSQL, MongoDB, Elasticsearch, Redis, Snowflake, Databricks. What we're looking for: Senior Java Developers who are passionate about what they
C#/.NET in Azure Cloud. Tech Stack - one or more of these technical skills: C#/.NET, NodeJS, Web API, messaging (RabbitMQ or Kafka), Kubernetes/Docker, AWS/Azure, Git, Terraform, ORMs (e.g. Entity Framework/NHibernate), SQL, document DBs, REST, SOLID, TDD, Agile/SCRUM, Dev
software architectures & networking, or microservice architectures. Experience with observability tools like Grafana, Prometheus, OpenTelemetry and others. Experience with streaming architectures and tools (e.g. Kafka). About the Team: J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent corporations
teams with multiple levels of engineering. Strong experience in Java 17+, ideally with experience using Spring Boot or Micronaut. Hands-on knowledge of Kafka, event-driven architecture, enterprise integration patterns and microservices. Solid understanding of payment systems and PSP integrations, including fraud, tokenisation, and settlement flows. Experience with
cloud services, including Azure Functions, Azure Synapse pipelines and AWS cloud services. Strong understanding of modern data platforms and services (e.g., cloud data services, Kafka, Spark) and data modeling techniques. Proficient in software development with experience in at least one major programming language (e.g., Python, Java, etc.). Experience
Contribute to the Automated Test and Re-Test (ATRT) product suite. Develop software interfaces to communicate across network protocols such as REST, DDS, ActiveMQ, Kafka. Implement visualizations to support analytic results. Prepare, track, and deliver contractual deliverables. Minimum Required Qualifications: Bachelor's Degree in computer science, software engineering, computer
team, using Agile methodologies. Strong experience in developing with the following technologies: Java (version 11 and later), Spring, Spring Boot and Spring Security, JPA, Kafka Connect, Azure Cloud, REST APIs, the OData protocol, OAuth authentication, JUnit, Mockito, Karate, Cucumber, Docker, Kubernetes, Postman. Good experience in using Structured Query Language
microservices-purposed programming languages. Runtimes: experience in .NET Core and .NET-based development. Proficiency in Azure cloud services and frameworks. Developing event-driven applications using Kafka and ZooKeeper. Knowledge of cloud-native CI/CD for change release. Knowledge of Test-Driven Development. Azure full-stack developers need to be proficient
Linux scripting Experience with Elastic Cloud or other cloud-based Elasticsearch deployments. Experience with Elasticsearch security features. Experience with other data processing technologies (e.g., Kafka, Spark). Experience with containerization technologies (e.g., Docker, Kubernetes). Elastic certifications. Original Posting: April 29, 2025 For U.S. Positions: While subject to change
Colorado Springs, Colorado, United States Hybrid / WFH Options
HII Mission Technologies
to adapt to schedule changes as needed. Must be able to travel CONUS up to 25% of the time. Preferred Requirements: Experience with NiFi, Kafka, AWS infrastructure, and K8s. Experience in cloud-based technologies (AWS, Azure). Experience in distributed databases, NoSQL databases, full-text search engines (e.g.
Kubernetes. Hands-on experience with AWS services (EC2, RDS, ElastiCache, etc.). Experience with relational databases (e.g. PostgreSQL). Experience with messaging services (e.g. Kafka, ActiveMQ, Camel routes, etc.). Experience with remote debugging (JDWP). Experience working within Linux. Experience with Jira, Confluence, and Git. Desired Qualifications: Experience working
Colorado Springs, Colorado, United States Hybrid / WFH Options
Metronome LLC
to adapt to schedule changes as needed. Must be able to travel CONUS up to 25% of the time. Desired Skills: Experience with NiFi, Kafka, AWS infrastructure, and K8s. Experience in cloud-based technologies (AWS, Azure). Experience in distributed databases, NoSQL databases, full-text search engines (e.g.
and tools, including GitLab CI/CD and Jenkins Experience with distributed data and computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka Experience leading a team of AI and ML engineers, researchers, and data scientists to develop and deploy advanced AI and ML technologies in autonomous
experience with RESTful APIs - interconnected software component interaction, engineering and testing (e.g. NMS applications, controllers, orchestrators, supervisory systems, etc.). Experience with and understanding of the Kafka messaging bus. Experience using monitoring tools like Nagios, Grafana, Prometheus and Kibana is desired. Deployment environment: Kubernetes, Docker, microservices. Experience with Talos Kubernetes
Saltstack), CI/CD (GitLab, Jenkins, Git), monitoring and visualization (Prometheus, Grafana). Experience with big data technologies such as NoSQL/RDBMS, Redis, Elasticsearch, Kafka. Experience with containers and container management (Docker, Kubernetes). Experience analysing and building data telemetry, modelling, pipelines, UI visualization. Experience in developing software, troubleshooting, and
etc.), NoSQL databases (e.g. MongoDB, Neo4j, Redis, etc.), Spark or other distributed big data systems (e.g. Hadoop, Pig, Hive, etc.), stream-processing frameworks (e.g. Kafka), data pipeline orchestration tools (e.g. Airflow, Prefect, Dagster, etc.). Job Requirements: Bachelor's/Master's in Computer Science, a related field, or equivalent