and understand system interfaces and data flows across our enterprise. Design and maintain clear, business-aligned data models. Build and support integrations using AWS services (Lambda, Step Functions, S3, MSK/Kinesis, API Gateway). Ensure data transformations are well documented, traceable, and consistent. Collaborate with product owners, architects, and business stakeholders to clarify requirements and deliver solutions. Troubleshoot …
engineers, architects, and DevOps teams to deliver robust streaming solutions. Must-have skills: Extensive hands-on experience with Apache Kafka (any distribution: open source, Confluent, Cloudera, AWS MSK, etc.). Strong proficiency in Java, Python, or Scala. Solid understanding of event-driven architecture and data streaming patterns. Experience deploying Kafka on cloud platforms such as AWS, GCP, or …
with DevOps tools such as Jenkins, Bitbucket, Nexus, Git, Jira, etc. Experience working with a broad range of AWS services, including API Gateway, Lambda, ECS, Elastic Load Balancers, EC2, MSK, and RDS (Oracle preferred). Experience with virtual server hosting (EC2), container management (Kubernetes, ECS, EKS), and Linux OS administration. Solid understanding of cloud networking concepts, including VPCs, subnets, peering, and firewalls …/GitLab, and Git workflows. Experience working in controlled environments, such as banking or financial services. Hands-on experience with Docker and at least one container orchestration platform (Amazon ECS/EKS or Kubernetes). Relevant AWS certifications (e.g., AWS Certified DevOps Engineer, AWS Certified Solutions Architect). Key Responsibilities: Design, deploy, and maintain cloud infrastructure on AWS using …