solving skills; comfort working within a globally distributed team. A background in some of the following is a bonus: Java, Python, or Ruby experience; big data technologies (Spark, Trino, Kafka); financial markets experience; SQL (Postgres, Oracle); cloud-native deployments (AWS, Docker, Kubernetes); observability (Splunk, Prometheus, Grafana). For more information about DRW's processing activities and our use of job …
Python, and familiarity with ML frameworks like TensorFlow or PyTorch. You have exposure to cloud platforms (e.g., AWS, GCP), containerization (Docker, Kubernetes), and scalable data systems (e.g., Spark, Kafka). You are experienced in, or interested in, ML model-serving technologies (e.g., MLflow, TensorFlow Serving) and CI/CD tools (e.g., GitHub Actions). You understand ML algorithms and …
competencies to the proficiency level appropriate to the seniority of the role; Java programming experience as the core language, including Spring Boot; fluency in writing Java code; experience with Kafka or a similar platform; experience with Scala and Spark; integrating with backing services such as PostgreSQL, Redis, or S3; good engineering practices including continuous delivery, clean code, documentation, defensive …
Experience writing data pipelines using Spark or any other distributed real-time/batch processing framework. Strong skill set in SQL/databases. Strong understanding of messaging technologies like Kafka, Solace, MQ, etc. Writing production-scale applications that use caching technologies. Understanding of data virtualization. Production-management (L3 support) experience. Any beneficial/nice-to-have skills and …
cloud orchestration and containerisation technologies, such as Kubernetes and Docker; knowledge of large-scale databases and distributed technologies, such as RDS; knowledge of distributed event-streaming technologies, such as Kafka; writing/maintaining automation scripts using Bash/Shell or Python; experience of mentoring and leading the people around you and taking point on key decisions. Aker Systems …
low-latency applications, including advanced performance-tuning techniques and profiling to achieve microsecond-level response times. Distributed systems & middleware: hands-on experience working with various messaging middleware (e.g., Kafka, Solace) and TCP-based communication protocols for building resilient and scalable distributed systems. Linux & scripting acumen: strong working knowledge of the Linux operating system, including shell scripting and an …
contracts, gRPC, etc.) to co-design solutions with our other engineers. Working knowledge of microservices architecture. Working knowledge of distributed event-streaming platforms for high-performance data pipelines (e.g., Kafka). What's in it for you: 401(k) matching + competitive equity package; excellent medical, dental, and vision health benefits; fertility & family-forming benefits; flexible time off; lunch, snacks …
scripting, or equivalent. Hands-on experience with cloud platforms (e.g., AWS, Azure, or GCP), including services such as Lambda, S3, Azure Functions, or BigQuery. Experience with event-driven architectures (Kafka, Pub/Sub). Understanding of financial data domains such as transactions, risk, fraud, or customer 360. Familiarity with Power BI/Tableau for building governance dashboards. Proficient …
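The event-driven pattern this listing asks for can be illustrated with a minimal in-memory publish/subscribe sketch. This is not any employer's code, and the topic and handler names are hypothetical; a real system would use a broker client such as `confluent-kafka` or Google Pub/Sub, which this toy `EventBus` merely stands in for.

```python
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    """Minimal in-memory stand-in for a broker such as Kafka or Pub/Sub."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: Any) -> None:
        # Fan the event out to every handler registered for the topic.
        for handler in self._subscribers[topic]:
            handler(event)

# Usage: a hypothetical fraud-check consumer reacting to transaction events.
bus = EventBus()
flagged = []
bus.subscribe("transactions", lambda e: flagged.append(e) if e["amount"] > 10_000 else None)
bus.publish("transactions", {"id": 1, "amount": 250})
bus.publish("transactions", {"id": 2, "amount": 50_000})
print(flagged)  # [{'id': 2, 'amount': 50000}]
```

The key property, as with Kafka or Pub/Sub, is that producers and consumers only share a topic name, never a direct reference to each other.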
and successes to team members and Product Owners. Has experience of people management, or a desire to manage individuals on the team. Nice to have: experience with some of these: Kafka, Kinesis, Kinesis Analytics, Glue, Lambda, Kubernetes, ETL pipelines, BigQuery, Dataflow, BigTable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and …
Nice to haves: experience with Java; experience with the Play framework; experience with web frameworks or web development; experience with eCommerce; experience with event-driven architectures, preferably using RabbitMQ or Kafka; experience using production AWS infrastructure, ideally with Terraform. Additional information: PMI and cash-plan healthcare access with Bupa; subsidised counselling and coaching with Self Space; Cycle to Work …
or similar functional languages. Proven experience building and operating scalable distributed systems and services within AWS cloud infrastructure or similar technologies. Designed and implemented distributed, event-driven systems using Kafka Streams, AWS Kinesis, or similar. Optimised microservices for low-latency, high-throughput processing (1M+ TPS). Implemented auto-scaling, blue-green deployments, and canary releases, and built and maintained SLAs …
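The stream-processing skill above centres on windowed aggregation. A minimal sketch of a tumbling-window count, reduced to plain Python for illustration (a Kafka Streams topology would express the same idea as `groupByKey().windowedBy(...).count()`; the event data and window size here are hypothetical):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per key in fixed-size (tumbling) time windows.

    `events` is an iterable of (timestamp_ms, key) pairs. Each event is
    assigned to exactly one window based on its timestamp.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_ms)  # align timestamp to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Usage with hypothetical order/payment events (timestamps in milliseconds).
events = [(10, "orders"), (950, "orders"), (1010, "orders"), (1700, "payments")]
print(tumbling_window_counts(events, window_ms=1000))
# {(0, 'orders'): 2, (1000, 'orders'): 1, (1000, 'payments'): 1}
```

A production engine adds what this sketch omits: out-of-order arrival handling, state stores, and incremental emission rather than a single batch pass.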
at senior levels and from the highly technical to non-technical. Tech stack: M&S uses a variety of technologies across Fulfilment systems, including Java, Micronaut, GraphQL; ReactJS, Next.js; Kafka, MongoDB; Azure Cloud, Terraform, Dynatrace (observability). Everyone's welcome: we are ambitious about the future of retail. We're disrupting, innovating and leading the industry into a more conscientious …
Barnard Castle, County Durham, United Kingdom Hybrid / WFH Options
Staff Software Engineer - Commercial Engineering (Remote), 5 locations, full time, posted 30 days ago. Staff Engineer - PaaS Messaging/Kafka (Remote), 35 locations, full time, posted 4 days ago. End date: June 24, 2025 (2 days left …
comfortable making key decisions. Powerful collaborator who works well across departments. Our stack: AWS as our cloud compute platform; Kubernetes (EKS) for container runtime and orchestration; RDS (PostgreSQL, MySQL), Kafka, Redis; Terraform for infrastructure as code; Lambda and Step Functions; Datadog for observability; GitHub Actions for CI/CD. Frontend is React; backend services are developed in NodeJS (TypeScript). As we …
or similar service (Drone, Chef, etc.) • Storage and caching: Postgres, Cassandra, Redis, and Elasticsearch • Languages: Java, Groovy, Kotlin, JavaScript and Python • Frameworks: Spring Boot, Camel and Praxis • Data movement: Kafka, Benthos • Metrics visualization: Grafana. About you: • 4-year degree or equivalent experience • 5+ years of software development experience • Demonstrates strong domain-specific knowledge regarding Target's technology capabilities, and …
analytics domain. Understanding and/or certification in one or more of the following technologies: Kubernetes, Linux, Java and databases, Docker, Amazon Web Services (AWS), Azure, Google Cloud (GCP), Kafka, Redis, VMs, Lucene. Occasional travel may be required. Bonus points: certifications and specialization in Data Science, Data Analytics, Data Engineering, Machine Learning, NLP, Data Infrastructure, or analytics; deep understanding of Elasticsearch and Lucene, including the Elastic Certified Engineer certification; experience working closely with a pre-sales …
Engineer, you'll help re-architect our mission-critical Event Processing System (EPS), the backbone of parcel and letter tracking, into a scalable, event-driven platform using technologies like Kafka, containerised microservices, and multi-cloud infrastructure. What you'll do: design and deliver scalable, resilient backend systems; shape cloud-ready, service-aligned architecture; champion engineering best practices (TDD … Passion for clean, maintainable code and infrastructure as code. Excellent communication and mentoring skills. Bonus points for: Spring Boot, Go, or Node.js experience; cloud expertise (AWS, Azure, GCP); Kafka, RabbitMQ, or event-driven architecture; GitOps, Docker/Kubernetes, Terraform. Why join us? Be part of a multi-year transformation shaping the future of Royal Mail's digital platforms. …
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Stott and May
to London or Bristol. Pay: negotiable, INSIDE IR35. Responsibilities: - Design and implement robust ETL/ELT data pipelines using Apache Airflow - Build ingestion processes from internal systems and APIs, using Kafka, Spark, and AWS - Develop and maintain data lakes and warehouses (AWS S3, Redshift) - Ensure governance using automated testing tools - Collaborate with DevOps to manage CI/CD pipelines for data …
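The ETL/ELT responsibility above can be sketched as three plain-Python steps of the kind an Airflow task would wrap. This is an illustrative outline, not the role's actual pipeline; the record fields (`user_id`, `amount`) and the in-memory `sink` standing in for S3/Redshift are hypothetical.

```python
import json

def extract(raw_records):
    """Parse raw JSON lines pulled from an upstream API or Kafka topic."""
    return [json.loads(line) for line in raw_records]

def transform(records):
    """Normalise field names and drop records that fail validation."""
    cleaned = []
    for r in records:
        if "user_id" in r and r.get("amount", 0) >= 0:
            cleaned.append({"user_id": r["user_id"], "amount_gbp": r["amount"]})
    return cleaned

def load(records, sink):
    """Append validated records to the sink (stand-in for S3/Redshift)."""
    sink.extend(records)
    return len(records)

# Wiring the steps together, as an Airflow DAG would via task dependencies.
sink = []
raw = ['{"user_id": 1, "amount": 9.5}', '{"amount": -2}']
loaded = load(transform(extract(raw)), sink)
print(loaded, sink)  # 1 [{'user_id': 1, 'amount_gbp': 9.5}]
```

In Airflow each function would become a task (e.g. via the TaskFlow `@task` decorator), with the scheduler handling ordering, retries, and backfills instead of the direct function composition shown here.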