Python, and familiarity with ML frameworks like TensorFlow or PyTorch. You have exposure to cloud platforms (e.g., AWS, GCP), containerization (Docker, Kubernetes), and scalable data systems (e.g., Spark, Kafka). You are experienced or interested in ML model serving technologies (e.g., MLflow, TensorFlow Serving) and CI/CD tools (e.g., GitHub Actions). You understand ML algorithms and …
cloud orchestration and containerisation technologies, such as Kubernetes and Docker. Knowledge of large-scale databases and distributed technologies, such as RDS. Knowledge of distributed event streaming technologies, such as Kafka. Write/maintain automation scripts using Bash/Shell or Python scripting languages. Experience of mentoring and leading people around you and taking point on key decisions. Aker Systems …
Experience writing data pipelines using Spark or other distributed real-time/batch processing frameworks. Strong skill set in SQL/databases. Strong understanding of messaging technologies like Kafka, Solace, MQ, etc. Writing production-scale applications that use caching technologies. Understanding of data virtualization. Production management (L3 support) experience. Any beneficial/nice-to-have skills and …
low-latency applications, including advanced performance tuning techniques and profiling to achieve microsecond-level response times. Distributed Systems & Middleware: Hands-on experience working with various messaging middleware (e.g., Kafka, Solace) and TCP-based communication protocols for building resilient and scalable distributed systems. Linux & Scripting Acumen: Strong working knowledge of the Linux operating system, including shell scripting and an …
contracts, gRPC, etc.) to co-design solutions with our other engineers. Working knowledge of microservices architecture. Working knowledge of distributed event streaming platforms for high-performance data pipelines (e.g., Kafka). What's in it for you: 401k matching + competitive equity package; excellent Medical, Dental and Vision health benefits; Fertility & Family Forming Benefits; flexible time off; lunch, snacks …
scripting, or equivalent. Hands-on experience with cloud platforms (e.g., AWS, Azure, or GCP) including services such as Lambda, S3, Azure Functions, or BigQuery. Experience with event-driven architectures (Kafka, Pub/Sub). Understanding of financial data domains such as transactions, risk, fraud, or customer 360. Familiar with Power BI/Tableau for building governance dashboards. Proficient …
and successes to team members and Product Owners. Has experience of people management or a desire to manage individuals on the team. Nice to Have: Experience with some of these: Kafka, Kinesis, Kinesis Analytics, Glue, Lambda, Kubernetes, ETL pipelines, BigQuery, Dataflow, Bigtable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and …
Nice to haves: Experience with Java; experience with the Play framework; experience with web frameworks or web development; experience with eCommerce; experience with event-driven architectures, preferably using RabbitMQ or Kafka; experience using production AWS infrastructure, ideally with Terraform. Additional Information: PMI and cash plan healthcare access with Bupa; subsidised counselling and coaching with Self Space; Cycle to Work …
or similar functional languages. Proven experience building and operating scalable distributed systems and services within AWS cloud infrastructure or similar technologies. Designed and implemented distributed, event-driven systems using Kafka Streams, AWS Kinesis, or similar. Optimised microservices for low-latency and high-throughput processing (1M+ TPS). Implemented auto-scaling, blue-green deployments, and canary releases, and built and maintained SLAs …
at senior levels and from the highly technical to non-technical. Tech Stack: M&S uses a variety of technologies across Fulfilment systems, including Java, Micronaut, GraphQL, ReactJS, Next.js, Kafka, MongoDB, Azure Cloud, Terraform, and Dynatrace (observability). Everyone's welcome: We are ambitious about the future of retail. We're disrupting, innovating and leading the industry into a more conscientious …
Barnard Castle, County Durham, United Kingdom (Hybrid/WFH options)
Staff Software Engineer - Commercial Engineering (Remote); Staff Engineer - PaaS Messaging/Kafka (Remote) - full-time, remote …
comfortable making key decisions. Powerful collaborator who works well across departments. Our stack: AWS as our cloud compute platform; Kubernetes (EKS) for container runtime and orchestration; RDS (PostgreSQL, MySQL), Kafka, and Redis; Terraform for infrastructure as code; Lambda and Step Functions; Datadog for observability; GitHub Actions for CI/CD. The frontend is React and backend services are developed in NodeJS (TypeScript). As we …
analytics domain. Understanding and/or certification in one or more of the following technologies: Kubernetes, Linux, Java and databases, Docker, Amazon Web Services (AWS), Azure, Google Cloud (GCP), Kafka, Redis, VMs, Lucene. Occasional travel may be required. Bonus Points: Deep understanding of Elasticsearch and Lucene, including Elastic Certified Engineer certification; experience working closely with a pre-sales …
analytics domain. Understanding and/or certification in one or more of the following technologies: Kubernetes, Linux, Java and databases, Docker, Amazon Web Services (AWS), Azure, Google Cloud (GCP), Kafka, Redis, VMs, Lucene. Occasional travel may be required. Bonus Points: Certifications and specialization in Data Science, Data Analytics, Data Engineering, Machine Learning, NLP, Data Infrastructure, analytics; deep understanding …
Engineer, you'll help re-architect our mission-critical Event Processing System (EPS) - the backbone of parcel and letter tracking - into a scalable, event-driven platform using technologies like Kafka, containerised microservices, and multi-cloud infrastructure. What You'll Do: Design and deliver scalable, resilient backend systems. Shape cloud-ready, service-aligned architecture. Champion engineering best practices (TDD … Passion for clean, maintainable code and infrastructure as code. Excellent communication and mentoring skills. Bonus Points For: Spring Boot, Go, or Node.js experience; cloud expertise (AWS, Azure, GCP); Kafka, RabbitMQ, or event-driven architecture; GitOps, Docker/Kubernetes, Terraform. Why Join Us? Be part of a multi-year transformation shaping the future of Royal Mail's digital platforms. …
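As a rough illustration of the kind of event-driven consumer such a platform relies on, here is a minimal sketch assuming the kafka-python package and a broker on localhost; the topic name, consumer group, and handler are hypothetical placeholders, not the actual EPS design.

```python
# Minimal Kafka consumer sketch for an event-driven tracking service.
# Assumes the kafka-python package and a broker on localhost:9092;
# topic name, group id, and event handling are illustrative only.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "parcel-tracking-events",              # hypothetical topic name
    bootstrap_servers="localhost:9092",
    group_id="eps-tracking-consumer",      # hypothetical consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

def handle_event(event: dict) -> None:
    # Placeholder for real processing (persist state, emit downstream events, etc.)
    print(f"parcel={event.get('parcel_id')} status={event.get('status')}")

for message in consumer:
    handle_event(message.value)
```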
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Stott and May
to London or Bristol. Pay: negotiable, inside IR35. Responsibilities: design and implement robust ETL/ELT data pipelines using Apache Airflow; build ingestion processes from internal systems and APIs using Kafka, Spark, and AWS; develop and maintain data lakes and warehouses (AWS S3, Redshift); ensure governance using automated testing tools; collaborate with DevOps to manage CI/CD pipelines for data …
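A minimal sketch of the kind of Airflow ETL pipeline described in those responsibilities, assuming Airflow 2.x with the TaskFlow API; the DAG id, task logic, and bucket name are hypothetical placeholders rather than the client's actual pipeline.

```python
# Minimal Airflow DAG sketch: extract -> transform -> load to a data lake.
# Assumes Airflow 2.x; DAG id, schedule, and bucket name are hypothetical.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: in practice this might read from an internal API or a Kafka topic.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder transformation; a real pipeline might hand this off to Spark.
        return [{**r, "amount_gbp": round(r["amount"], 2)} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load step; a real task might write Parquet to S3/Redshift.
        print(f"would write {len(rows)} rows to s3://example-data-lake/orders/")

    load(transform(extract()))

example_etl_pipeline()
```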
stored procedures and patterns, preferably in SQL Server. Snowflake database experience can be valuable and would help the team in the data migration process. Knowledge of Apache Flink or Kafka, or similar technologies (e.g. Apache Spark), highly desirable. Skills in C# WPF or JavaScript GUI development beneficial, but not essential. Excellent communication skills. Mathematical. Finance industry experience, with an …
compute, and storage services. Programming Prowess: Strong programming skills in Python and SQL are essential. Big Data Ecosystem Expertise: Hands-on experience with big data technologies like Apache Spark, Kafka, and data orchestration tools such as Apache Airflow or Prefect. ML Data Acumen: Solid understanding of data requirements for machine learning models, including feature engineering, data validation, and dataset …
5+ years of experience building cloud-native applications with AWS. Familiarity with Spring and Apache libraries and other large open-source libraries. Experience with complex technology stacks. Experience with Kafka and real-time messaging systems. Minimum of 2 years' experience with Apache Flink; deep understanding is desired. Expert SQL/database query experience required. Financial services experience is desired. …
Architect or similar senior role. Strong consulting background with full lifecycle delivery experience. Ability to lead architectural strategy and oversee implementation. Architectural Expertise: Deep knowledge of event-driven architectures (Kafka, Confluent). Experience with data lakes/lakehouses (Databricks, Unity Catalog). Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure …
implements long-term improvements. Exposure to financial services or regulated environments is advantageous but not essential. Docker/Snowpark container services experience. Streaming and related technologies such as Apache Kafka and Azure Event Hubs. Integration with Collibra. Proficiency in Python. Insight Investment is committed to being an inclusive employer and encourages applications from all suitably qualified applicants irrespective of background …