implementation of a scalable, secure, reliable, low-latency trading infrastructure using AWS cloud infrastructure. MUST HAVE: Hands-on experience designing and implementing AWS cloud solutions at scale, including Kafka or similar data streaming components. Technical Implementation and Configuration: Hands-on implementation, configuration and management of infrastructure components (networking, cloud services, internal/external…
modular Kubernetes-centric platform, with Pulumi, Terraform, and Argo. Implementing service mesh and configuration management for microservices. Operating critical infrastructure like Apache Pulsar or Kafka and Keycloak. Developing a multi-cloud approach supporting Azure, Alibaba, and GCP. Implementing collection, dashboards, and alerts for logs and metrics. Enabling developers to…
Python and SQL is essential. Developing low-latency infrastructure using AWS cloud infrastructure. Hands-on experience in designing and implementing AWS cloud solutions, e.g. Kafka or similar data streaming components. Experience with AWS, VPN, VPC Peering, EC2, S3, Lambda, Aurora, Docker/Kubernetes in implementation and configuration is highly…
Scala Developer to work on a hybrid basis. You will have a blend of the following skills: RESTful microservices in Scala (Typelevel stack, Kafka, Kubernetes, GCP, AWS). Good working knowledge of Akka (HTTP, Streams). Experience working in an Agile environment, Pair Programming and TDD in…
Database, Data Lake Storage, Databricks, CosmosDB). Understanding of data modelling and governance principles. Experience with Python, Databricks, and ETL tools. Familiarity with Spark, Kafka, and Azure service integration. Skills in Power BI or Azure Data Explorer for visualisation. Knowledge of data security and compliance. Strong analytical and communication…
in Leeds. Salary: up to £95,000. Full Stack Software Engineer Responsibilities: Develop web applications and systems using TypeScript, React, Go, NodeJS, Docker, AWS & Kafka. Support continuous integration and deployment pipelines. Work closely with developers and architects to build scalable and secure systems. Ensure security, quality, and reliability through…
Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
the definition of Big Data architecture with different tools and environments: Cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience in Data Engineering and data quality tools (Informatica, Talend, etc.) Previous involvement in working in a multilanguage and multicultural environment…
areas (breadth) and deep technical expertise in some of the below technologies: Data integration – ETL tools like Talend and Informatica. Ingestion mechanisms like Flume & Kafka. Data modelling – Dimensional & transactional modelling using RDBMS, NoSQL and Big Data technologies. Experience in Snowflake modelling would be an advantage. Data visualization – Tools…
Proficiency in cloud platforms (AWS, GCP, Azure) and containerization (Docker) is a game-changer. Familiarity with caching technologies (Redis) and messaging/streaming tech (Kafka, RabbitMQ) will set you apart. Why You Belong Here: Join a fast-growing FinTech company with a vibrant start-up culture and global reach.
Knowledge and understanding of OTC products (Interest Rate Swaps, Variance Swaps, CDS, etc.) bookings. Familiarity with C++ and Big Data tools such as Spark, Kafka, Elastic. Join us and be part of a team that values innovation, collaboration, and excellence. Take your career to new heights with a leading…
tasks, orchestration tools like Airflow. Understanding of cloud security principles, encryption techniques, and compliance standards (e.g., GDPR, HIPAA, SOC 2). Good to have: Kafka for building real-time data streaming pipelines and event-driven architectures. CI/CD tools such as Azure DevOps or Jenkins. Agile methodologies such…
My Client is looking to expand their growing Development team, who collaborate to design and develop new user stories into production. Located within Fintech, My Client is emerging as a service and cultural leader within their industry. They are looking for…
Experience: Strong proficiency in Java with a focus on Spring Boot. Experience with relational databases and SQL. Worked in an event-based environment utilising Kafka (or RabbitMQ or ActiveMQ). Cloud development (ideally AWS) experience. Please note, this role is unable to provide sponsorship. If this role sounds of interest…
Senior Software Engineer or Technical Leads 💻 NodeJS, Kafka, Cassandra, React, AWS, Microservices 🏠 Hybrid - Zone 1 office 💵 £90k - £120k base salary, plus equity and annual bonus Evoke is working with a publicly traded technology unicorn business in the b2b SaaS space who are looking to rapidly grow the number of…
based services and enhance efficiency. 🔹 Good communication skills in English. 🔹 Experience with model management tools (e.g., mlflow) and MLOps. Nice-to-Haves: 🔹 Experience with Kafka and Vector Databases. 🔹 Experience in deploying large language models (LLMs) at scale. If you are interested in this position, feel free to send over…
priorities. Preferred Qualifications: Experience with financial trading systems or related domains. Knowledge of distributed computing and networking concepts. Familiarity with messaging frameworks such as Kafka or RabbitMQ. Experience with cloud platforms such as AWS or Azure. Please apply for immediate consideration.
management API gateways * Knowledge of microservice architecture and principles * Ability to design database integration solutions, supporting connectivity, integrity and security. Tech Skills: * Red Hat * Kafka * OpenShift * Services * AWS * Kubernetes. The role would be perfect for someone who loves taking their ideas into production, is committed to best practices and…
You'll also get exposure to Python, lots of SQL (of course) and, depending on your level of experience, data stream processing tools like Kafka, Spark, etc. As these companies continue to build new platforms and modernise, you'll also be exposed to the cloud and various other modern…
experience working in an AWS environment, utilising tooling such as Kubernetes and Docker. Any experience in the following would be advantageous but not essential: Kafka, Cassandra (data storage), Envoy Proxy (RPC). This role is a truly unique opportunity for a Backend Engineer to join an organisation renowned for…
in testing and build/deploy concepts. Familiarity with CI/CD pipelines using tools like Jenkins, GitHub Actions, Chef, Ansible, or Nexus. Familiarity with Kafka. Austin Fraser is committed to being an equal opportunities employer, and encourages applications from candidates regardless of sex, race, disability, age, sexual orientation, gender…