professional experience in data engineering roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go, or Scala. Demonstrable, effective use of AI tooling More ❯
Uses database migration patterns such as "expand and contract" with go-migrate. Writes observable and testable code using libraries such as testify and mockgen. Publishes and consumes Avro-formatted Kafka messages. CI/CD with GitHub Actions. Trunk-Based Development & Continuous Delivery. Soft skills: Good collaboration skills at all levels with cross-functional teams. Highly developed ownership and creative thinking More ❯
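The "expand and contract" pattern mentioned here splits a breaking schema change into backwards-compatible steps, each shipped as its own go-migrate SQL migration. A minimal sketch of what those migration files might contain (the table and column names are hypothetical, purely for illustration):

```sql
-- 000001_expand_add_full_name.up.sql
-- Expand: add the new column alongside the old ones, so existing code keeps working.
ALTER TABLE users ADD COLUMN full_name text;

-- Backfill existing rows once application code writes to both shapes.
UPDATE users SET full_name = first_name || ' ' || last_name
WHERE full_name IS NULL;

-- 000002_contract_drop_old_names.up.sql
-- Contract: once no deployed code reads the old columns, drop them.
ALTER TABLE users DROP COLUMN first_name;
ALTER TABLE users DROP COLUMN last_name;
```

In practice each step runs as a separate migration and deploy, so readers and writers are never broken mid-rollout.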
London, England, United Kingdom Hybrid / WFH Options
Prima
an Agile environment. Nice-to-Have: Good knowledge of functional programming languages. Professional experience with at least one of Rust or Elixir. Knowledge of TDD. Knowledge of RabbitMQ/Kafka. Why you’ll love it here: We want to make Prima a happy and empowering place to work. So if you decide to join us, you can expect plenty More ❯
specifically AWS - enterprise software implementations. Experience with Agile software development. Experience with big data applications. Experience or interest in building AI capabilities for Operations. Experience with data technologies like Kafka, Elastic, Spark, NiFi. Familiarity with Kubernetes deployment, Agile methodologies, and tools. Experience with IBM Netcool products. Experience with IBM watsonx products. Experience with LLM products and capabilities. Experience More ❯
London, England, United Kingdom Hybrid / WFH Options
ZILO™
days each week. Key Responsibilities: Design and develop highly scalable and reliable services in Go. Collaborate with cross-functional teams to design, develop, and test software solutions. Integrate and implement Kafka with Go services. Leverage the corporate AI assistant and other strategic coding tools to enhance development workflows. Actively use AI tools to support code generation, debugging, documentation More ❯
Birmingham, England, United Kingdom Hybrid / WFH Options
QAD
Uses database migration patterns such as “expand and contract” with go-migrate. Writes observable and testable code using libraries such as testify and mockgen. Publishes and consumes Avro-formatted Kafka messages. CI/CD with GitHub Actions. Trunk-Based Development & Continuous Delivery. Soft Skills: Good collaboration skills at all levels with cross-functional teams. Highly developed ownership and creative thinking More ❯
experience of SDLC methodologies, e.g. Agile, Waterfall. Skilled in business requirements analysis with the ability to translate business information into technical specifications. Skills Required (desirable): Knowledge of streaming services – Flink, Kafka. Knowledge of Dimensional Modelling. Knowledge of NoSQL databases (DynamoDB, Cassandra). Knowledge of node-based architecture, graph databases and query languages – Neptune, Neo4j, Gremlin, Cypher. Experience: 5+ years of More ❯
Create and maintain Forms, Reports, Views, Workflows, Groups and Roles. Create, maintain and enhance Dashboards and reporting, including scheduled reports. Create and configure tool integrations with ServiceNow (Elastic, Netcool, Kafka, etc.). Coordinate and support application and platform upgrades. Assist with design, creation and cataloging of business process flows. Work with other members of the ServiceNow development team to propose More ❯
London, England, United Kingdom Hybrid / WFH Options
Satalia (NPComplete)
experience as a software engineer. Prefer a functional programming style where appropriate. Aware of the advantages of relational and document-based databases, including the CAP theorem. Know how queues such as Kafka, SQS and NSQ work, and their pros/cons. Prefer TDD and BDD. Knowledge and understanding of how integrated systems are deployed and managed. Be a Unix philosophy advocate who prefers working More ❯
focus. Effective communicator in both written and verbal mediums. Beneficial Skills & Qualifications Prior experience working on an electronic trading platform, e.g. reference data, market data & FIX. Knowledge of Spring, Kafka, SQL and/or Linux. Prior experience designing and implementing distributed systems modelling complex workflows. Prior experience in the financial industry. Understanding of common data structures and optimisations regarding More ❯
3+ years of experience in cloud architecture and implementation - Bachelor's degree in Computer Science, Engineering, a related field, or equivalent experience - Experience with databases (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) - Experience in consulting, design and implementation of serverless distributed solutions - Experience in software development with an object-oriented language PREFERRED QUALIFICATIONS - AWS experience preferred, with proficiency in a wide More ❯
/ELK/Jaeger/OpenTelemetry/Service Meshes etc.). Experience with configuration management tools (Ansible/Puppet/Kapitan/Terraform). Experience with distributed data platforms (Kafka/Flink/Airflow). Comfortable using cloud-native and containerisation technologies (Kubernetes/Docker). Good Linux systems knowledge (experience with RHEL desirable). Broad knowledge across network More ❯
/Computer Science preferred but not required); or equivalent experience required. Deep proficiency in Python, SQL, and Cloud Platforms (AWS, GCP, Azure). Data Warehousing (Snowflake), Orchestration (Airflow, Rundeck), Streaming (Kafka). Continuous engagement with Data Science and Analytics colleagues to understand requirements for our data assets and empower them with the best possible data, to create high-value analytical services. Ownership More ❯
and technologies: React and TypeScript for our frontend; Jest for tests; SwiftUI for our Driver iOS App; Python for our backend code; Postgres for data storage; Redis for caching; Kafka for stream processing; AWS, Terraform, GitLab CI/CD, Docker and ECS to deploy and run our services; Flutter for our on-board server running Android, which handles concession More ❯
FX, and commodities. Proven experience in designing and implementing enterprise-grade architectures. Familiarity with microservices, containerized solutions like Kubernetes/Docker, and event-driven architecture and messaging systems (e.g., Kafka, RabbitMQ). Experience with cloud platforms and services, particularly in high-frequency trading and data processing. Experience in building scalable, resilient, and secure cloud-based solutions. Ability to work More ❯
chargers, calculating ETAs, monitoring traffic and keeping passengers informed. We rely on the following tools and technologies: Python for our backend code; Postgres for data storage; Redis for caching; Kafka for stream processing; React for our frontend; Clickhouse for analytics; SwiftUI for our Driver iOS App; AWS, Terraform, GitLab CI/CD, Docker and ECS to deploy and run More ❯
functional central projects. Strong background in cloud computing and microservices architecture, preferably with Google Cloud (GCP). Solid understanding of message brokers, event-driven architectures, and asynchronous communication (e.g., Kafka, Pub/Sub, RabbitMQ). Experience designing and documenting APIs, data models, and system integrations using OpenAPI 3.0. Ability to analyze business requirements and translate them into scalable, AI More ❯
London, England, United Kingdom Hybrid / WFH Options
Tide Platform Limited
technical decisions. OUR TECH STACK Our teams build loosely coupled microservices, enabling fast, autonomous development. Our architecture prioritizes scalability, security, and resilience, ensuring seamless financial transactions. We use: Messaging: Kafka, AWS SQS/SNS Frontend: Flutter (Mobile), Angular and React (Web) WHAT YOU’LL GET IN RETURN Our location-specific employee benefits are designed to cater to the unique More ❯
Kubernetes, Docker, Jenkins, Terraform, git, Ansible, AWS cloud technologies such as Lambdas, EC2, EKS and CloudFront. We deliver and support common, modern applications in our environments such as RabbitMQ, Kafka, MariaDB/MySQL, Redis as well as many others. Job Duties Develop standard application stacks using Infrastructure as Code (IaC) to address common use cases. Build on the prototype More ❯
experience with deployment, configuration, and troubleshooting in live production systems. Experience with Messaging Systems: You have experience with distributed systems that use some form of messaging system (e.g. RabbitMQ, Kafka, Pulsar). The role focuses on RabbitMQ, and you will have time to acquire deep knowledge of it. Programming Proficiency: You have some proficiency in at least More ❯
London, England, United Kingdom Hybrid / WFH Options
Odeko, Inc
scalable, mission-critical production applications Experience with later-stage startup environments (Series D - E) Experience in both technology and food & beverage industries is highly desirable Familiarity with Federated GraphQL, Kafka, Kubernetes, Docker, microservices, Netsuite, and Vue Previous experience in fast-paced, innovative settings A passion for continuous learning and professional development What We Offer Private Medical Cover Excellent Work More ❯
strengthen an application: Passion for transportation or sustainable technologies. Deeper experience with parts of our stack, e.g. Go, TypeScript, React. Terraform or other Infrastructure as Code tooling. Exposure to Kafka, event-driven architectures, or message queues. Familiarity with HashiCorp Vault or other secrets management tooling. Deeper knowledge of CI/CD pipelines. Experience in a start-up or scale More ❯
stores/Databases, Application/API design, Internet and Networking Protocols, Security Architecture, Version Control Systems, and CI/CD tools. Deep experience with several of the following platforms: Kafka, Flink, Spark, Kubernetes, API Gateways, and messaging systems (Tibco, IBM MQ, and open-source variants); Deep familiarity with data streaming, integration patterns, and how they can be best utilized More ❯