modern front-end workflows Solid knowledge of Docker (Kubernetes experience a plus) Experience shipping SaaS platforms end-to-end Strong communication and debugging skills Bonus if you've worked with: Kafka, GraphQL, GitHub Actions, Jenkins, or Golang. Why Valent? Mission-led: Build tools that protect global information integrity. High impact: Direct involvement in a vital AI system used by governments
the Role: Moderate experience in virtualization technologies like VMware, OpenStack, Xen Knowledge of containerization (basic level) Exposure to cloud platforms such as AWS, GCP, Azure, or OPC Adept at Kafka, Cassandra, Apache Spark, HDFS Monitoring experience with Prometheus, Nagios, Icinga Experience with logging tools like Splunk or ELK stack Handling configuration management with Ansible, Terraform, Chef, or Puppet Comprehensive
Hounslow, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
you’ll bring Solid commercial experience with Node.js and TypeScript on large-scale microservices architectures Extensive work on extremely high-traffic, distributed systems Good knowledge of Kubernetes and either Kafka or RabbitMQ Deep understanding of AWS and related tools Strong Computer Science fundamentals Happy working fully remote in a B2B capacity -- We make an active choice to be inclusive
customer segmentation, and real-time personalization. Hands-on experience with agile product development methodologies. Excellent communication and stakeholder management skills. Knowledge of modern data tools (e.g., Snowflake, Databricks, dbt, Kafka). Understanding of machine learning workflows and personalization engines. Product certifications (e.g., SAFe, Pragmatic, CSPO). Key Success Metrics: Consistent development rollouts of the Horizon CDP platform Increased
high-quality data solutions aligned with company goals. Requirements: 5+ years of hands-on experience in data engineering, including expertise in Python, Scala, or Java. Deep understanding of Apache Kafka for stream processing workflows (required) Proficiency in managing and optimizing databases such as PostgreSQL, MySQL, MSSQL. Familiarity with analytical databases. Familiarity with both cloud solutions (AWS preferably) and on … the implementation of scalable, efficient data pipelines and architectures, with a strong focus on stream processing. Develop and maintain robust data storage and processing solutions, leveraging tools like Apache Kafka, Redis, and ClickHouse. Guide the migration of selected cloud-based solutions to on-premises tools, optimizing costs while maintaining performance and reliability. Collaborate with stakeholders to gather requirements, propose
Orchestration: Deep knowledge of Kubernetes and container security. Experience in managing global or multi-cluster deployments is highly valued. Distributed Systems & Messaging: Strong familiarity with clustered environments, distributed storage, Kafka, Aeron, and protocols like multicast or other high-performance computing technologies. Automation & IaC: Proficient in Python, Golang, or Rust with a solid track record using IaC tools and building
data management (privacy, consent, encryption) Experience working with customer data platforms such as Salesforce or similar Excellent communication and stakeholder engagement skills Exposure to big data tools (Hadoop, Spark, Kafka) Knowledge of integrating ML models and AI into data platforms Industry certifications (e.g. CDMP, AWS, Azure) Experience with data visualisation tools (Power BI, Tableau, Looker) This role offers: A
and messaging standards: ISO 20022, SWIFT MT/MX, SEPA, RTGS. Experience with Temenos Transact (T24) data structures, APIs, and accounting framework. Proficient in working with web services (REST/SOAP), message queues (e.g. Kafka, MQ), and middleware. Experience with SQL, Oracle databases, Linux scripting, and monitoring tools