data governance processes. Experience with version control systems (e.g., Git). Experience with Agile development methodologies. Excellent communication, interpersonal, and problem-solving skills. Experience with streaming data technologies (e.g., Kafka, Azure Event Hubs). Experience with data visualisation tools (e.g., Tableau, Power BI). Experience with DevOps tools and practices (e.g., Azure DevOps, Jenkins, Docker, Kubernetes). Experience working …
and problem-solving skills, with attention to detail. Excellent communication and collaboration skills. Nice to Have: Experience with containerized environments (Docker/Kubernetes). Exposure to other messaging platforms (Kafka, RabbitMQ, MQ). Understanding of DevOps tools and CI/CD pipelines. Knowledge of cloud environments (AWS, Azure, GCP) and cloud-native Solace deployments. Why Join Us? Join a …
Slough, South East England, United Kingdom Hybrid / WFH Options
Client Server
expertise with GCP including BigQuery, Pub/Sub, Cloud Composer and IAM You have strong Python, SQL and PySpark skills You have experience with real-time data streaming using Kafka or Spark You have a good knowledge of Data Lakes, Data Warehousing, Data Modelling You're familiar with DevOps principles, containerisation and CI/CD tools such as Jenkins …
also enable teams to ship fast. Nice to Have Experience delivering API services (FastAPI, SpringBoot or similar). Experience with message brokers and real-time data and event processing (Kafka, Pulsar, or similar). Why Join Us You'll be part of a small, high-output team where intensity and focus are the norm. You'll own the infrastructure …
Slough, South East England, United Kingdom Hybrid / WFH Options
Axiom Recruit
5+ years’ software engineering experience, ideally in fintech, banking, or payments. Strong backend skills with Node.js (NestJS a plus), TypeScript, SQL, and distributed systems. Experience with event-driven architecture (Kafka, Google Pub/Sub) and modern CI/CD pipelines. Frontend development expertise with Angular or similar frameworks. At least a working understanding of, and interest in, crypto markets, with commercial …
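The event-driven architecture this role names can be illustrated with a minimal in-process publish/subscribe sketch. This is an assumption-laden toy, not how Kafka or Google Pub/Sub work internally: the `EventBus` class, topic name, and payment payload are all invented for illustration, and delivery here is synchronous where a real broker persists events and delivers them asynchronously.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Toy in-process event bus standing in for a broker such as Kafka."""

    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # A real broker would persist the event and deliver asynchronously;
        # synchronous delivery keeps the sketch self-contained.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
ledger: list[dict] = []
bus.subscribe("payments.received", ledger.append)  # hypothetical topic name
bus.publish("payments.received", {"id": "tx-1", "amount_pence": 2500})
```

The decoupling shown here (producers never reference consumers directly) is the property that lets event-driven systems add new downstream services without touching upstream code.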
and delays of even milliseconds can have big consequences. Essential skills: 3+ years of experience in Python development. 3+ years with open-source real-time data feeds (Amazon Kinesis, Apache Kafka, Apache Pulsar or Redpanda). Exposure to building and managing data pipelines in production. Experience integrating serverless functions (AWS, Azure or GCP). Passion for fintech and building products that make …
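The latency-sensitive consumer loop described above can be sketched without a live broker. In this illustration a generator stands in for the real-time feed (Kinesis, Kafka, Pulsar or Redpanda); the tick shape, symbol, and window size are assumptions made up for the example.

```python
import time
from collections import deque

def price_feed():
    """Stand-in for a real-time market data feed (hypothetical ticks)."""
    for price in [100.0, 100.5, 99.8, 101.2, 100.9]:
        yield {"symbol": "BTC-USD", "price": price, "ts": time.time()}

def rolling_mean(feed, window: int = 3):
    """Emit a rolling mean per tick; cheap per-message work keeps latency low."""
    buf: deque = deque(maxlen=window)
    for tick in feed:
        buf.append(tick["price"])
        yield sum(buf) / len(buf)

means = list(rolling_mean(price_feed()))
```

In a production pipeline the per-message work inside the loop is what you budget in microseconds; anything slow (I/O, allocation-heavy transforms) gets moved off the hot path.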
ETL (extract, transform, load) processes and data warehousing. 3. Strong understanding of SQL for data querying and validation. 4. Knowledge of big data technologies such as Hadoop, Spark, or Kafka is a plus. 5. Familiarity with scripting languages like Python, Java, or shell scripting. 6. Excellent analytical and problem-solving skills with a keen attention to detail. 7. Ability …
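The ETL-plus-SQL-validation combination in the requirements above can be sketched end to end with the standard library. The table name, columns, and sample rows are invented for illustration; a real pipeline would extract from source systems and load a warehouse rather than an in-memory SQLite database.

```python
import sqlite3

# Extract: raw rows as they might arrive from a source system (hypothetical).
raw_rows = [("alice", "UK", "42"), ("BOB", "us", "17"), ("carol", "uk", "")]

def transform(rows):
    """Normalise casing, coerce types, drop rows with missing age."""
    for name, country, age in rows:
        if age:
            yield (name.title(), country.upper(), int(age))

# Load into an in-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, country TEXT, age INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", transform(raw_rows))

# Validate the load with SQL: row count and a simple aggregate check.
loaded = conn.execute("SELECT COUNT(*), MAX(age) FROM customers").fetchone()
```

The validation query is the point: after every load, cheap SQL assertions (counts, ranges, null checks) catch silent data loss before downstream consumers do.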
collaborating with cross-functional teams Strong background and experience in Data Ingestion, Transformation, Modeling and Performance tuning. Should have experience in designing and developing dashboards Strong knowledge in Hadoop, Kafka, SQL/NoSQL Should have experience in creating a roadmap to improve platform Observability Experience in leading mid-scale teams with strong communication skills Experience in Machine Learning and GCP …
Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion. Experience : Minimum 10+ Years Strong Knowledge in Hadoop, Kafka, SQL/NoSQL Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems Should be able to work independently with minimal help/guidance Good …
pipelines. Define non-functional requirements and performance acceptance criteria. Analyze performance test results and production metrics to identify trends, risks, and improvements. Work with modern technologies including Docker, Kubernetes, Kafka, and NoSQL databases. Mentor junior testers and drive continuous improvement in performance testing practices. What we’re looking for: 5+ years’ experience in non-functional/performance testing for …
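The analysis step named above (turning raw performance test results into acceptance criteria) usually reduces to percentile checks on latency samples. A minimal dependency-free sketch follows; the sample values, SLA thresholds, and nearest-rank method are illustrative assumptions, not a prescription.

```python
def percentile(samples: list, p: float) -> float:
    """Nearest-rank percentile on sorted data (simple, dependency-free)."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

# Hypothetical latency samples in milliseconds from a load-test run.
latencies_ms = [12.0, 15.0, 11.0, 250.0, 14.0, 13.0, 16.0, 12.5, 13.5, 14.5]

p50 = percentile(latencies_ms, 50)
p95 = percentile(latencies_ms, 95)

# Example acceptance criteria: median under 20 ms, tail under 300 ms.
meets_sla = p50 <= 20.0 and p95 <= 300.0
```

Percentiles rather than averages are the standard choice here because a mean hides exactly the tail outliers (the 250 ms sample above) that performance acceptance criteria exist to catch.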
specialising in Go to contribute to v1 builds, define architecture, and ship mission-critical backend systems from scratch. What you’ll do: Design & scale a fault-tolerant ledger (Golang, Kafka, CockroachDB/YugabyteDB) Architect distributed, multi-region infra with five-nines reliability Contribute to backend, DevOps, and CI/CD decision-making Embed AI into engineering workflows & ops Self …
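One property any fault-tolerant ledger like the one described must have is idempotent writes: a retried or redelivered message (e.g. after a Kafka consumer restart) must not double-post. A minimal sketch of that guarantee follows; the role calls for Go, but this illustration is in Python, and the entry shape and key scheme are invented, with durability and multi-region replication out of scope.

```python
class Ledger:
    """Toy append-only ledger demonstrating idempotent writes only."""

    def __init__(self) -> None:
        self._entries: list = []
        self._seen_keys: set = set()

    def append(self, idempotency_key: str, account: str, amount_pence: int) -> bool:
        # Return False when this key was already applied: a safe retry.
        if idempotency_key in self._seen_keys:
            return False
        self._seen_keys.add(idempotency_key)
        self._entries.append({"account": account, "amount_pence": amount_pence})
        return True

    def balance(self, account: str) -> int:
        return sum(e["amount_pence"] for e in self._entries if e["account"] == account)

ledger = Ledger()
first = ledger.append("tx-1", "acct-a", 1000)
second = ledger.append("tx-1", "acct-a", 1000)  # redelivered message: ignored
```

In a real system the seen-keys set lives in the database (e.g. a unique constraint in CockroachDB) so the guarantee survives process restarts.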
on test automation, identifying test cases to be automated, writing scripts and integrating them into the development lifecycle. Engaging with the wider DevOps lifecycle tool chain, like Kubernetes and Kafka, with future DevOps opportunities for you. Collaborate with developers to improve code and streamline testing. Work with HIL and hardware interactions (no experience required). Apply below for more …
Slough, South East England, United Kingdom Hybrid / WFH Options
Client Server
Senior Data Engineer (Kafka Python AWS) London/WFH to £80k Are you a data technologist with fluent Python coding skills who enjoys working with and continually learning new technologies? You could be progressing your career in a senior, hands-on role at a "Tech for Good" company that is enabling life changing education to be accessed by millions … home once a week. About you: You have experience as a Data Engineer, working on scalable systems You have Python, Java or Scala coding skills You have experience with Kafka for data streaming including Kafka Streams and KTables You have strong SQL skills (e.g. PostgreSQL, MySQL) You have strong AWS knowledge, ideally including S3 and Lambda You have strong …
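The Kafka Streams KTables mentioned in this listing rest on stream/table duality: a table is the latest value per key, materialised by folding a changelog stream. That idea can be sketched in a few lines of Python; the event shape and keys are invented for illustration, and this toy omits everything Kafka Streams actually provides (partitioning, state stores, fault tolerance).

```python
# Hypothetical changelog stream: (key, value) records in arrival order.
events = [
    ("user-1", {"plan": "free"}),
    ("user-2", {"plan": "free"}),
    ("user-1", {"plan": "pro"}),  # later record overwrites earlier state
]

def materialise(changelog):
    """Fold a changelog into a KTable-like view: last write per key wins."""
    table: dict = {}
    for key, value in changelog:
        table[key] = value  # upsert
    return table

table = materialise(events)
```

This is why a compacted Kafka topic can serve as a durable backing store for a table: replaying the changelog from the start reconstructs exactly this per-key state.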
microservice-based Loyalty and Benefits platform, designed to be able to handle all aspects of the Loyalty and Benefits customer experience, globally. Built using modern tools such as Golang, Kafka and Docker, there is ample opportunity to drive innovation and grow knowledge and skills as an Engineer. As a Software Engineer on a Scrum team, you will be building … at least one back-end type safe programming language (Golang Preferred) · Comfortable/experienced with back-end micro-service architecture and communication, specifically REST and asynchronous messaging services (e.g., Kafka, RabbitMQ etc.) · Comfortable/experienced working within a Scrum framework as part of a team to deliver business functions and customer journeys that are tested and automated throughout … software engineering methodology (Agile, incl Scrum, Kanban, SAFe, Test-Driven Development (TDD), Behavior Driven Development (BDD) and Waterfall) Knowledge of any or all of the following technologies is desired: Kafka, Postgres, Golang, Git, gRPC, Docker, GraphQL · Experienced in continuous integration (CI), continuous deployment (CD) and continuous testing (CT), including tools such as Jenkins, Rally and/or JIRA and …
researchers, technologists, and analysts to enhance the quality, timeliness, and accessibility of data. Contribute to the evolution of modern cloud-based data infrastructure , working with tools such as Airflow, Kafka, Spark, and AWS . Monitor and troubleshoot data workflows, ensuring continuous delivery of high-quality, analysis-ready datasets. Play a visible role in enhancing the firm’s broader data … programming ability in Python (including libraries such as pandas and NumPy ) and proficiency with SQL . Confident working with ETL frameworks , data modelling principles, and modern data tools (Airflow, Kafka, Spark, AWS). Experience working with large, complex datasets from structured, high-quality environments — e.g. consulting, finance, or enterprise tech. STEM degree in Mathematics, Physics, Computer Science, Engineering, or …