london (city of london), south east england, united kingdom
Hadte Group
quickly, and delays of even milliseconds can have big consequences. Essential skills: 3+ years of experience in Python development. 3+ years with open-source real-time data feeds (Amazon Kinesis, Apache Kafka, Apache Pulsar or Redpanda). Exposure to building and managing data pipelines in production. Experience integrating serverless functions (AWS, Azure or GCP). Passion for fintech and building …
trustworthy, accessible, and reusable. Implement schemas, contracts, and observability. Support compliance with global data regulations. What you'll bring: Good working experience in data engineering. Hands-on with Kafka/Pulsar/Kinesis. Strong SQL & relational DBs (Postgres/MySQL). Coding experience in Java/Scala. Cloud (AWS preferred) + Infrastructure-as-Code familiarity. Solid understanding of ETL/ELT …
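The "schemas, contracts, and observability" requirement above can be made concrete with a minimal data-contract check that a pipeline might run before publishing a record downstream. This is a sketch only: the contract fields, the sample events, and the `validate` helper are hypothetical illustrations, not part of any system the listing describes.

```python
# Hypothetical contract for an order event; field names and types are
# illustrative, not taken from any real schema mentioned in the listing.
ORDER_CONTRACT = {"order_id": str, "amount": float, "currency": str}

def validate(event: dict, contract: dict) -> list:
    """Return a list of contract violations (an empty list means the event conforms)."""
    errors = []
    for field, expected_type in contract.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"bad type for {field}: expected {expected_type.__name__}")
    return errors

good = {"order_id": "A1", "amount": 9.99, "currency": "GBP"}
bad = {"order_id": "A2", "currency": 42}  # missing amount, wrong currency type
print(validate(good, ORDER_CONTRACT))  # []
print(validate(bad, ORDER_CONTRACT))
```

In production this kind of check is usually delegated to a schema registry with Avro or JSON Schema rather than hand-rolled, but the enforcement point, validating at the producer before the message reaches the stream, is the same.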
middlesbrough, yorkshire and the humber, united kingdom
Anaplan
to support enterprise-scale Business Planning Software solutions. Your Impact: Design, build, and operate platform capabilities supporting batch, streaming, and AI-driven workloads. Develop resilient and scalable systems using Apache Kafka, Flink, Pulsar, and cloud-native technologies. Collaborate with AI/ML teams to deploy models and enable generative AI use cases. Implement integrations with data lakes and … Your Qualifications: 8+ years of hands-on experience in software engineering, especially in platform/backend systems. Expert-level skills in Java and strong proficiency in Python. Experience with Apache Kafka, Flink, and Pulsar for building distributed data pipelines. Familiarity with scalable data storage and data lake integrations. Proven ability to integrate AI/ML models and work …
City of London, London, United Kingdom Hybrid / WFH Options
Futuria
data integrity, consistency, and accuracy across systems. Optimize data infrastructure for performance, cost efficiency, and scalability in cloud environments. Develop and manage graph-based data systems (e.g. Kuzu, Neo4j, Apache AGE) to model and query complex relationships in support of Retrieval Augmented Generation (RAG) and agentic architectures. Contribute to text retrieval pipelines involving vector embeddings and knowledge graphs, for … workflows. Proficiency with cloud platforms such as Azure, AWS, or GCP and their managed data services. Desirable: Experience with asynchronous Python programming. Experience with graph technologies (e.g., Kuzu, Neo4j, Apache AGE). Familiarity with embedding models (hosted or local): OpenAI, Cohere, etc., or HuggingFace models/sentence-transformers. Solid understanding of data modeling, warehousing, and performance optimization. Experience with … messaging middleware + streaming (e.g. NATS Jetstream, Redis Streams, Apache Kafka or Pulsar, etc.). Hands-on experience with data lakes, lakehouses, or components of the modern data stack. Exposure to MLOps tools and best practices. Exposure to workflow orchestration frameworks (e.g. Metaflow, Airflow, Dagster). Exposure to Kubernetes. Experience working with unstructured data (e.g., logs, documents, images). Awareness …
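The vector-embedding retrieval work this role describes often reduces, at its core, to ranking documents by cosine similarity against a query embedding. A minimal sketch of that step, using toy 3-dimensional vectors in place of real model embeddings (which come from a model such as sentence-transformers and have hundreds of dimensions); the `top_k` helper and `docs` names are illustrative assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query, docs, k=2):
    """Return the names of the k document embeddings most similar to the query."""
    ranked = sorted(docs.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [name for name, _ in ranked[:k]]

# Toy "embeddings": doc_a and doc_b point in nearly the same direction,
# doc_c is orthogonal to them.
docs = {"doc_a": [1.0, 0.0, 0.0], "doc_b": [0.9, 0.1, 0.0], "doc_c": [0.0, 1.0, 0.0]}
print(top_k([1.0, 0.05, 0.0], docs))  # ['doc_a', 'doc_b']
```

In a real RAG pipeline this brute-force scan is replaced by an approximate nearest-neighbour index (e.g. in a vector database), but the similarity metric being approximated is the same.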
london, south east england, united kingdom Hybrid / WFH Options
Futuria
slough, south east england, united kingdom Hybrid / WFH Options
Futuria
london (city of london), south east england, united kingdom Hybrid / WFH Options
Futuria
of network API delays. Define and own the data contracts and pipelines that feed this "ground-truth" network data from the integration layer to our core AI Service Bus (Apache Kafka). Cross-functional Collaboration: Work closely with the Scam Detection Service and AI/ML teams to define the feature vectors and data payloads needed from the network … Knowledge of (or deep, demonstrable curiosity about) telecommunications protocols and architectures. You must be comfortable talking to network engineers. Experience with high-throughput messaging or streaming platforms (e.g., Kafka, Pulsar). This is a permanent position with hybrid working of two days a week in the central London office and the rest WFH. The salary is very much dependent …
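The "feature vectors and data payloads" mentioned above would typically travel over the Kafka service bus as serialized messages with an agreed schema. A hedged sketch of what defining such a payload might look like; every field name here (`msisdn_hash`, `sim_swap_recent`, `call_count_24h`) is an assumption for illustration, not the listing's actual data contract.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical feature payload the network-integration layer might publish
# for scam-detection consumers; all field names are illustrative assumptions.
@dataclass
class CallFeatures:
    msisdn_hash: str       # hashed caller identifier, never the raw number
    sim_swap_recent: bool  # network-verified "ground truth" signal
    call_count_24h: int

def to_message(features: CallFeatures) -> bytes:
    """Serialize the feature vector as the JSON bytes a Kafka producer would send."""
    return json.dumps(asdict(features), sort_keys=True).encode("utf-8")

msg = to_message(CallFeatures("ab12", True, 7))
print(msg)
```

In practice the serialization format (Avro, Protobuf, or JSON Schema via a schema registry) is itself part of the data contract, so that producers and the Scam Detection consumers can evolve fields without breaking each other.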