backend can provide the best Customer Experience. You pride yourself on consistently high levels of test coverage, strong technical documentation and effective monitoring. Preferably exposure to technologies such as Kafka, PostgreSQL, Redis. We use Kotlin, PostgreSQL, Kafka, Redis, Datadog, Amplitude, Grafana, BigQuery, Apache Spark and more. A passion for crypto and the transformations it enables. COMPENSATION & PERKS: Full-time …
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … integrates Python and Snowflake, and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!). You'll also have exposure to streaming platforms like Apache Kafka, be able to develop and maintain ELT processes, and bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is a … Snowflake enthusiast who can write solid SQL queries, within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow; from a Cloud perspective, good AWS exposure. I'd love you to be an advocate of Agile too - these guys are massive on Agile Delivery.
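The "robust ETL" and warehouse-upsert skills this listing asks for come down to making loads idempotent, so a replayed batch never duplicates rows. A minimal sketch in plain Python, using sqlite3 as a stand-in for Snowflake (the table, columns, and rows are illustrative, not from the listing):

```python
import sqlite3

# Hypothetical staging rows, as they might arrive from an extract step.
# Note the duplicate delivery: batch jobs that retry often re-send rows.
ROWS = [
    ("order-1", "2024-01-05", 120.0),
    ("order-2", "2024-01-06", 80.5),
    ("order-1", "2024-01-05", 120.0),  # retried delivery of order-1
]

def load_idempotently(conn, rows):
    """Upsert keyed on order_id so re-running the load never creates
    duplicates -- the same effect a MERGE gives you in a warehouse."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id TEXT PRIMARY KEY, order_date TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?) "
        "ON CONFLICT(order_id) DO UPDATE SET "
        "order_date=excluded.order_date, amount=excluded.amount",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_idempotently(conn, ROWS)
load_idempotently(conn, ROWS)  # a full replay is safe: still 2 rows
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)
```

The same shape works in Snowflake with `MERGE INTO`; the key design choice is a stable natural or surrogate key to conflict on.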
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full-time on-site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java. To be Considered: Please either apply by clicking online or emailing me … SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE
Gloucester, Gloucestershire, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
across government. The Lead Data Engineer Should Have: Active eDV clearance (West) Willingness to work full-time on-site in Gloucester when required. Required experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java. Understand and interpret technical and business stakeholder needs. Manage expectations through clear … SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE
of open-source technology and multi-cloud solutions. Our vision is to become the trusted Data & AI Platform for everyone, leveraging the most popular open-source technologies like Apache Kafka, Aiven for PostgreSQL, Aiven for ClickHouse, and Aiven for OpenSearch to help companies accelerate time-to-market, drive efficiency, and build innovative solutions across any cloud. Right now, we … and post-sales capacity. Suitable candidates must have proven experience in one or more of the following areas that will be required in their day-to-day job: Apache Kafka, Apache Cassandra, ClickHouse, PostgreSQL, MySQL, OpenSearch, or Redis. The position is full-time and permanent, located in London. What You'll Do: Help customers succeed, building successful relationships from … various Linux utilities and tools (installation, securing systems, storage, etc.) and a good grasp of networking basics like DNS. A relatively deep conceptual and hands-on understanding of Apache Kafka, Apache Cassandra, ClickHouse, PostgreSQL, MySQL, OpenSearch, or Redis. Act as a subject matter expert on Aiven service offerings. Attention to detail and good problem-solving and project management skills …
into the platform. Your Impact: Build and maintain core platform capabilities that support high-throughput batch, streaming, and AI-powered workloads. Develop resilient, observable, and scalable systems using Apache Kafka, Flink, Pulsar, and cloud-native tools. Collaborate with AI/ML engineers to operationalize models and enable generative AI use cases such as prompt-based insights or automation. Deliver … engineering experience (or equivalent) with deep experience in platform/backend systems. Expert-level skills in Java, with strong proficiency in Python. Experience building distributed data pipelines using Apache Kafka, Flink, and Pulsar. Familiarity with data lakes and scalable data storage patterns. Demonstrated experience integrating with AI/ML models, including LLMs and prompt-based applications. Proven capability in …
skills in Python or another major language; writing clean, testable, production-grade ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star/snowflake, CDC … data contracts, and data cataloguing. API & Integration Fluency: Building data ingestion from REST/gRPC APIs, file drops, message queues (SQS, Kafka), and third-party SaaS integrations, with idempotency and error handling. Storage & Query Engines: Strong with RDBMS (PostgreSQL, MySQL), NoSQL (DynamoDB, Cassandra), data lakes (Parquet, ORC), and warehouse paradigms. Observability & Quality: Deep familiarity with metrics, logging, tracing, and …
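"Idempotency and error handling" for queue-based ingestion, as this listing puts it, usually means two things: skip redelivered messages, and park malformed ones rather than crash. A minimal pure-Python sketch of that consumer pattern (the message format and field names are invented for illustration; a real system would use a durable store for seen ids):

```python
import json

class Ingestor:
    """Idempotent consumption of at-least-once deliveries: each message
    carries a unique id; repeats are skipped, bad payloads are dead-lettered."""

    def __init__(self):
        self.seen = set()       # in production: a durable store, not memory
        self.rows = []          # stands in for the downstream sink
        self.dead_letter = []   # parked messages for later inspection

    def handle(self, raw):
        try:
            msg = json.loads(raw)
            key = msg["id"]
        except (json.JSONDecodeError, KeyError) as exc:
            # Park the message instead of crashing the whole pipeline.
            self.dead_letter.append((raw, repr(exc)))
            return
        if key in self.seen:    # duplicate redelivery -- ignore
            return
        self.seen.add(key)
        self.rows.append(msg)

ing = Ingestor()
for raw in ['{"id": 1, "v": "a"}',  # first delivery
            '{"id": 1, "v": "a"}',  # redelivery, skipped
            'not-json',             # malformed, dead-lettered
            '{"v": 2}']:            # missing id, dead-lettered
    ing.handle(raw)
print(len(ing.rows), len(ing.dead_letter))
```

The same skeleton applies whether the source is SQS, Kafka, or a file drop; only the polling loop around `handle` changes.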
.NET Developer (FinTech) - London - £500-550 per day (.NET, C#, Azure, SQL, Kafka, Microservices) We're working with an industry-leading currency exchange platform that is looking for a .NET Developer to come and join them for an initial 6-month contract. This person will be tasked with working alongside an existing engineering team as they move away from … role. Previous FinTech/Payments/Exchange experience would be highly advantageous but not a prerequisite for this role. Technical requirements: .NET, C#, SQL, Azure (Functions, Logic Apps, Service Bus etc.), Kafka, Docker, Microservices, CI/CD Pipelines. If you're available in the next 2 weeks then please apply here as this role requires the right person to start ASAP.
excited by the opportunity to build new things. You will have many examples of successfully delivering complex enterprise software projects. We use technologies such as C++, Java, Kotlin, Kubernetes, Kafka, Elasticsearch and AWS. You'll be comfortable using some of these to research and prototype new approaches. Natural collaborator. It takes a village to do anything worthwhile, and … systems. Familiar with Docker, Kubernetes and cloud technologies such as AWS. Experience with processing large data sets on Elasticsearch or similar data stores. Experience with event-driven architectures (CQRS, Kafka Streams etc.). Understanding of networking fundamentals. Cisco values the perspectives and skills that emerge from employees with diverse backgrounds. That's why Cisco is expanding the boundaries of discovering …
DevSecOps Engineer Consortia has partnered with an innovative Fintech company at the forefront of transforming the financial services industry. This dynamic environment offers you a unique opportunity to be part of a forward-thinking organisation that values innovation, collaboration, and …
London, South East, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
Rust Developer - Crypto exchange Up to £135,000 + Bonus Join the best-performing cryptocurrency exchange and help design and develop cutting-edge, trader-neutral Rust platforms used by over 9 million users daily. As a Rust Developer, you can …
the ground up with a deep understanding of core Java, data structures and concurrency, rather than relying on frameworks such as Spring. You have built event-driven applications using Kafka and solutions with event-streaming frameworks at scale (Flink/Kafka Streams/Spark) that go beyond basic ETL pipelines. You know how to orchestrate the deployment of …
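The "beyond basic ETL" distinction here usually means stateful stream processing, such as the windowed aggregations Kafka Streams and Flink provide. The computation itself can be sketched in a few lines of plain Python (this is the pattern only, not Kafka's API; timestamps and keys are invented):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per key inside fixed, non-overlapping (tumbling) time
    windows -- the same computation a Kafka Streams or Flink windowed
    aggregation performs, minus the distributed state and watermarks."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_ms)   # snap to the window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# (timestamp_ms, key) pairs; in a real topology these arrive from a topic.
events = [(1000, "click"), (1500, "click"), (2500, "click"), (2600, "view")]
result = tumbling_window_counts(events, window_ms=1000)
print(result)
```

What the streaming frameworks add on top of this core is fault-tolerant state, late-event handling, and partition-parallel execution, which is exactly where the "at scale" part of the requirement lives.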
InsurTech with cutting-edge machine learning and AI? At Simply Business, we're not just using data; we're continuously evolving our platform with technologies like AWS, Snowflake, and Kafka to drive real value and inform company strategy. As a leading player in the market, our mission is to remain at the forefront of data engineering, ML, and AI … continuously evolving our class-leading data and ML platform infrastructure, balancing maintenance with exciting greenfield projects. Develop and maintain our real-time model serving infrastructure, utilising technologies such as Kafka, Python, Docker, Apache Flink, Airflow, and Databricks. Actively assist in model development and debugging using tools like PyTorch, scikit-learn, MLflow, and Pandas, working with models from gradient boosting …
Database Management System (RDBMS). Web app server: familiar (experience preferred) with open-source web application servers such as Apache HTTP Server, Tomcat. Middle-tier messaging systems: experience using the following: Kafka, or any Java-based message broker; ActiveMQ, the Java-based message broker we are using, or similar. Experience in development and usage of RESTful web services … in Relational Database Management. 10 years' experience with the Angular 3+ front-end web application platform and PrimeNG UI components or a similar web framework. Experience with middle-tier messaging systems (Kafka, ActiveMQ or similar) and development and usage of RESTful web services. Active Top Secret or TS/SCI security clearance. Need to have a current poly or consent to …
Inside IR35. Support standing up multiple environments on AWS. Support the management of the AWS stack/Git pipelines across a mix of React front-end, microservices, Lambda functions, Kafka integration, a possible mix of Transit Gateway/PrivateLink, and use of Kong EE. Skills required: REQUIRED/NON-NEGOTIABLE: - Full AWS stack (inc. Lambda, SQS, SNS) - IAM management for … pipelines and users - Terraform or CloudFormation - CloudWatch - Kubernetes NICE TO HAVE: - Kafka - Kong EE - Micro-UI patterns (JWT tokenisation and passthrough) …
database systems to the next level. We're looking for a highly technical, hands-on engineer who loves to work with data plane services like Cassandra, Elasticsearch/OpenSearch, Kafka, Redis, Valkey, MySQL, PostgreSQL and is comfortable building automation around large-scale cloud-based critical systems. We'll be looking at candidate CVs with an eye on achievement what … can do for us in the future. Focus area: OpenSearch and Elasticsearch. What You'll Do: Maintain a deep understanding of the data components - including Cassandra, Elasticsearch/OpenSearch, Kafka, ZooKeeper, MySQL and PostgreSQL, Redis, Valkey, Memcache, Pulsar, SQS - and use that understanding to operate and automate properly configured clusters. Understanding of operating databases in Kubernetes and managing container … data safe, secure, and available. What You'll Need: Configuration management (Chef). Scripting in Python and bash. Experience with large-scale datastores using technologies like Cassandra, Elasticsearch/OpenSearch, Kafka, ZooKeeper, MySQL, PostgreSQL, Redis, Valkey, Memcache, Pulsar, SQS. Experience with large-scale, business-critical Linux environments. Experience operating within the cloud, preferably Amazon Web Services, GCP and OCI. Proven …
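The "scripting in Python to operate clusters" requirement often starts with automating health checks. A small sketch of the decision logic for the Elasticsearch/OpenSearch `GET _cluster/health` response (the thresholds and the decision to page are illustrative choices, not part of either product's API; a real script would fetch the JSON over HTTP first):

```python
import json

def cluster_needs_attention(health_json):
    """Decide whether an operator should be alerted, given the JSON body
    of an Elasticsearch/OpenSearch _cluster/health response."""
    health = json.loads(health_json)
    if health.get("status") == "red":
        return True   # at least one primary shard is unassigned: data at risk
    if health.get("status") == "yellow" and health.get("unassigned_shards", 0) > 0:
        return True   # replicas missing: redundancy degraded, worth a look
    return False      # green (or yellow with nothing unassigned): leave it be

# A sample response body as the health endpoint might return it.
sample = '{"status": "yellow", "unassigned_shards": 3, "number_of_nodes": 2}'
print(cluster_needs_attention(sample))
```

Wrapping checks like this in cron or a metrics exporter is the usual first step before graduating to full automation with configuration management.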
Currently, our microservices communicate via REST API calls, fostering seamless integration between different services. You will actively participate in our ongoing transition towards an event-driven architecture, utilising Kafka as a core component. We are proud to say that our engineers at ClearScore are world class and at the heart of making this mission a reality for our … millions of users. The Typelevel stack (Cats, Cats Effect, http4s, Circe), Kafka, sbt and occasionally Akka HTTP. A world-class SRE team who champion our "you build it, you run it" principle, empowering our developers to work with AWS, Kubernetes and Spinnaker. A Quality Assistance programme to build, test, release and monitor your own work. TDD and peer-reviewing …
strong commitment to user-centred design and agile delivery, and more, to deliver innovative digital services that matter. Preferred Tech Stack Expertise: Cloud Infrastructure: AWS (EKS, RDS, Aurora, ElastiCache, Kafka, IAM). Secure Hosting: Experience working with air-gapped or government-secure environments. Secrets & Identity Management: HashiCorp Vault, Keycloak. Automation: IaC, pipeline build automation, event relay tooling. Scripting: Bash, Python … on-call support. Ensure all services are compliant with security standards and support the change and release governance model. Build and maintain infrastructure components like event streaming (Kafka), databases (Aurora, RDS, Redis), identity management (Keycloak), and caching layers. Enhance and maintain CI/CD tooling and self-service developer pipelines for tenant teams. Proactively manage and resolve …
written in Dart with Flutter and available on both Android and iOS. Our NALA for Business product is web only and written in React and TypeScript. We use Postgres, Kafka, Redis and Vault. We use and leverage AWS as much as possible and we manage it with Terraform. We write unit and integration tests, do code reviews and deploy … at least 5+ years of experience building highly reliable and scalable backend services in Go. Experience with RDBMSs such as Postgres, MySQL etc. Experience with message-broker technologies such as Kafka, RabbitMQ etc., working within event-driven architectures. You have excellent knowledge of the best practices in designing, developing and deploying those services in a cloud environment. You have experience …
/Elixir code. Polyglot - All our services are built in Ruby, Elixir, GraphQL federation or TypeScript, depending on which language best suits the solution. Messaging - For communication, we use Kafka for events and gRPC or JSON for synchronous calls. Kubernetes - All our services run in Kubernetes. Migration - We are in the process of switching away from our Ruby monolith … or TypeScript. Distributed Systems - You understand how to build, deploy and maintain a globally distributed system. Event-driven architecture - Knowledge of event-driven systems and tools/protocols like Kafka and gRPC will be a plus. Experience - Have experience (3+ years) working on internal product engineering teams, developer tools, developer productivity or infrastructure products at scale. Adaptable - Are a …
Role: Data Scientist Clearance: TS/SCI Clearance Location: (Washington DC/Northern Virginia) Salary: $160k-$200k + Shares and bonus My client is an innovative company delivering cutting-edge AI and data analytics solutions through an advanced AI platform. …
data processing and reporting. In this role, you will own the reliability, performance, and operational excellence of our real-time and batch data pipelines built on AWS, Apache Flink, Kafka, and Python. You'll act as the first line of defense for data-related incidents, rapidly diagnose root causes, and implement resilient solutions that keep critical reporting systems up … call escalation for data pipeline incidents, including real-time stream failures and batch job errors. Rapidly analyze logs, metrics, and trace data to pinpoint failure points across AWS, Flink, Kafka, and Python layers. Lead post-incident reviews: identify root causes, document findings, and drive corrective actions to closure. Reliability & Monitoring: Design, implement, and maintain robust observability for data pipelines … Architecture & Automation: Collaborate with data engineering and product teams to architect scalable, fault-tolerant pipelines using AWS services (e.g., Step Functions, EMR, Lambda, Redshift) integrated with Apache Flink and Kafka. Troubleshoot and maintain Python-based applications. Harden CI/CD for data jobs: implement automated testing of data schemas, versioned Flink jobs, and migration scripts. Performance Optimization: Profile and …
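A large share of the "resilient solutions" this role describes is simply wrapping pipeline steps so transient failures (a broker disconnect, a throttled API) retry with backoff instead of paging on-call. A minimal sketch of that wrapper in plain Python (the step function and its failure mode are invented for illustration):

```python
import time

def run_with_retries(step, max_attempts=3, base_delay=0.0):
    """Retry a flaky pipeline step with exponential backoff; re-raise
    only after the final attempt so permanent failures still surface."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise                              # exhausted: escalate
            time.sleep(base_delay * (2 ** (attempt - 1)))

# A hypothetical step that fails twice before succeeding, as a transient
# stream disconnect might.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient broker disconnect")
    return "ok"

result = run_with_retries(flaky)
print(result, calls["n"])
```

In production you would cap total elapsed time, add jitter to the delay, and emit a metric per retry so the observability stack can alert on retry-rate spikes before they become outages.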
person insight into their missions, workflows, and perspectives, then utilize that knowledge to inform the platform's design. Core technical tasks include: REST API development in Java, working within Kafka streams to process and transform data, and general Java development to build and maintain the product. Responsibilities: Contribute to the development of enterprise-grade software solutions. Build and maintain … proficient with the project's graph database and develop complex database queries. Required Skills: Experience using Java to build enterprise products and applications. Knowledge of streaming analytic platforms like Kafka, RabbitMQ, Spark, etc. Familiarity with Extract, Transform, Load (ETL) software patterns to ingest large and complex datasets. Familiarity with Git and GitLab CI/CD. Understanding of common Enterprise … Integration Patterns (EIP) and how to apply them. Desired Skills: Experience with graph databases such as Neo4j. Experience building real-time data processing applications using streaming libraries like Kafka Streams. Experience modeling data and relationships in graph databases. Experience with networking concepts, protocols, and analysis (routers, switches, etc.). Knowledge of SIGINT collection and analysis systems. Experience with production …
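The ETL software pattern this listing references is commonly structured as composable extract/transform/load stages, each streaming records to the next. A language-agnostic sketch of the pattern using Python generators (the record shape and normalisation rules are invented; the listing's own stack is Java):

```python
def extract(records):
    """Extract stage: yield raw records one at a time (stream-friendly,
    so the pipeline never holds the whole dataset in memory)."""
    yield from records

def transform(stream):
    """Transform stage: normalise and filter, the map/filter step a
    Kafka Streams topology would express as a processor chain."""
    for rec in stream:
        name = rec.get("name", "").strip().lower()
        if name:                         # drop records with no usable key
            yield {"name": name, "len": len(name)}

def load(stream):
    """Load stage: drain into the sink; a list stands in for a real
    datastore such as a graph database."""
    return list(stream)

raw = [{"name": "  Alice "}, {"name": ""}, {"name": "BOB"}]
out = load(transform(extract(raw)))
print(out)
```

The value of structuring ingest this way is that each stage is independently testable, and swapping the sink (list, Neo4j, a Kafka topic) only touches `load`.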