Croydon, Surrey, England, United Kingdom Hybrid / WFH Options
eTeam Inc
Job Title: Senior Engineer with Node.js and Python, or Back End (Java, ideally with Kafka). Max rate: £537/day (Umbrella). Location: Croydon (Hybrid). Clearance required: SC transfer (active SC only). Contract duration: 5 months. Experience of Java. Experience of Spring Framework or equivalent. Knowledge of software design patterns and when to apply them. Excellent knowledge of development …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
… container orchestration, Kubernetes, Istio and GKE in particular. Knowledge of cloud-native storage: GCS, S3 and filer solutions. Exposure to data services: Cloud SQL, managed databases, MongoDB. Messaging systems: Kafka, RabbitMQ and EMS; API gateways such as Apigee. Observability tools: Prometheus, Grafana, Cloud Monitoring. IAM and secrets management: dynamic secrets, Vault. Test-driven development and automated testing frameworks. Several years …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… governance and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks.
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
… state architecture (Tessa). Strong knowledge and experience of Agile testing processes and methodologies, preferably Scaled Agile Framework or similar. Good experience and skills with streaming technologies such as Confluent Kafka, StreamSets, MongoDB and IBM CDC. Knowledge of public/enterprise cloud technologies (AWS EC2, S3, GCP, Azure) is advantageous but not required. Some skills/experience with automated testing …
… and observability frameworks, including lineage tracking, SLAs, and data quality monitoring. Familiarity with modern data lake table formats such as Delta Lake, Iceberg, or Hudi. Background in stream processing (Kafka, Flink, or similar ecosystems). Exposure to containerisation and orchestration technologies such as Docker and Kubernetes.
… pipelines (GitHub Actions, GitLab CI, CircleCI, etc.). Nice to have: Experience with async frameworks (FastAPI, Celery, or asyncio-based work). Exposure to event-driven architectures, message queues (Kafka, RabbitMQ) or pub/sub. Knowledge of observability tooling (Prometheus, Grafana, Sentry, ELK). Understanding of security best practices for web services (OWASP, authentication/authorization patterns). Experience …
… across research and technology teams. Exposure to low-latency or real-time systems. Experience with cloud infrastructure (AWS, GCP, or Azure). Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask. Knowledge of equities, futures, or FX markets. Company: Rapidly growing hedge fund with offices globally, including London. Salary & Benefits: The salary range/rates of pay …
… scale API programmes. Expertise in REST, OAuth2, OpenAPI, and API security. Familiarity with banking processes and Open Banking standards. Cloud (Azure/GCP), CI/CD, and integration tools (Kafka, IBM MQ). Strong stakeholder management and strategic thinking. If the role is of interest, please apply with your updated CV.
… understanding of REST, JSON, OpenAPI, and event-driven integration. Experience with API platforms such as Azure API Management or Apigee. Knowledge of secure messaging and integration tools (e.g. Kafka, IBM MQ). Experience designing solutions on major cloud platforms (preferably Azure). CI/CD pipelines and DevOps practices are essential.
… data at scale. This is a hands-on engineering role that blends software craftsmanship with data architecture expertise. Key responsibilities: Design and implement high-throughput data streaming solutions using Kafka, Flink, or Confluent. Build and maintain scalable backend systems in Python or Scala, following clean-code and testing principles. Develop tools and frameworks for data governance, privacy, and quality … advanced data use cases. Contribute to an engineering culture that values testing, peer reviews, and automation-first principles. What You'll Bring: Strong experience in streaming technologies such as Kafka, Flink, or Confluent. Advanced proficiency in Python or Scala, with a solid grasp of software engineering fundamentals. Proven ability to design, deploy, and scale production-grade data platforms and …
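The streaming responsibilities described above (high-throughput pipelines with a data-quality gate, built in Python or Scala) can be sketched in miniature. This is a hedged illustration only: it uses Python's standard-library `queue.Queue` as an in-memory stand-in for a Kafka topic (no broker or client library is assumed), and the record fields and function names are hypothetical.

```python
import json
import queue

# In-memory stand-in for a Kafka topic (hypothetical; a real pipeline
# would use a Kafka client library against a running broker).
topic = queue.Queue()

def produce(topic, record):
    """Serialise a record to JSON and append it to the topic."""
    topic.put(json.dumps(record))

def consume(topic):
    """Drain the topic, deserialising and filtering records.

    Mirrors a simple streaming transform with a data-quality gate:
    events without a non-empty user_id field are dropped.
    """
    out = []
    while not topic.empty():
        record = json.loads(topic.get())
        if record.get("user_id"):  # quality gate: drop malformed events
            out.append(record)
    return out

produce(topic, {"user_id": "u1", "event": "click"})
produce(topic, {"user_id": "", "event": "click"})  # fails the quality gate
produce(topic, {"user_id": "u2", "event": "view"})

events = consume(topic)
print(len(events))  # 2 records survive the quality gate
```

In a production system the quality gate would typically live in a stream processor (e.g. a Flink job or Kafka Streams topology) rather than in the consumer, so that downstream topics only ever contain validated events.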