London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and advocate for scalable, efficient data processes and platform enhancements Tech Environment: Python, SQL, Spark, Airflow, dbt, Snowflake, Postgres, AWS (S3), Docker, Terraform Exposure to Apache Iceberg, streaming tools (Kafka, Kinesis), and ML pipelines is a bonus What We're Looking For: 5+ years in Data Engineering, including 2+ years in a leadership or management role Experience designing and More ❯
Data Science, Engineering, or a related field. Strong programming skills in languages such as Python, SQL, or Java. Familiarity with data processing frameworks and tools (e.g., Apache Spark, Hadoop, Kafka) is a plus. Basic understanding of cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of database systems (e.g., MySQL, PostgreSQL, MongoDB) and data warehousing concepts. More ❯
Comfortable balancing technical excellence with business priorities and constraints Nice to have Experience building Data Mesh or Data Lake architectures. Familiarity with Kubernetes, Docker, and real-time streaming (e.g. Kafka, Kinesis). Exposure to ML engineering pipelines or MLOps frameworks. What's it like to work at Zego? Joining Zego is a career-defining move. People go further here More ❯
other testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache Airflow Preferred: Familiarity with Java Preferred: Experience in one or more relevant financial areas (market data, order management, algorithmic trading More ❯
or Java. Client-side web apps are written in React, and some services in Clojure, Java and Go. Our platform consists of: Multiple Kubernetes clusters for container orchestration Apache Kafka and Redis (shortly Postgres) for event messaging Postgres for data storage OpenStack Swift for Object storage Juniper & Cisco networking devices A number of internally written tools for managing the More ❯
Role: Java Developer Duration: Till end of this year (Extendable) Location: Birmingham, UK (Onsite) Description: 4+ years of Java development experience (Java 11+), Spring Experience working with microservices architecture Kafka for messaging - to some extent or equivalent MongoDB - to some extent or equivalent Strong understanding of SDLC Nice to haves: GitLab - experience with others is generally fine, they can More ❯
own solution architecture delivery. Ensure robust governance, security, and performance standards. Requirements: Recent experience as a Lead Data Solution Architect or equivalent. Skilled in streaming/event-driven architectures (Kafka, Confluent). Deep knowledge of Databricks, Unity Catalog, and Snowflake. Understanding of Data Mesh/Fabric and product-led approaches. Familiarity with cloud platforms (AWS, Azure, GCP). Leadership More ❯
and techniques of computer science, engineering, and mathematical analysis to the development of complex architectures We encourage you to apply if you have experience with any of the following: Kafka (required) Linux (required) Java (required) React (required) Docker/Kubernetes (required) DevSecOps (preferred) Cloud Services (preferably AWS) MPLS/Networking (preferred) GitLab/Jenkins (preferred) Jira (preferred) Required Education More ❯
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
NonStop Consulting
Implementing APIs for internal and external use. Desirable: Knowledge of MongoDB/Jackson/JAX-RS/mocking frameworks such as Mockito, and messaging platforms (ActiveMQ or Kafka). Interviews will start as soon as good candidates are available; if you are interested, do not delay and forward your updated CV today. If you would like More ❯
Start date - 7th August. Location TBD (Newcastle, Leeds or Manchester) - weekly travel. Inside IR35. Front end: Must be WCAG 2.2 AAA compliant (accessible). Backend: Mix of microservices, lambda functions, and (Kafka) events. Some ETL and interaction with RPA. Likely to use microsites and/or UI fragments. Responsibilities: Set and enforce code quality standards (e.g. SonarQube unit test coverage %) Elaborate, estimate and More ❯
charts • Deploying and working with Argo CD and Argo Workflows • Creating GitLab CI jobs • Familiar with S3/MinIO and related APIs • Publishing and subscriber queues like RabbitMQ or Kafka We have deep experience in signal processing application and common services development for the National agencies of the Intelligence Community (IC) and the Department of Defense. We develop and More ❯
UI) Experience testing Windows applications, Oracle, and SQL Server databases Understanding of Agile (Scrum/Kanban) and experience transitioning from Waterfall Familiarity with Jira, Zephyr, and Confluence Exposure to Kafka, Azure, Jenkins, and Java What We're Looking For: 7+ years' experience in technology-focused testing roles Proven ability to test complex, high-transaction user systems Strong knowledge of More ❯
flight dynamics models Experience developing software using C++ and Python Experience working with a large-scale legacy software system Experience with tools such as Confluence, Eclipse, Jira, Jenkins, JUnit, Kafka, and Spring Boot Responsibilities: Design, develop, test, deliver, and maintain software for satellite ground systems Modernize software systems and upgrade Commercial off-the-shelf (COTS) and Free and open More ❯
preventive actions. Maintain service dashboards, alerts, and incident tooling (e.g., PagerDuty, Datadog). Technical Expertise required for this engagement: Guide operational practices across services built using Java (Spring Boot) , Kafka , MongoDB and related technologies. Oversee monitoring, observability, and performance tuning using Datadog , ELK , Prometheus , or similar tooling. Problem Management & Root Cause Elimination required: Lead proactive and reactive problem management More ❯
data-driven decision-making What You'll Bring Proven experience in cloud-based data engineering (Azure, Databricks) Strong Python/PySpark skills and DevOps automation experience Familiarity with Kubernetes, Kafka/Event Hub, and AI/ML platform integration Certifications in Azure, Databricks, or data governance tools (a plus) A collaborative mindset and a passion for continuous improvement What More ❯
or ideally commodities would be of particular interest. The ideal Senior .NET Developer will have: Minimum 5+ years C# and .NET development experience Experience with enterprise messaging tools, e.g. Kafka, Azure Service Bus, etc. Experience working within a trading environment (energy or commodities preferred) Strong experience with distributed architecture and modern CI/CD practices (Docker, Kubernetes) Ability to More ❯
St. Albans, Hertfordshire, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
enhancements to complex Payments and client systems within a microservices environment (300 services). You'll be working with a modern tech stack using C# .Net Core, AWS, Kubernetes, Kafka, Redis and TypeScript/Angular; using the right tool for the job, you'll be able to pick up new technologies and make recommendations for improvements. Location/WFH More ❯
Forta network Familiarity with Forta CLI, Forta Explorer, and Alert API Exposure to security tooling and real-time detection systems Experience with message queue systems and alert forwarding (e.g., Kafka, Webhooks) Previous work in blockchain/crypto environments Oneida Technical Solutions is an equal opportunity employer. Qualified candidates will be considered without regard to legally protected characteristics. More ❯
/planning with a colleague and raise an RFC Our Tech Stack Our services are written with C# on .NET 8 We use a fair bit of SQL Server, Kafka and RabbitMQ Azure DevOps - we are big fans of Azure Pipelines! Some of our services are migrating away from TeamCity and Octopus Deploy Our observability stack is Splunk, Grafana More ❯
get more value from model outputs as needs evolve. Requirements Proven experience designing and building robust backend systems using technologies such as Node.js, Python, MongoDB, PostgreSQL, Redis, RabbitMQ, and Kafka Demonstrated ability to scale production systems in cloud-native environments, familiarity with Kubernetes, NGINX, and container orchestration Experience building and maintaining real-time applications, including working with WebSockets and More ❯
techniques and performance optimisation. • Familiarity with data governance, security, and privacy best practices. • Experience with network-centric datasets (fiber, GPON, ethernet, Wi-Fi telemetry). • Exposure to streaming technologies (Kafka, Event Hubs) and real-time analytics. • Knowledge of Machine Learning Ops (MLflow, Databricks). Deadline: ASAP Contract Type: Full Time Location: London Interested? The full job specification can be More ❯
to travel to conferences, and dedicated time for your personal development What you'll be working with: •Backend: Distributed, event-driven core Java (90% of the code-base), MySQL, Kafka •Data analytics: Python & Jupyter notebooks, Parquet, Docker •Testing: JUnit, JMH, JCStress, Jenkins, Selenium, many in-house tools •OS: Linux (Fedora for development, Rocky in production) The LMAX way is More ❯