ETL (Extract, Transform, Load) processes and data warehousing. 3. Strong understanding of SQL for data querying and validation. 4. Knowledge of big data technologies such as Hadoop, Spark, or Kafka is a plus. 5. Familiarity with scripting languages like Python, Java, or shell scripting. 6. Excellent analytical and problem-solving skills with keen attention to detail. 7. Ability More ❯
collaborating with cross-functional teams. Strong background and experience in data ingestion, transformation, modeling, and performance tuning. Should have experience in designing and developing dashboards. Strong knowledge of Hadoop, Kafka, and SQL/NoSQL. Should have experience in creating a roadmap to improve platform observability. Experience in leading mid-scale teams with strong communication skills. Experience in Machine Learning and GCP More ❯
valuation engines. Strong knowledge of object-oriented programming, data structures, and design patterns. Familiarity with market risk, credit risk, or counterparty risk concepts. Experience with messaging systems (e.g., Solace, Kafka, or RabbitMQ) and distributed architecture. Solid understanding of multi-threaded and low-latency system design. Exposure to quant libraries, risk factor decomposition, or sensitivities is a strong plus. More ❯
Modular provisioning, testing, and deployment patterns Kubernetes: Workload orchestration and container management CI/CD: GitHub Actions or Azure DevOps pipelines with end-to-end automation Event-Driven Architecture: Kafka or similar messaging systems Monitoring & Observability: Azure Monitor, OpenTelemetry, Prometheus, etc. Secure-by-Design Practices: Policy as Code, automated validation, compliance controls Nice to Haves Experience in regulated More ❯
A culture of engineering excellence driven by mentoring and high-quality practices. Preferred Experience Databricks in a SaaS environment, Spark, Python, and database technologies. Event-driven and distributed systems (Kafka, AWS SNS/SQS, Java, Python). Data Governance, Data Lakehouse/Data Intelligence platforms. AI software delivery and AI data preparation. More ❯
Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion. Experience: Minimum 10 years. Strong knowledge of Hadoop, Kafka, and SQL/NoSQL. Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems. Should be able to work independently with minimal help/guidance. Good More ❯
pipelines. Define non-functional requirements and performance acceptance criteria. Analyze performance test results and production metrics to identify trends, risks, and improvements. Work with modern technologies including Docker, Kubernetes, Kafka, and NoSQL databases. Mentor junior testers and drive continuous improvement in performance testing practices. What we’re looking for: 5+ years’ experience in non-functional/performance testing for More ❯
specialising in Go to contribute to v1 builds, define architecture, and ship mission-critical backend systems from scratch. What you’ll do: Design & scale a fault-tolerant ledger (Golang, Kafka, CockroachDB/YugabyteDB) Architect distributed, multi-region infra with five-nines reliability Contribute to backend, DevOps, and CI/CD decision-making Embed AI into engineering workflows & ops Self More ❯
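The fault-tolerant ledger design this role describes (events flowing through Kafka into a distributed SQL store such as CockroachDB or YugabyteDB) hinges on idempotent writes: a consumer that crashes and replays must not double-post. A minimal sketch in plain Python, with an in-memory dict standing in for the broker and database; all names are illustrative, not the employer's actual code:

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    """Illustrative in-memory ledger with idempotent, replay-safe writes."""
    balances: dict = field(default_factory=dict)
    applied: set = field(default_factory=set)  # transaction ids already applied

    def apply(self, txn_id: str, account: str, amount: int) -> bool:
        # Deduplicate by transaction id so at-least-once delivery (e.g. a
        # Kafka consumer retrying after a crash) cannot post the same
        # entry twice.
        if txn_id in self.applied:
            return False
        self.balances[account] = self.balances.get(account, 0) + amount
        self.applied.add(txn_id)
        return True
```

Replaying the same transaction id is a no-op, which is what makes the ledger safe under retries and multi-region failover.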
Slough, South East England, United Kingdom (Hybrid / WFH Options)
Talent
based services using Java and Node.js. Build and maintain microservices and UI components, ensuring seamless integration between front-end interfaces and back-end APIs. Implement event-driven solutions using Kafka and other messaging technologies. Develop, deploy, and manage scalable applications on AWS cloud infrastructure. Work with both NoSQL (MongoDB) and SQL databases to optimize data management. Utilize Docker and … teams. Extensive hands-on experience developing modern cloud-native applications using Java and Node.js. Strong background in microservices architecture, including both front-end (UI) and API components. Proficiency in Kafka and event-driven design patterns. Deep understanding of AWS cloud services and architecture best practices. Strong experience with NoSQL (MongoDB) and SQL databases. Expertise in containerization with Docker and More ❯
on test automation, identifying test cases to be automated, writing scripts and integrating them into the development lifecycle. Engaging with the wider DevOps lifecycle toolchain, such as Kubernetes and Kafka, with future DevOps opportunities for you. Collaborate with developers to improve code and streamline testing. Work with HIL and hardware interactions (no experience required). Apply below for more More ❯
microservice-based Loyalty and Benefits platform, designed to be able to handle all aspects of the Loyalty and Benefits customer experience, globally. Built using modern tools such as Golang, Kafka and Docker, there is ample opportunity to drive innovation and grow knowledge and skills as an Engineer. As a Software Engineer on a Scrum team, you will be building … at least one back-end type-safe programming language (Golang preferred) · Comfortable/experienced with back-end microservice architecture and communication, specifically REST and asynchronous messaging services (e.g., Kafka, RabbitMQ etc.) · Comfortable/experienced within a Scrum framework, working as part of a team to deliver business functions and customer journeys that are tested and automated throughout … software engineering methodology (Agile, incl. Scrum, Kanban, SAFe, Test-Driven Development (TDD), Behavior-Driven Development (BDD) and Waterfall) · Knowledge of any or all of the following technologies is desired: Kafka, Postgres, Golang, Git, gRPC, Docker, GraphQL · Experienced in continuous integration (CI), continuous deployment (CD) and continuous testing (CT), including tools such as Jenkins, Rally and/or JIRA and More ❯
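The asynchronous messaging pattern this role asks for (producers publishing events without knowing who consumes them, a broker such as Kafka or RabbitMQ in between) can be reduced to a tiny in-memory sketch. This is purely illustrative: a real service would use a Kafka client library, and the topic and payload names here are invented.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Tiny in-memory stand-in for a broker such as Kafka or RabbitMQ."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        # Consumers register interest in a topic, not in a specific producer.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # The producer fires and forgets; every subscriber gets the event.
        for handler in self._subscribers[topic]:
            handler(event)

received = []
bus = EventBus()
bus.subscribe("loyalty.points", received.append)
bus.publish("loyalty.points", {"member": "m1", "points": 50})
```

The decoupling shown here is what distinguishes event-driven integration from the synchronous REST calls the ad also mentions: producers and consumers can be deployed and scaled independently.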
Slough, South East England, United Kingdom (Hybrid / WFH Options)
Client Server
quality, observability, scalability and security are embedded into high-quality, high-impact releases. You'll be working with a modern, cloud native tech stack using Java, Spring Boot, AWS, Kafka and CI/CD to build highly scalable, distributed systems with 24/7 availability. Location/WFH: There's a hybrid model with two days a week work … You have advanced experience of building cloud-native, distributed systems using Java and Spring Boot You have a strong knowledge of AWS including Amazon EKS You have experience with Kafka and event driven architectures You're collaborative and pragmatic with great communication skills, you're comfortable working with other teams and business stakeholders Apply now to find out more More ❯
Slough, South East England, United Kingdom (Hybrid / WFH Options)
Chapter 2
the operating platforms for energy retailers globally. Our tech stack is very fluid, but broadly you can find yourself working with: TypeScript, Node.js, Terraform, GitHub Actions, Kubernetes, Docker, AWS, Kafka, GraphQL Responsibilities Contribute to the design, development, and maintenance of scalable backend services using TypeScript. Help build event-driven systems that process data reliably and in real-time. … inclusive team culture. About You Experience developing backend applications — ideally using TypeScript, or a strong interest in learning it. Some familiarity with event-driven systems and message queues (e.g. Kafka, SQS). Hands-on experience with cloud technologies — AWS and serverless a plus. Comfortable working with databases (SQL or NoSQL) and understanding trade-offs. A collaborative team member who More ❯
Slough, South East England, United Kingdom (Hybrid / WFH Options)
Opensourced® Agency
pipelines Leading initiatives in infrastructure-as-code, CI/CD, and platform monitoring Enabling robust security practices across DevOps delivery teams Core Skills & Experience Strong experience with AWS MSK (Kafka) and secure AWS environments Proven production experience with Kubernetes, Docker, and Helm Proficiency with Terraform and CI/CD pipelines (Drone/GitLab) Excellent understanding of Kafka internals More ❯
Slough, South East England, United Kingdom (Hybrid / WFH Options)
Client Server
Senior Data Engineer (AWS, Kafka, Python) London/WFH to £85k. Are you a tech-savvy Data Engineer with AWS expertise combined with client-facing skills? You could be joining a global technology consultancy with a range of banking, financial services and insurance clients in a senior, hands-on Data Engineer role. As a Senior Data Engineer you will … design and build end-to-end real-time data pipelines using AWS native tools, Kafka and modern data architectures, applying AWS Well-Architected Principles to ensure scalability, security and resilience. You’ll collaborate directly with clients to analyse requirements, define solutions and deliver production grade systems, leading the development of robust, well tested and fault tolerant data engineering solutions. … financial services environments You have expertise with AWS including Lake Formation and transformation layers You have strong Python coding skills You have experience with real-time data streaming using Kafka You’re collaborative and pragmatic with excellent communication and stakeholder management skills You’re comfortable taking ownership of projects and working end-to-end You have a good knowledge More ❯
pipelines across Bronze → Silver → Gold layers, ensuring quality, scalability, and reliability. Develop and optimise Spark (PySpark) jobs for large-scale distributed processing. Design and implement streaming data pipelines with Kafka/MSK, applying best practices for late event handling and throughput. Use Terraform and CI/CD pipelines (GitHub Actions or similar) to manage infrastructure and automate deployments. (For … Expected Proven experience delivering end-to-end data pipelines in Databricks and Spark environments. Strong understanding of data modelling, schema evolution, and data contract management. Hands-on experience with Kafka, streaming architectures, and real-time processing principles. Proficiency with Docker, Terraform, and cloud platforms (AWS, GCP, or Azure) for scalable data infrastructure. Demonstrated ability to lead, coach, and elevate More ❯
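The "late event handling" this listing calls out is usually solved with event-time watermarks: the pipeline tracks how far event time has progressed and discards (or dead-letters) records that arrive after an allowed-lateness bound. The Kafka/Spark version of this lives in Structured Streaming's `withWatermark`; the sketch below reduces the same idea to plain Python, with invented parameter names, to show the mechanics.

```python
def window_counts(events, window_s=60, allowed_lateness_s=30):
    """Count events into fixed event-time windows, dropping late arrivals.

    `events` is an iterable of (event_time_seconds, value) pairs in arrival
    order. The watermark trails the maximum event time seen so far by
    `allowed_lateness_s`; anything older than the watermark is too late to
    be assigned to a window and is counted as dropped instead.
    """
    counts, dropped = {}, 0
    watermark = float("-inf")
    for ts, _value in events:
        # Advance the watermark as newer events push event time forward.
        watermark = max(watermark, ts - allowed_lateness_s)
        if ts < watermark:
            dropped += 1  # too late: a real pipeline would dead-letter this
            continue
        start = (ts // window_s) * window_s  # start of the tumbling window
        counts[start] = counts.get(start, 0) + 1
    return counts, dropped
```

For example, with a 60s window and 30s lateness, an event timestamped 20 that arrives after one timestamped 65 is dropped, because the watermark has already advanced past it.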
looking for a Lead Site Reliability Engineer to support and scale their Video on Demand (VOD) infrastructure. You’ll work across modern tech stacks including AWS, GCP, Cassandra, and Kafka, helping deliver reliable, high-performance systems used by millions. What you’ll do Lead project delivery while supporting day-to-day operations and incident management Build and manage infrastructure … functional environment What you’ll bring Strong Linux administration skills (Ubuntu preferred) Hands-on experience with AWS and GCP Proficiency in Terraform, Ansible, Jenkins, or GitLab CI Knowledge of Kafka, Cassandra, and relational or NoSQL databases Scripting skills in Python, Bash, Go, or Java Familiarity with monitoring tools like Prometheus, Nagios, or Icinga Understanding of networking fundamentals and virtualisation More ❯
researchers, technologists, and analysts to enhance the quality, timeliness, and accessibility of data. Contribute to the evolution of modern cloud-based data infrastructure , working with tools such as Airflow, Kafka, Spark, and AWS . Monitor and troubleshoot data workflows, ensuring continuous delivery of high-quality, analysis-ready datasets. Play a visible role in enhancing the firm’s broader data … programming ability in Python (including libraries such as pandas and NumPy ) and proficiency with SQL . Confident working with ETL frameworks , data modelling principles, and modern data tools (Airflow, Kafka, Spark, AWS). Experience working with large, complex datasets from structured, high-quality environments — e.g. consulting, finance, or enterprise tech. STEM degree in Mathematics, Physics, Computer Science, Engineering, or More ❯
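The extract-transform-load shape this role describes (normally built with pandas and orchestrated by Airflow) can be sketched with only the standard library: pull raw records in, normalise and validate them, then load an analysis-ready table. Table and column names below are invented for the example.

```python
import csv
import io
import sqlite3

def run_etl(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Minimal ETL pass: CSV in, clean relational table out."""
    # Extract: parse the raw CSV into dict rows.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: normalise names, cast prices, drop rows with no price.
    clean = [(r["name"].strip().lower(), float(r["price"]))
             for r in rows if r["price"]]
    # Load: write the analysis-ready table and return the row count.
    conn.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price REAL)")
    conn.executemany("INSERT INTO products VALUES (?, ?)", clean)
    return len(clean)

raw = "name,price\n Widget ,9.99\nGadget,\nDoodad,4.50\n"
conn = sqlite3.connect(":memory:")
loaded = run_etl(raw, conn)
```

In production the extract step would read from Kafka or S3 and Airflow would schedule and retry each stage, but the extract/transform/load separation is the same.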