Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
APIs, GraphQL, and microservices architecture. Proficiency with DevOps tools and CI/CD pipelines (Docker, Kubernetes, Jenkins, Bitbucket Pipelines). Experience in data processing (Kafka, Redis Streams) and analytics (Superset, Elasticsearch). Excellent problem-solving skills and the ability to work in a fast-paced environment. Great communication skills to …
Frontend Frameworks: Basic knowledge of TypeScript and ReactJS for building scalable and maintainable web applications. Message Queues: Understanding of messaging systems such as RabbitMQ, Kafka, or ActiveMQ for handling asynchronous communication. Spring Boot: Experience with Spring Boot for developing scalable, efficient applications with a microservices architecture. Due to the sensitivity …
environments. Frontend exposure (e.g., React) – not essential but beneficial. Familiarity with Kotlin or openness to working with Kotlin-based services. Messaging systems such as Kafka or RabbitMQ. Cloud platform experience (AWS, GCP, or Azure). Understanding of DevOps and infrastructure-as-code tools (e.g., Terraform, Ansible). Exposure to …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
areas, please get in touch: Spring Boot microservices; React and Node.js frontend web application development; database technologies, e.g. PostgreSQL; messaging technologies, e.g. ActiveMQ or Kafka; Continuous Integration and Continuous Deployment; cloud platforms, e.g. AWS, and Kubernetes platforms, e.g. OpenShift. Who you'll be working with: Working in an agile …
Employment Type: Permanent, Part Time, Work From Home
working in regulated environments. React or frontend experience – not mandatory but useful. Kotlin – experience or willingness to work with Kotlin-based services. Experience with Kafka, RabbitMQ, or other messaging systems. Knowledge of cloud platforms (AWS, GCP, or Azure). Familiarity with DevOps practices and infrastructure as code (Terraform, Ansible, etc.) …
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
Experience working in government, defence, or highly regulated industries with knowledge of relevant standards. Experience with additional data processing and ETL tools like Apache Kafka, Spark, or Hadoop. Familiarity with containerization and orchestration tools such as Docker and Kubernetes. Experience with monitoring and alerting tools such as Prometheus, Grafana …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
team technically. Experienced in designing and developing full-stack solutions with Java, Spring Boot, and ideally React (or similar). Microservices architecture. Messaging technologies, e.g. Kafka, RabbitMQ, or ActiveMQ. Familiar with cloud platforms (AWS, Kubernetes, Docker). Experienced in working within an Agile product team alongside scrum masters and product owners …
and consistently apply best practices. Qualifications for Software Engineer: Hands-on experience working with technologies like Hadoop, Hive, Pig, Oozie, MapReduce, Spark, Sqoop, Kafka, Flume, etc. Strong DevOps focus and experience building and deploying infrastructure with cloud deployment technologies like Ansible, Chef, Puppet, etc. Experience with test-driven …
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Backend Technologies: Highly experienced in developing Java code, along with deep expertise in Spring Boot or Quarkus/GraalVM. Hands-on with messaging systems (Kafka, RabbitMQ, Google Pub/Sub) and SQL databases. CI/CD pipelines and Agile development: Knowledge of continuous integration and delivery tools/practices, Scrum …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
a strong focus on data accuracy, quality, and reliability. Desirable (nice to have): Background in defence, government, or highly regulated sectors. Familiarity with Apache Kafka, Spark, or Hadoop. Experience with Docker and Kubernetes. Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK. Understanding of machine learning …
experience architecting large-scale distributed systems, particularly on AWS. The ideal candidate will bring deep expertise in designing secure, scalable infrastructures using AWS, Kubernetes, Kafka, and Java-based microservices. You’ll be responsible for shaping technical strategy, leading architecture design, and mentoring teams across cloud transformation projects. A strong …
at its core. What will the Lead AWS Cloud Architect be doing? Design secure, scalable AWS cloud architectures using modern patterns. Manage Kubernetes and Kafka for containerisation and real-time systems. Implement CI/CD and automation, and optimise performance and cost. Ensure cloud security, compliance, and risk mitigation. Lead …
Bristol, England, United Kingdom Hybrid / WFH Options
PA Consulting
background in API design and management, with experience in REST, SOAP, GraphQL, and related technologies. Technical Skills: Experience with middleware technologies, message brokers (e.g., Kafka, RabbitMQ), and ESB (Enterprise Service Bus) architectures. Proficiency in integration patterns such as publish/subscribe, message queuing, event-driven architectures, and orchestration/…
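The integration patterns named above (publish/subscribe, message queuing) are broker-agnostic ideas that systems like Kafka and RabbitMQ implement at scale. As a rough in-memory sketch of publish/subscribe only – the class and method names below are illustrative, not any real client API:

```python
from collections import defaultdict
from typing import Any, Callable

class Broker:
    """Toy in-memory broker illustrating the publish/subscribe pattern."""

    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        # Register a handler to be invoked for every message on `topic`.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        # Deliver to every subscriber; the publisher never knows who is
        # listening, which is the decoupling point of pub/sub.
        for handler in self._subscribers[topic]:
            handler(message)

broker = Broker()
received: list[str] = []
broker.subscribe("orders", received.append)
broker.publish("orders", "order-123")
broker.publish("payments", "ignored by the orders subscriber")
```

A real broker adds what this sketch deliberately omits: durable storage, consumer offsets, delivery guarantees, and network transport.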
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Bring deep expertise in .NET/C#, AWS, microservices, and distributed systems. Strong leadership and mentoring skills. Proficiency with messaging systems (SNS/SQS, Kafka), CI/CD, and container orchestration (EKS/Kubernetes). Passion for craftsmanship and engineering excellence. Bonus if you have experience in healthcare/…
experience in tools like Snowflake, dbt, SQL Server, and programming languages such as Python, Java, or Scala. Proficient in big data tools (e.g., Spark, Kafka), cloud platforms (AWS, Azure, GCP), and embedding AI/GenAI into scalable data infrastructures. Strong stakeholder engagement and the ability to translate technical solutions …
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
Senior Data Engineer (GCP/Kafka). Apply locations: Bristol Harbourside; London, 25 Gresham Street. Time type: Full time. Posted on: Posted Yesterday. Time left to apply: End Date: May 12, 2025 (25 days left to apply). Job requisition id: 111909. End Date: Sunday 11 May 2025. Salary Range … build scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you’ll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers … of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP “Professional Data Engineer”. Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. Working for us: Our focus is to ensure we are inclusive every day, building an organisation that …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Weare5vtech
evolving existing products. Development and maintenance of APIs. Key Skills: Strong experience of software development with Java EE or Jakarta EE. Experience with Kafka or Cassandra for event streaming. Experience with implementing cloud infrastructure with AWS or other cloud providers. If you feel that you have the right …
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
experience with Node.js and TypeScript on large-scale microservices architectures. Extensive work on extremely high-traffic, distributed systems. Good knowledge of Kubernetes and either Kafka or RabbitMQ. Deep understanding of AWS and related tools. Strong Computer Science fundamentals. Happy working fully remote in a B2B capacity. We make an …
Information Systems, or another quantitative field. They should also have experience using the following software/tools: Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. …
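Workflow managers like Azkaban, Luigi, and Airflow all schedule tasks according to their dependencies. Stripped of scheduling, retries, and operators, the core idea is a topological run order over a task DAG; a minimal stdlib-only sketch (the task names are made up for illustration):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, as in an
# Airflow-style DAG definition. Task names are purely illustrative.
pipeline: dict[str, set[str]] = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load", "transform"},
}

def run_order(dag: dict[str, set[str]]) -> list[str]:
    """Return an execution order in which every task runs after its
    dependencies (raises graphlib.CycleError if the DAG has a cycle)."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(pipeline)
```

Real workflow tools build on exactly this ordering, then add per-task retries, backfills, and parallel execution of independent tasks.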
Join to apply for the Senior Data Engineer (GCP/Kafka) role at Lloyds Banking Group. This range is provided by Lloyds Banking Group. Your actual pay will be based on your … and build scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you'll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers … understanding of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP "Professional Data Engineer". Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. Working for us: Our focus is to ensure we are inclusive every day, building an organisation that reflects …
Azure Functions with Python, Azure Purview, and Cosmos DB. They are also proficient in Azure Event Hubs and Stream Analytics, Managed Streaming for Apache Kafka, Azure Databricks with Spark, and other open-source technologies like Apache Airflow and dbt, Spark/Python, or Spark/Scala. Preferred Education: Bachelor …
cloud SIEM technology such as Sentinel and Cribl. Strong background in data engineering, log management, or observability platforms. Experience with systems like Sentinel, Elasticsearch, Kafka, or similar. Proficiency in data transformation, enrichment, and routing. Solid scripting and automation skills (e.g., Python, Bash, PowerShell). Familiarity with IT infrastructure, security operations …
including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as it relates to data platforms. Experience in total cost of ownership …
and build scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you'll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers …/scripting experience developed in a commercial/industry setting (Python, Java, Scala, or Go, and SQL). Databases & frameworks: Strong experience working with Kafka technologies. Working experience with operational data stores, data warehouses, big data technologies, and data lakes. Experience working with relational and non-relational databases … of cloud storage, networking, and resource provisioning. It would be great if you had: Certification in GCP "Professional Data Engineer". Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. Working for us: Our focus is to ensure we are inclusive every day, building an organisation that …
within a commercial or industry setting, using languages such as Python, Java, Scala, or Go, along with SQL. Databases & Frameworks: Strong experience working with Kafka technologies. Operational experience with related systems. Using Terraform. Experience with CI/CD pipelines and associated tools/frameworks. Containerisation: Good knowledge of container …
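Several of the listings above ask for hands-on experience with Kafka-style messaging, where the consumer side typically follows a poll–process loop. Sketched here against a stdlib queue rather than a real broker – the actual `confluent-kafka`/`kafka-python` client APIs differ, and offset commits and rebalancing are omitted entirely:

```python
import queue

def consume_batch(source: "queue.Queue[str]", batch_size: int = 10) -> list[str]:
    """Drain up to batch_size buffered messages, mimicking a Kafka-style poll."""
    batch: list[str] = []
    while len(batch) < batch_size:
        try:
            batch.append(source.get_nowait())
        except queue.Empty:
            # No more messages buffered; a real consumer would block or
            # re-poll the broker with a timeout instead of returning.
            break
    return batch

# Simulate a producer filling the topic, then a consumer draining it.
events: "queue.Queue[str]" = queue.Queue()
for payload in ("evt-1", "evt-2", "evt-3"):
    events.put(payload)
processed = consume_batch(events)
```

The batching shape matters in practice: real consumers amortise network round-trips by polling records in batches, then process and commit per batch.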