Team Player: Ability to work effectively in a collaborative team environment, as well as independently. Preferred Qualifications: Experience with big data technologies (e.g., Hadoop, Spark, Kafka). Experience with AWS and its data services (e.g., S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). more »
developing RESTful and GraphQL APIs, integrating third-party services, and handling API versioning. • Expertise in event-driven architecture using message brokers such as Apache Kafka or AWS SNS/SQS. Experience with container orchestration platforms like Kubernetes and Docker Swarm. Proficiency in at least one programming language commonly used more »
been around emitting metrics from the Calculation Engine to put more data in the hands of our clients. In processing these metrics, we use Kafka Streams for aggregation and Kafka connectors to persist the data. As a Senior Developer, you will be responsible for leading the design and … using promises/futures (e.g., CompletableFuture). Extensive experience with multi-threaded applications. Deep understanding of event-driven and streaming microservices. Extensive experience using Kafka, leveraging Kafka Connect and Kafka Streams. Experience with container technologies such as Docker, Podman, and Kubernetes, as well as package managers like more »
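The posting above mentions asynchronous programming with promises/futures (e.g., CompletableFuture). A minimal sketch of that style in plain JDK Java is below; the metric names and the transformation steps are hypothetical, purely for illustration, and not taken from the posting's actual system.

```java
import java.util.concurrent.CompletableFuture;

// Illustrative sketch: composing asynchronous stages with CompletableFuture.
// The metric names and arithmetic here are made up for demonstration only.
public class AsyncMetrics {
    // Simulates fetching a raw metric value asynchronously
    // (here just derived from the name's length).
    static CompletableFuture<Integer> fetchMetric(String name) {
        return CompletableFuture.supplyAsync(() -> name.length() * 10);
    }

    public static void main(String[] args) {
        int result = fetchMetric("latency")
                .thenApply(v -> v * 2)               // transform one async result
                .thenCombine(fetchMetric("errors"),  // combine with a second one
                             Integer::sum)
                .join();                             // block only at the very edge
        System.out.println(result);
    }
}
```

The point of this style, which the role's "multi-threaded applications" and "event-driven" requirements allude to, is that each stage is declared without blocking a thread; only the final `join()` waits.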
techniques, and platforms, with a focus on proving it works and does the right thing. Experience working with Spring, Hibernate, REST APIs, JSON, Microservices, Kafka, and Postgres. Good understanding of architecture and cloud-native principles. Good grasp of algorithms and data structures. Familiar with test-driven development approach. Experience more »
the latest advances in GenAI (bonus points for hands-on experience) Solid experience with streaming and batch ETL solutions using data processing tools like Kafka, Pub/Sub, Vertex AI, BigQuery, Dataform, Postgres, or comparable technologies Hands-on experience developing cloud-based solutions across GCP, AWS, or Azure, with a command more »
In-depth knowledge in using tools such as Terraform, Helm, kubectl, HashiCorp Vault. Deep understanding of event-driven and streaming microservices. Extensive experience using Kafka and cloud-native messaging systems (AWS SQS/SNS, Google Pub/Sub, or equivalent). Familiar with asynchronous programming using promises/ more »
/GraphQL). Expertise in working with databases (e.g., MySQL, PostgreSQL, MongoDB, Redis). Familiarity with message queues and event-driven architecture (e.g., RabbitMQ, Kafka). Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud Platform (GCP). Proficient in containerization and orchestration (e.g., Docker, Kubernetes more »
efficient data pipelines for both batch and real-time processing. * Hands-on experience with data transformation and data processing frameworks (e.g., Apache Spark, Apache Kafka). * Solid understanding of relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., DynamoDB, MongoDB). * Familiarity with data security, governance, and compliance standards more »
Central London, London, United Kingdom Hybrid / WFH Options
83zero Limited
/Neo4j/Elastic, Google Cloud Datastore. BigQuery and Data Studio/Looker. Snowflake Data Warehouse/Platform Streaming technologies and processing engines, Kinesis, Kafka, Pub/Sub and Spark Streaming. Experience of working with CI/CD technologies, Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc. Experience and knowledge more »
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
areas, please get in touch Spring Boot microservices React and Node.js frontend web application development Database technologies e.g. PostgreSQL Messaging technologies e.g. ActiveMQ or Kafka Continuous Integration and Continuous Deployment Cloud platforms, e.g. AWS, and Kubernetes platforms e.g. OpenShift Benefits Motability Operations is a unique organisation, virtually one of more »
Employment Type: Permanent, Part Time, Work From Home
Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands-on coding experience, such more »
4+ years of visual/UI testing and Rest Assured/Open API testing. * Experience with event-driven services, message queues, and event brokers (Kafka). * Proficient in performance testing tools (JMeter, K6, Neoload, LoadRunner). * Skilled in writing SQL/NoSQL queries for data verification. * In-depth knowledge more »
where you can dive into the latest and greatest technologies, including: Java, Python, TypeScript, JavaScript AWS, Azure Spring Boot, React, Angular Kubernetes, Docker, Microservices, Kafka Why Choose Us? Skill Development & Career Growth: Learn new skills and advance your career with our comprehensive training programs and mentorship opportunities. Flexibility & Work more »
such as Spring Boot, Docker, Kubernetes, and AWS. Manage CI/CD pipelines with Terraform, Jenkins, Ansible, and BitBucket. Collaborate on queuing solutions using Kafka and SQS, and set up advanced monitoring using the ELK stack. About You To succeed as a Java Engineer , you’ll need: A minimum more »
of Python programming is advantageous. Familiarity with modern hardware architectures and cloud infrastructure platforms. Additional Desirable Skills or Experience: Experience with stream processing technologies (Kafka, Flink). Proficiency in other programming languages such as Java, Scala, or F#. Knowledge of Kubernetes/OpenShift/Docker and helm templates. Experience more »
ECS, EKS, and EventBridge. Programming Languages : .NET 7, JavaScript, and Python. Infrastructure as Code : Terraform. Web Experience : React. CI/CD : GitHub Actions. Messaging : Kafka and SNS/SQS. What You'll Bring: Extensive experience as a hands-on leader in an agile environment. A strong background in software engineering more »
the ELK stack. Set up and manage the CI/CD pipeline using BitBucket, Maven, Terraform, Jenkins, Ansible/Packer, and Kustomize. Work with Kafka, SQS for queuing solutions and implement scheduling using Jenkins/Ansible. Use a combination of Cucumber, JUnit, Selenium, and Postman for comprehensive testing. Qualifications more »
CD pipelines (Azure DevOps/GitHub Actions) • Tackle complex Java, Python, and Typescript codebases with problem-solving What You Bring: • Proven experience with gRPC, Kafka, Kubernetes (Istio preferred), Redis, MongoDB, and Avro/JSON/protocol buffers • Strong understanding of authentication/authorization (mTLS, OIDC, RBAC/ABAC) • Hands more »
direction to a growing team of developers globally. The platform is a Greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, Apache Ignite. The platform runs in a hybrid mode both on-premise and in AWS utilizing technologies such as EKS, S3 more »
logging solutions using the ELK stack. Manage the CI/CD pipeline with BitBucket, Maven, Terraform, Jenkins, Ansible/Packer, and Kustomize. Work with Kafka and SQS for queuing solutions and scheduling with Jenkins/Ansible. Use Cucumber, JUnit, Selenium, and Postman to ensure comprehensive testing across all aspects more »
Cloud technologies like AWS could be advantageous Experience with Database technologies like PostgreSQL could be advantageous Experience with Messaging technologies like ActiveMQ and Apache Kafka could be advantageous Key Roles & Responsibilities Work closely with the Development Manager, Project Manager, and a team of developers to deliver Java components to more »
Edinburgh, City of Edinburgh, United Kingdom Hybrid / WFH Options
Cathcart Technology
skills; ideally with Java ** Software Architecture ** Cloud Services (AWS, Azure or GCP) ** Proven experience in a similar role The following is highly desirable; ** Apache Kafka ** Front-end experience (ReactJS with TypeScript) ** DevOps tooling (Docker, Kubernetes, Terraform) They've got custom-built offices in central Edinburgh, which includes pool tables more »
AI models using tools like MLflow, Kubeflow, TensorFlow Serving, and Seldon. Big Data Engineering: Develop high-performance data pipelines with tools like Apache Spark, Kafka, and Hadoop. Generative AI: Leverage GPT, DALL-E, and GANs to enhance user experiences and innovate content generation. Transformers and Architectures: Utilize advanced transformer more »