source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker more »
monitoring solutions ● Experience supporting Kubernetes-based distributed applications, or an understanding of Kubernetes fundamentals ● Experience with pub-sub, messaging and streaming solutions like Pulsar, Kafka ● Experience using APIs and understanding app development lifecycle with a language or framework based on Java, Python or Go would be preferred ● Experience/ more »
EKS, CloudWatch, CloudFront. Works well in a team and with minimal supervision. CI/CD pipelines setup and config. Desirable: The ELK stack, Apache Kafka, Spring Security and OAuth2 flows, React, React Native, Keycloak, Camunda, OpenAPI/Swagger, Nginx setup and config. Who you are: You have a ruthless more »
data strategies and engineering practices. They specialise in providing cutting-edge data solutions to various clients across the globe. Their tech stack includes Kotlin, React, Kafka, Kubernetes, and AWS, and they are on a mission to make data work smarter for businesses across various industries. 110,000 depending on experience. Requirements: Experience more »
architecture and engineering, ideally on Microsoft Azure (Kubernetes Service, Container Apps, App Service, Functions, Event Grid and Service Bus) Experience with Messaging platforms, ideally Kafka Experience with Containers, ideally Kubernetes Experience with React and TypeScript/JavaScript Experience with CSS3 and HTML5 Architectural design patterns and how these are more »
the business requires to operate. Skills & Experience Experience in coding Python with Django framework Experience with microservices and using related tools – e.g. Docker, AWS, Kafka, Lambda. RESTful API development and management, Transactional management. Databases – e.g. PostgreSQL, MongoDB and SQL. Experience with data science libraries is highly desirable. Performance Tuning. more »
Belfast Metropolitan Area, United Kingdom Hybrid / WFH Options
Enso Recruitment
a broad array of technologies so experience with any of the following is a bonus! Java Testing Frameworks (JUnit, Spock, Cucumber, TestNG), Gradle, Mockito, Kafka, Angular, Docker, CI/CD, PostgreSQL, Splunk, Sonarqube, Selenium. Salary Package: Base Salary up to £85k, Performance Bonus, Healthcare, Pension: 4-8%, Holidays more »
hedge fund industry. Technical Skills: Proficiency in Python and SQL. Experience with relational and NoSQL databases. Knowledge of big data frameworks (e.g., Hadoop, Spark, Kafka). Understanding of financial markets and trading systems. Strong analytical, problem-solving, and communication skills. Familiarity with DevOps tools and practices. This is an more »
Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
the definition of Big Data architecture with different tools and environments: Cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience in Data Engineering and data quality tools (Informatica, Talend, etc.) Previous involvement in working in a multilanguage and multicultural environment more »
management of the platforms, working closely with our stakeholders and other teams. We would like to speak to people who have; Java OR Python Kafka K8S, Docker Lead the end-to-end technical delivery of large complex projects with multiple internal and external stakeholders, identifying dependencies, risks, issues, costs more »
Proficiency in cloud platforms (AWS, GCP, Azure) and containerization (Docker) is a game-changer. Familiarity with caching technologies (Redis) and messaging/streaming tech (Kafka, RabbitMQ) will set you apart. Why You Belong Here Join a fast-growing FinTech company with a vibrant start-up culture and global reach. more »
greenfield payments platform. The candidate: 4+ years' experience as a Java Developer. Strong experience in Spring, Spring Boot. Experience in AWS. Strong experience with Kafka and microservice-based design. Experience working with a modern tech stack. Expert in domain-driven design and test automation. CI/CD Pipeline experience more »
Knowledge and understanding of OTC products (Interest Rate Swaps, Variance Swaps, CDS, etc.) bookings. Familiarity with C++ and Big Data tools such as Spark, Kafka, Elastic. Join us and be part of a team that values innovation, collaboration, and excellence. Take your career to new heights with a leading more »
and manage multiple projects simultaneously. Experience working with the AWS + Services (Sagemaker or equivalent) Desirables: A financial technology background Experience with SQS or Kafka Prior work with CI/CD tools Prior work with a transformation tool The interview process will consist of 3 stages for this position more »
/Knowledge/Experience: Good experience of defining RFPs and RFIs. Hands-on experience in Java/Spring Boot or Node.js, Express framework, Git, Kafka and CI/CD Pipelines. Strong experience in AWS Cloud. AWS PaaS such as: KMS, Secrets Manager, AWS DocumentDB, RDS Postgres, SQS. Automated testing more »
frameworks (Flask, Django, FastAPI etc). Knowledge of relational database technologies e.g. Oracle, PostgreSQL. Experience developing applications using React or experience with event-driven services (e.g. Kafka). Experience with development utilising SDLC tools - Git, JIRA, Artifactory, Jenkins/TeamCity, OpenShift/Kubernetes. Analytical thinker, team player and possess strong communication skills with more »
creating data pipelines on a cloud (preferably AWS) environment CI/CD experience Containerization experience (Docker, Kubernetes, etc.) Experience with SQS/SNS, Apache Kafka, RabbitMQ Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB You *must* be eligible to work in your chosen country of residence more »
in Kubernetes, Docker, Jenkins, and Ansible. Proficiency with monitoring tools like Prometheus and Grafana. Experience with Terraform and CloudFormation scripting. Knowledge of Apache Kafka for stream processing. Familiarity with various databases including MySQL, PostgreSQL, Redis, and Solr. Proficient with git and git workflows for version control. Experience in more »
well as experience creating, deploying, and managing containers in message brokering and real-time data streaming. Hands-on experience with technologies such as Apache Kafka and Azure Service Bus, capable of building and integrating scalable streaming applications. On offer: Remote working setup, Annual leave, Working week, Wellness incentive, Equipment more »
implementation and performance tuning of Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro). Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code more »
with at least one or more of these programming languages: Python, Scala/Java Experience with distributed data and computing tools, mainly Apache Spark & Kafka Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with ambiguity and more »
per day. This will be working out of their London-based office, 2-3 days per week onsite. Technical skillset: Golang, Kafka, Kubernetes, SQL, CI/CD experience with relevant tooling such as Jenkins, Docker or Terraform. Cloud services experience, ideally with AWS. Bonus: Payments experience, Python experience. For more »