Resource Management, and Quality Management; Agreement Process Area - Acquisition and Supply. Notes: Experience in production dataflow implementation, verification, and monitoring, with a deep understanding of data processing frameworks, especially Kafka and NiFi (NiFi experience limited to the point-and-click UI is insufficient). Proficiency in Python and Java is desired. Extensive experience in Tier 2 and 3 support
Mondays, Wednesdays, and Thursdays. About the role As a Product Manager at Conduktor, you will be instrumental in solving customer problems for organisations working in the data streaming space (Kafka), driving growth in adoption of our Enterprise platform. You will partner with Product & Engineering leadership to extend our platform capabilities with a focus on data security and observability.
to build the financial ecosystem of the future. We're evolving towards: A Cloud-Native, DevOps-First Culture - Moving towards a fully cloud-hosted, automated platform built with Kubernetes, Kafka, and Infrastructure as Code (IaC). A Real-Time Financial Ecosystem - Shifting from data at rest to data in motion, embracing event-driven architecture to power the real-time … the API - our goal is to transition to a more scalable, cloud-native architecture while maintaining stability. As a Senior Developer, you'll be working with .NET, Azure, and Kafka, ensuring that applications are optimised for cloud deployment and aligned with modern development practices. You'll assess the right .NET versioning strategy, manage dependency transitions (e.g., moving from MSMQ … to Kafka), and ensure smooth migration of libraries and frameworks. Along with modernisation, you'll also be rolling out new features in the existing system or on new projects. What You'll Be Doing: Leading a team of 2-4 developers. Improving the performance of the API platform. Modernising and containerising existing applications for Azure deployment. Evaluating and implementing the right
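As a rough illustration of the queue-to-topic migration the role above describes (moving from MSMQ to Kafka), the key semantic shift is from a point-to-point queue to an append-only log with per-consumer-group offsets. The sketch below uses a hypothetical in-memory stand-in for the broker, so the topic name and payloads are illustrative only; a real system would use a client such as confluent-kafka against an actual cluster.

```python
from collections import defaultdict

class InMemoryBroker:
    """Stand-in for a Kafka cluster: each topic is an append-only log,
    and each consumer group tracks its own read offset per topic."""
    def __init__(self):
        self.topics = defaultdict(list)    # topic -> list of messages
        self.offsets = defaultdict(int)    # (group, topic) -> next offset to read

    def produce(self, topic, message):
        self.topics[topic].append(message)

    def consume(self, group, topic, max_messages=10):
        key = (group, topic)
        start = self.offsets[key]
        batch = self.topics[topic][start:start + max_messages]
        self.offsets[key] += len(batch)    # "commit" the offset after reading
        return batch

# An MSMQ-style queue maps onto a topic; hypothetical "payments" events:
broker = InMemoryBroker()
broker.produce("payments", {"id": 1, "amount": 42.0})
broker.produce("payments", {"id": 2, "amount": 7.5})

# Two independent consumer groups each see the full stream (pub/sub),
# unlike a classic queue where each message is consumed exactly once.
audit = broker.consume("audit-service", "payments")
billing = broker.consume("billing-service", "payments")
```

The design point this illustrates is why such migrations unlock event-driven architecture: adding a new downstream service is just a new consumer group, with no change to producers.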
Cycle (SDLC). A primary focus will be on the design, development, and maintenance of components under the Price Master Central systems, which requires expertise in Java, Spring, SQL, API development, Kafka, Kubernetes, and cloud technologies. You will guide and mentor a team of engineers, providing technical expertise and direction, and ensure adherence to best practices. A strong understanding of system and enterprise … Collections, I/O), Spring Boot, REST, GraphQL web services Sound understanding of SQL/PL-SQL on Oracle DB Extensive working knowledge of container platforms based on Kubernetes, Kafka, Redis Experience with Unix commands and shell scripting Strong understanding of design patterns and architectural principles Familiarity with standard data structures and algorithms Experience using the following tools - JIRA, Harness
Responsibilities: • Develop and maintain Java microservice web applications. • Work closely with tech lead on technical decisions and product roadmap. • Resolve challenging software engineering design problems. • Interact with the customer to resolve issues and propose new features. • Support testing and deployment
will: Partner globally with sponsors, users, and engineering colleagues across divisions to create end-to-end solutions. Learn from experts and mentor junior members. Leverage data-streaming technologies including Kafka CDC, Kafka topics, EMS, and Apache Flink. Innovate and incubate new ideas. Work on a broad range of problems involving large data sets, real-time processing, messaging, and workflow.
Data Analytics Consultant is an emerging leader with previous Data Analytics experience. The ideal candidate will have experience in design, development, and operations using services like Amazon Kinesis, Apache Kafka, Apache Spark, Amazon SageMaker, and Amazon EMR. The Data Analytics Consultant is comfortable rolling up their sleeves to design and code modules for infrastructure, application, and processes. About the team
stack in designing and building data & AI solutions Experience with data modeling, ETL processes, and data warehousing Knowledge of big data tools and frameworks such as Spark, Hadoop, or Kafka
Ags, Softs, LNG etc) Any experience or exposure to Endur ETRM (any version) would be useful, but this can be learnt Previous data analytics experience, with any experience of Kafka (a data-messaging system) being useful Strong SQL skills Strong academics, including a bachelor's degree (or above) from a leading university You will need a British or Irish passport
City of London, London, United Kingdom Hybrid / WFH Options
Bramwith Consulting
in a Technical Product Management role Demonstrable experience in collaborating with UX and Engineering teams producing enterprise-grade Web User Interfaces. Knowledge of React and event stream processing via Kafka/Flink. Strong technical understanding of System Architectures, Programming Interfaces and industry connectivity protocols such as FIX, HTTP, MQ, SFTP and SWIFT. Hands-on experience of product usage analytical
.iso and Red Hat package managers (RPM) DESIRED QUALIFICATIONS: (U) Three (3) years' experience configuring, troubleshooting, monitoring, and administering a messaging service (such as Java Messaging Service, TIBCO, ActiveMQ, Artemis, Kafka) (U) Other (U) CWIP: IAT Level I Certification, IA Baseline Certification in a Linux, Unix, Windows, or Cisco OS operating system All SA positions require on-call rotation support AMF
help maintain and raise our standards. You'll be working in a cross-functional product squad, with Product, Design and Data Science, on a modern tech stack including Java, Kafka, and Postgres, as well as integrations with our external providers. Our cross-functional teams collaborate with stakeholders across the business to help deliver value to our customers. Salary We are
Washington, Washington DC, United States Hybrid / WFH Options
Corelight, Inc
products operate correctly Perform validation testing of Corelight products Provide ongoing, informal knowledge transfer Collaborate with product management on product features/integrations Work with back-end tools like Kafka and Logstash Document the process for importing data (MISP, Intel, etc.) Develop custom content for threat-hunting use cases as defined by the customer Develop playbooks for SOC
least 5+ years of experience building highly reliable and scalable backend services in Go Experience with RDBMSs such as Postgres, MySQL, etc. Experience with message-broker technologies such as Kafka, RabbitMQ, etc., working within event-driven architectures You have excellent knowledge of best practices in designing, developing, and deploying those services in a cloud environment You have experience
infrastructure that they all run on. What you'll do Build Develop production-grade backend services in Scala, powering payments, core banking, and rewards. Develop event-driven systems with Kafka and integrate multiple internal/external services. Ship features end-to-end, from concept through to live deployment, with measurable customer impact. Design Contribute to the architecture of our … Scala Proficiency: Deep understanding of Scala, including its functional programming features and ecosystem. Expertise in RESTful APIs: A solid background in designing and implementing RESTful APIs. Familiarity with Kafka or Event-Driven Systems: Experience with Kafka or similar event-driven architectures, demonstrating an understanding of their dynamics and integrations. Microservices and Distributed Systems Experience: Expertise
test, configuration and deployment of a market platform hosted on the cloud (GCP) and based on a Kubernetes cluster. Your main role will be to: configure and provision middleware such as Kafka, Elasticsearch, and PostgreSQL in a Kubernetes context; develop and maintain DevOps automation; and bring DevOps technical expertise organisation-wide. Responsibilities Manage production environments, including configuration, troubleshooting, performance tuning, and scaling of the … components such as Kafka, Elasticsearch, and PostgreSQL; manage, back up, and restore data at scale; contribute to automated environment provisioning through Terraform in GCP; manage software development infrastructure (GitLab, SCA, artifact/Docker registries); maintain and improve CI/CD pipelines in GitLab (from build, through automated tests, up to environment deployment); manage environment configuration and service deployment based on a Kubernetes cluster (Helm … degree in computer science with significant professional experience in SaaS, ideally in the energy or finance area. Minimum 4+ years of professional experience as a DevOps/SRE. Production experience with Kafka and PostgreSQL. Significant DevOps experience with cloud-hosted solutions with strong requirements in terms of availability, scalability, and security. Used to working in an Agile/Scrum framework. Good communication and presentation skills
data from diverse sources, transform it into usable formats, and load it into data warehouses, data lakes or lakehouses. Big Data Technologies: Utilize big data technologies such as Spark, Kafka, and Flink for distributed data processing and analytics. Cloud Platforms: Deploy and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), leveraging cloud … for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures, such as the lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies … and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory)
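The real-time streaming stacks listed above (Spark Streaming, Kafka, Flink) share one core idea: grouping an unbounded event stream into time windows and aggregating each window. A minimal pure-Python sketch of tumbling-window aggregation follows; the click-stream events and 10-second window size are hypothetical, and a production job would express the same logic through a framework's windowing API rather than by hand.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed-size, non-overlapping
    (tumbling) windows and count occurrences of each key per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event falls into exactly one window, identified by its start time.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Simulated click-stream: (unix_timestamp, page) pairs.
events = [(100, "home"), (101, "home"), (105, "cart"), (112, "home")]
result = tumbling_window_counts(events, window_seconds=10)
# Window [100, 110) holds three events; window [110, 120) holds one.
```

Frameworks like Flink add what this sketch omits: event-time vs processing-time semantics, watermarks for late data, and fault-tolerant state, which is why they are preferred over hand-rolled aggregation at scale.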
and client-facing engineering teams. As part of the Product Engineering team, you'll help shape the infrastructure backbone of our proprietary platform, working with technologies like Terraform, Kubernetes, Kafka, and GitHub Actions, all within a high-security, regulated context. Key Responsibilities Design, build, and operate infrastructure on Azure, leveraging Terraform and Terragrunt as part of our platform automation … internal and client-facing infrastructure documentation and support materials Cloud Platforms: Azure (primary), with some AWS and GCP exposure beneficial CI/CD: GitHub Actions, GitLab CI Messaging/Eventing: Kafka Architectures: Microservices, event-driven, distributed systems What We're Looking For Minimum 5-8 years of proven experience in a DevOps engineering capacity, with strong exposure to Azure-native … VNet, Key Vault, AKS) Deep hands-on knowledge of Terraform and Terragrunt for cloud infrastructure automation Practical expertise in deploying, maintaining, and troubleshooting Kubernetes clusters in production Familiarity with Kafka and event-driven or streaming-based systems Experience building and managing CI/CD pipelines using GitHub Actions and/or GitLab Strong problem-solving, diagnostic, and debugging skills
for a modern customer experience platform. Key Responsibilities: Develop and enhance scalable backend features using Golang (preferred) or another typed backend language Build microservices using REST and asynchronous messaging (Kafka, RabbitMQ, etc.) Collaborate within Agile/Scrum teams to design and deliver high-quality, testable code Work on CI/CD pipelines with tools like Jenkins, Git, and automated … architecture, and NoSQL/Postgres Familiarity with CI/CD tools such as Jenkins and Git Clear communicator with strong teamwork and problem-solving skills Desirable Skills: Experience with Kafka, GraphQL, gRPC, Docker Test-driven development (TDD/BDD) Agile methodologies (Scrum, Kanban, SAFe) Exposure to cloud-native patterns and services If you're passionate about building high-performing
annum + Pension, Health, Gym Posted: 27/08/2024 Description Senior Java Developer, Java Software Engineer, Java 17, Spring Boot, Maven, Git, GitHub Actions, SQL, Microservice architecture, Kafka Streams, Kubernetes, React, Typescript - Technical lead on project - Fully remote if needed - £70K About the Role As the Senior Java Software Engineer, you'll be a key player in … Skills and Experience A solid background in software development Experience in senior roles within teams or projects Expertise in Java, Spring Boot, Maven, Git, GitHub Actions, SQL, Microservice architecture, Kafka Streams, Kubernetes Knowledge of software design principles Primarily a backend role, but front-end experience in React and/or Typescript is a plus Experience developing robust, performant APIs
and extend the full stack of the TRSS solutions suite to deliver global data using advanced search, big data ingestion, and analytical processing with tools like MongoDB, Elastic, and Kafka, as well as building leading-edge web applications. You will work closely with the other members of the TRSS analytical staff and development to capture customer requirements and design … government contract/agency or department of Federal Government requirements The following skills and tools are preferred, but not required: AWS experience strongly desired Neo4j or other graph technologies; Kafka; Grails, Groovy; Elasticsearch