AWS services like S3, EMR, and technologies like Terraform and Docker. Know the ins and outs of current big data frameworks like Spark or Flink, but this is not an absolute requirement - you’re a quick learner! This role is open to individuals based in or willing to relocate …
containerisation and orchestration (Docker and Kubernetes). Infrastructure as Code (Terraform or similar). Experience with at least one distributed data-processing framework (Spark, Flink, Kafka, etc.). Familiarity with different storage solutions (e.g., OLTP, OLAP, NoSQL, object storage) and their trade-offs. Product mindset and ability to link …
infrastructure and application level. You have knowledge of cloud-based ML solutions from GCP or AWS. Experience with streaming data processing frameworks such as Flink, Beam, Spark, Kafka Streams. Experience with Ansible, Terraform, GitHub Actions, Infrastructure as Code, AWS or other cloud ecosystems. Knowledge/interest in payment platforms …
OAuth, JWT, and data encryption. Fluent in English with strong communication and collaboration skills. Preferred Qualifications: Experience with big data processing frameworks like Apache Flink or Spark. Familiarity with machine learning models and AI-driven analytics. Understanding of front-end and mobile app interactions with backend services. Expertise in …
RabbitMQ, Pulsar, etc.). Experience in setting up data platforms and standards, not just pipelines. Experience with distributed data processing frameworks (e.g., Spark or Flink). About the Team: J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown Asset Management Limited
volume data/low latency data pipeline with the following skills. Data Engineering Skills: Modelling; Orchestration using Apache Airflow; Cloud-native streaming pipelines using Flink, Beam, etc.; DBT; Snowflake. Infrastructure Skills: Terraform. DevOps Skills: Experienced in developing CI/CD pipelines. Integration Skills: REST and Graph APIs. (Desirable) Serverless …
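Orchestration tools such as Apache Airflow resolve a DAG of task dependencies into a valid execution order before running anything. As a minimal sketch of that idea only (not Airflow's actual API, and with hypothetical task names), Python's standard-library graphlib shows the dependency-resolution step:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring what an Airflow DAG declares with >> operators.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_snowflake": {"transform"},
    "dbt_models": {"load_snowflake"},
}

# static_order() yields tasks in a valid execution order: every task
# appears only after all of its upstream dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load_snowflake', 'dbt_models']
```

A real scheduler adds retries, scheduling intervals, and parallel execution of independent branches on top of this ordering.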
end solutions, learn from experts, leverage various technologies depending on the team including: Java, JavaScript, TypeScript, React, APIs, MongoDB, Elastic Search, Kafka, Kubernetes, Apache Flink, Kafka CDC; be able to innovate and incubate new ideas, have an opportunity to work on a broad range of problems, often dealing with …
/CD pipeline to deliver high-velocity, production-ready systems. This Role Requires: Strong Java skills; Experience with live-streaming technologies such as Kafka, Flink or Kinesis; Solid understanding of event-driven architecture; Experience with CI/CD practices and tools; Familiarity with microservices and modern cloud-based platforms …
with large datasets (structured and unstructured). Used a range of open-source frameworks and development tools, e.g. NumPy/SciPy/Pandas, Spark, Kafka, Flink. Working knowledge of one or more relevant database technologies, e.g. Oracle, Postgres, MongoDB, ArcticDB. Proficient on Linux. Competitive salary + generous bonuses. Extra perks …
entire spectrum of AWS Services - Storage (Redshift Data Shares, S3 data lakes), Orchestration (Step Functions, Glue, and Internal Java Based Orchestration Tools), Processing (Spark & Flink - KDA), Streaming services (AWS Kinesis) and real-time large scale event aggregation stores. Build and scale our ingestion pipeline for scale, speed, reliability, and …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Pyramid Consulting, Inc
maintain smooth operations. Integration & Data Processing: Integrate Kafka with key data processing tools and platforms, including Kafka Streams, Kafka Connect, Apache Spark Streaming, Apache Flink, Apache Beam, and Schema Registry. This integration will facilitate data stream processing, event-driven architectures, and continuous data pipelines. Cross-functional Collaboration: Work …
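The stream-processing frameworks named above are built around keyed, stateful transformations over an unbounded event stream. As an illustrative sketch only (pure Python, not the Kafka Streams or Flink API, with hypothetical event data), a running count per key looks like:

```python
from collections import defaultdict

def keyed_count(events):
    """Consume (key, value) events and maintain a running count per key,
    analogous to a groupByKey().count() step in a streaming pipeline."""
    state = defaultdict(int)  # the operator's keyed state store
    for key, _value in events:
        state[key] += 1
        yield key, state[key]  # emit the updated aggregate downstream

stream = [("user1", "click"), ("user2", "click"), ("user1", "view")]
print(list(keyed_count(stream)))
# [('user1', 1), ('user2', 1), ('user1', 2)]
```

Real frameworks add what this sketch omits: durable state backends, repartitioning by key across workers, and exactly-once delivery semantics.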
with Docker, Kubernetes, or other container orchestration tools. Familiarity with observability tools (e.g., New Relic) for tracking usage and service health. Experience with Kafka, Flink, or IoT streaming technologies. Background in financial services or other regulated industries. What can you expect? Competitive salary: £120,000 per annum - reflective of …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Yelp USA
services that power features like Review Insights, Business Summaries, and Conversational Search. Build real-time and batch data processing pipelines using technologies like Kafka, Flink, and Spark. Ensure high availability, performance, and reliability of backend systems in a production environment. Contribute to the team's engineering best practices and …
Have the potential to build and lead a distributed team. Experience with Machine Learning/Data Science. Know our tech stack: Python, Scala, Spark, Flink, Kafka, Kubernetes, Databricks and AWS services. Previous experience with git, CI and CD tools. Passionate about internal quality, good code and effective technical practice …
yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and systems to ensure team alignment and knowledge sharing. … Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills: Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge …
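The asynchronous-programming skill mentioned above (asyncio) amounts to running I/O-bound work concurrently on a single event loop. A minimal standard-library sketch, with `fetch` as a hypothetical stand-in for a real network call:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Stand-in for an I/O-bound call (HTTP request, DB query, ...);
    # await yields control to the event loop while "waiting".
    await asyncio.sleep(delay)
    return f"{name}: done"

async def main() -> list:
    # gather() runs both coroutines concurrently, so total wall time
    # is roughly the longest delay, not the sum of the delays.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.02))

print(asyncio.run(main()))  # ['a: done', 'b: done']
```

Celery covers a different niche: it distributes tasks to separate worker processes via a message broker, rather than interleaving them on one event loop.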
Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features including multithreading and Executor Services to ensure optimal performance. Apply functional programming paradigms in Java to write clean, efficient …
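The Executor Service pattern referenced above (submit tasks to a thread pool, collect results via futures) has a close Python analogue in `concurrent.futures`. A hedged sketch of the pattern only, with `score` as a hypothetical task:

```python
from concurrent.futures import ThreadPoolExecutor

def score(order_id: int) -> int:
    # Hypothetical unit of work; in the Java version this would be a
    # Callable submitted to an ExecutorService-managed thread pool.
    return order_id * 2

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() returns results in submission order even though the tasks
    # may run concurrently, like collecting Future.get() in order.
    results = list(pool.map(score, [1, 2, 3]))

print(results)  # [2, 4, 6]
```

The same idea underlies Java's `ExecutorService.submit`/`invokeAll`: the pool decouples task submission from thread management.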
London, South East England, United Kingdom Hybrid / WFH Options
Solytics Partners
Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features including multithreading and Executor Services to ensure optimal performance. Apply functional programming paradigms in Java to write clean, efficient …
Grow with us. We are looking for a Machine Learning Engineer to work along the end-to-end ML lifecycle, alongside our existing Product & Engineering team. About Trudenty: The Trudenty Trust Network provides personalised consumer fraud risk intelligence for fraud …
have some of the following and are eager to learn more, we want to hear from you: Experience with Kafka Streams and/or Flink; Experience building production code with Java and Spring; Experience with infrastructure automation tools; Experience building and operating distributed systems at scale. Interested? Find out more …
and Streaming services such as Kinesis Data Streams, Simple Queue Service (SQS), Simple Notification Service (SNS), Amazon MQ, and Amazon Managed Service for Apache Flink (MSF). We are looking for a technical expert who brings a mix of operations and networking expertise and shares our passion to change …