to some of the following technologies (or equivalent): Apache Spark, AWS Redshift, AWS S3, Cassandra (and other NoSQL systems), AWS Athena, Apache Kafka, Apache Flink, AWS, and service-oriented architecture. What you'll get: Full responsibility for projects from day one, a collaborative team, and a dynamic work environment. …
Columbia, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
the Risk Management Framework (RMF) process Proficiency in system design and meticulous documentation Experience in streaming and/or batch analytics (e.g. Kafka, Spark, Flink, Storm, MapReduce, Hadoop) Experience in distributed databases, NoSQL databases, full-text search engines (e.g. Elasticsearch, MongoDB, Solr) Experience in designing enterprise APIs Experience in …
testing and validation frameworks for ML models. Knowledge of orchestration tools (Airflow, Prefect) for data workflows. Understanding of real-time data streaming technologies (Kafka, Flink, etc.). Education/Experience: Bachelor's degree in a related field preferred. Position Type: Full-time. Clearance Type: Requires a current …
OAuth, JWT, and data encryption. Fluent in English with strong communication and collaboration skills. Preferred Qualifications Experience with big data processing frameworks like Apache Flink or Spark. Familiarity with machine learning models and AI-driven analytics. Understanding of front-end and mobile app interactions with backend services. Expertise in …
RabbitMQ, Pulsar, etc.). Experience in setting up data platforms and standards, not just pipelines. Experience with distributed data processing frameworks (e.g., Spark or Flink). About the Team J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent …
manage infrastructure using Infrastructure as Code (IaC) Strong foundation in data structures, algorithms, and software development practices Proficient in data processing frameworks such as Flink or Spark Ideally you also have: Experience working in Agile environments (Scrum or Kanban) Familiarity with CI/CD pipelines and Git, DevOps …
Columbia, Maryland, United States Hybrid / WFH Options
Enlighten, an HII - Mission Technologies Company
related field; or High School Diploma or equivalent and 9 years relevant experience. Experience in streaming and/or batch analytics (e.g. Kafka, Spark, Flink, Storm, MapReduce, Hadoop) Experience in distributed databases, NoSQL databases, full-text search engines (e.g. Elasticsearch, MongoDB, Solr) Experience in designing enterprise APIs Experience in …
/CD pipeline to deliver high-velocity, production-ready systems This Role Requires: Strong Java skills Experience with event-streaming technologies such as Kafka, Flink or Kinesis Solid understanding of event-driven architecture Experience with CI/CD practices and tools Familiarity with microservices and modern cloud-based platforms …
with multiple of the following: Analytical development: Machine Learning, Stream, Batch; Virtualization environments: Kubernetes, containers (Docker, containerd), Cloud (AWS/HCI); Streaming frameworks: Kafka, Flink; Databases: MongoDB, ArangoDB, Redis, PostgreSQL. Desired skills: ElasticSearch. Additional information: Work performed in contractor facility in Annapolis Junction; both low- and high-side access; potential …
maintain smooth operations. Integration & Data Processing: Integrate Kafka with key data processing tools and platforms, including Kafka Streams, Kafka Connect, Apache Spark Streaming, Apache Flink, Apache Beam, and Schema Registry. This integration will facilitate data stream processing, event-driven architectures, and continuous data pipelines. Cross-functional Collaboration: Work …
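The listing above describes event-driven pipelines built around Kafka. As a dependency-free, illustrative sketch of the publish/subscribe pattern such tooling implements (the `ToyBroker` class and topic names are hypothetical stand-ins, not tied to any listing or to the Kafka API):

```python
from collections import defaultdict

class ToyBroker:
    """Minimal in-memory stand-in for a Kafka-like topic/subscriber model."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Deliver the event to every subscriber registered on the topic.
        for callback in self._subscribers[topic]:
            callback(event)

# Usage: a toy counter service consuming "orders" events.
counts = defaultdict(int)

def on_order(event):
    counts[event["sku"]] += 1

broker = ToyBroker()
broker.subscribe("orders", on_order)
broker.publish("orders", {"sku": "A1"})
broker.publish("orders", {"sku": "A1"})
print(counts["A1"])  # → 2
```

A real deployment would replace `ToyBroker` with Kafka producers/consumers (plus Schema Registry for serialization contracts); the decoupling of publishers from subscribers is the property the snippet demonstrates.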
relational databases like PostgreSQL, MySQL, MSSQL and analytical DBs. Working knowledge of ClickHouse or similar data warehouse technologies. Familiarity with tools such as Flink, Redis, RabbitMQ, Superset, Cube.js, MinIO, Grafana (nice to have). Proven track record of leading engineering teams, providing mentorship and technical direction. Experience optimizing solutions …
and microservices. Experience with message queue technologies (e.g., Kafka), with a strong understanding of event-driven architectures. Knowledge of stream processing engines such as Apache Flink is a plus. Strong focus on security principles, including IAM policies, data encryption, and compliance. Experience with CI/CD pipelines and infrastructure-as-code …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Yelp USA
services that power features like Review Insights, Business Summaries, and Conversational Search. Build real-time and batch data processing pipelines using technologies like Kafka, Flink, and Spark. Ensure high availability, performance, and reliability of backend systems in a production environment. Contribute to the team's engineering best practices and …
entire spectrum of AWS services - Storage (Redshift Data Shares, S3 data lakes), Orchestration (Step Functions, Glue, and Internal Java Based Orchestration Tools), Processing (Spark & Flink - KDA), Streaming services (AWS Kinesis) and real-time large scale event aggregation stores. Build and scale our ingestion pipeline for scale, speed, reliability, and …
yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and systems to ensure team alignment and knowledge sharing. … Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills: Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge …
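The listing above names asyncio among its desired skills. A minimal, self-contained sketch of the pattern it refers to, concurrent I/O-bound tasks via `asyncio.gather` (the `fetch` coroutine and its delays are illustrative placeholders for real network calls):

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Stand-in for an I/O-bound call (e.g. an HTTP request).
    await asyncio.sleep(delay)
    return f"{name}:done"

async def main() -> list:
    # gather() runs both coroutines concurrently, so total wall time
    # is roughly max(delay), not the sum of the delays.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.02))

results = asyncio.run(main())
print(results)  # → ['a:done', 'b:done']
```

`gather` returns results in submission order regardless of completion order, which is why the output is deterministic here.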
Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features including multithreading and ExecutorService to ensure optimal performance. Apply functional programming paradigms in Java to write clean, efficient …
London, South East England, United Kingdom Hybrid / WFH Options
Solytics Partners
Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features including multithreading and ExecutorService to ensure optimal performance. Apply functional programming paradigms in Java to write clean, efficient …
end solutions, learn from experts, leverage various technologies depending on the team, including: Java, JavaScript, TypeScript, React, APIs, MongoDB, Elasticsearch, Kafka, Kubernetes, Apache Flink, Kafka CDC; be able to innovate and incubate new ideas; have an opportunity to work on a broad range of problems, often dealing with …
AND EXPERIENCE The successful Senior Data Platform Engineer will have the following skills and experience: Java Spring Boot Cloud (ideally AWS) Kafka (Kinesis or Flink will be considered) BENEFITS The successful Senior Data Platform Engineer will receive the following benefits: Salary up to £85,000 – depending on experience Semi…
start to finish. Key job responsibilities Build and operate our foundational data infrastructure using AWS services such as Redshift, S3, Step Functions, Glue, Spark, Flink, Kinesis, and large-scale event stores. Develop and scale our ingestion pipelines for speed, reliability, and multi-tenancy, supporting various data sources and formats …
Grow with us. We are looking for a Machine Learning Engineer to work along the end-to-end ML lifecycle, alongside our existing Product & Engineering team. About Trudenty: The Trudenty Trust Network provides personalised consumer fraud risk intelligence for fraud …
Skills: AI/ML Engineering, MLOps, AI/ML platform architecture Generative AI, Langchain, SageMaker, Python Experience with big data processing frameworks (PySpark, Apache Flink, Snowflake) Strong leadership and communication skills Responsibilities: Lead AI/ML platform strategy, scalability, and enterprise adoption Guide POCs and deployments of GenAI/…
ideally SRE or Production Engineering Experience with large scale distributed systems Deep understanding and experience in one or more of the following: Hadoop, Spark, Flink, Kubernetes, AWS Experience working and leading geographically distributed teams and implementing high level projects and migrations Preferred Qualifications BS degree in computer science or …
and Streaming services such as Kinesis Data Streams, Simple Queue Service (SQS), Simple Notification Service (SNS), Amazon MQ, and Amazon Managed Service for Apache Flink (MSF). We are looking for a technical expert who brings a mix of operations and networking expertise and shares our passion to change …
to assess the performance of these algorithms will be crucial. You'll also work on implementing and optimizing stream processing solutions using technologies like Flink and Kafka. Qualifications 3-5 years of software development experience and a minimum of 2 internships with direct experience in building and evaluating ML …
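Stream processors such as Flink typically group events into time windows before aggregating them. As a dependency-free sketch of the tumbling-window idea those frameworks implement (the function, event timestamps, and window size below are illustrative, not a Flink API):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Count events per (key, window), where each event is (timestamp, key)
    and windows are fixed, non-overlapping intervals of window_size."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align the timestamp down to the start of its window.
        window_start = (ts // window_size) * window_size
        counts[(key, window_start)] += 1
    return dict(counts)

events = [(1, "click"), (4, "click"), (7, "view"), (11, "click")]
print(tumbling_window_counts(events, 10))
# → {('click', 0): 2, ('view', 0): 1, ('click', 10): 1}
```

In Flink the same shape would be expressed with `keyBy` plus a tumbling event-time window and a count aggregate; the sketch only shows the windowing arithmetic.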