Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement Infrastructure as Code (IaC) using tools like Terraform and Ansible. Leverage cloud-native services (AWS, Azure) to streamline
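The listing above pairs streaming platforms (Kafka, Flink) with batch tooling (Spark). The core operation such real-time pipelines perform is windowed aggregation; below is a minimal, self-contained sketch in plain Python of a tumbling-window count (the function name and event shape are invented for illustration, not taken from any employer's codebase):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed-size windows and count per key."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window, e.g. [0, 60), [60, 120)
        window_start = ts - (ts % window_secs)
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(0, "click"), (30, "click"), (61, "view"), (65, "click")]
print(tumbling_window_counts(events))
```

In Kafka Streams or Flink the same grouping is expressed declaratively (e.g. a windowed `keyBy`/aggregate), with the framework handling partitioning, state, and late data.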
AI/ML components or interest in learning data workflows for ML applications. Bonus if you have exposure to Kafka, Spark, or Flink. Experience with data compliance regulations (GDPR). What you can expect from us: Opportunity for annual bonuses Medical Insurance Cycle to work scheme
Arlington, Virginia, United States Hybrid / WFH Options
Full Visibility LLC
orchestration (e.g., Apache Airflow, Luigi, Prefect) Strong programming skills in Python, SQL, or Scala Experience with open-source data processing tools (e.g., Kafka, Spark, Flink, Hadoop) Familiarity with database technologies (PostgreSQL, MySQL, or NoSQL solutions) Ability to work in a fast-paced environment with large-scale datasets Preferred: • Experience
managing real-time data pipelines, with a track record spanning multiple initiatives. Expertise in developing data backbones using distributed streaming platforms (Kafka, Spark Streaming, Flink, etc.). Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Programming skills in Python
Chantilly, Virginia, United States Hybrid / WFH Options
Aerospace Corporation
teams toward software development best practices Experience in SQL, NoSQL, Cypher and other big data querying languages Experience with big data frameworks (Hadoop, Spark, Flink, etc.) Experience with ML lifecycle management tools (MLflow, Kubeflow, etc.) Familiarity with data pipelining and streaming technologies (Apache Kafka, Apache NiFi, etc.) Demonstrated contributions
managing real-time data pipelines, with a track record spanning multiple initiatives. Expertise in developing data backbones using distributed streaming platforms (Kafka, Spark Streaming, Flink, etc.). Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Ability to optimise and
efficient integration into the Feast feature store. Requirements Good knowledge of programming languages such as Python or Java. Strong experience with streaming technologies (Spark, PySpark, Flink, KSQL, or similar) for developing data transformation pipelines. Solid understanding and practical experience with SQL and relational databases (PostgreSQL preferred). Proficiency with AWS
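The role above builds transformation pipelines that feed a Feast feature store. Stripped of the Spark/Flink specifics, the kernel of such a job is a group-by aggregation of raw rows into per-entity feature values; here is a hedged plain-Python sketch (the entity and feature names are hypothetical, chosen only to illustrate the shape of the output):

```python
from statistics import mean

def build_features(transactions):
    """Aggregate raw (user_id, amount) rows into per-user feature rows,
    the kind of output a feature store like Feast would materialize."""
    by_user = {}
    for user_id, amount in transactions:
        by_user.setdefault(user_id, []).append(amount)
    return {
        user_id: {"txn_count": len(amts), "avg_amount": round(mean(amts), 2)}
        for user_id, amts in by_user.items()
    }

rows = [("u1", 10.0), ("u1", 30.0), ("u2", 5.0)]
print(build_features(rows))
```

In PySpark the same logic would be a `groupBy("user_id").agg(...)` over a DataFrame, with the result written to the offline store for training and the online store for serving.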
OAuth, JWT, and data encryption. Fluent in English with strong communication and collaboration skills. Preferred Qualifications Experience with big data processing frameworks like Apache Flink or Spark. Familiarity with machine learning models and AI-driven analytics. Understanding of front-end and mobile app interactions with backend services. Expertise in
London, South East England, United Kingdom Hybrid / WFH Options
eTeam
OAuth, JWT, and data encryption. Fluent in English with strong communication and collaboration skills. Preferred Qualifications Experience with big data processing frameworks like Apache Flink or Spark. Familiarity with machine learning models and AI-driven analytics. Understanding of front-end and mobile app interactions with backend services. Expertise in
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
the Risk Management Framework (RMF) process Proficiency in system design and meticulous documentation Experience in streaming and/or batch analytics (e.g. Kafka, Spark, Flink, Storm, MapReduce, Hadoop) Experience in distributed databases, NoSQL databases, full text-search engines (e.g. Elasticsearch, MongoDB, Solr) Experience in designing enterprise APIs. Experience in
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown Asset Management Limited
volume data/low latency data pipeline with the following skills. Data Engineering Skills Modelling Orchestration using Apache Airflow Cloud-native streaming pipelines using Flink, Beam, etc. DBT Snowflake Infrastructure Skills Terraform DevOps Skills Experienced in developing CI/CD pipelines Integration Skills REST and Graph APIs (Desirable) Serverless
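The skills list above centres on Airflow orchestration, where a pipeline is a DAG of tasks and the scheduler guarantees that every task runs only after its upstream dependencies. That ordering guarantee can be sketched with the standard library's graphlib; the task names here are hypothetical, not taken from the actual pipeline:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG wires extract >> transform >> load >> dbt.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_snowflake": {"transform"},
    "dbt_models": {"load_snowflake"},
}
order = list(TopologicalSorter(dag).static_order())
print(order)  # upstream tasks always appear before their dependents
```

Airflow resolves exactly this kind of ordering at schedule time, and additionally handles retries, backfills, and parallel execution of independent branches.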
Columbia, Maryland, United States Hybrid / WFH Options
HII Mission Technologies
related field; or High School Diploma or equivalent and 9 years relevant experience. Experience in streaming and/or batch analytics (e.g. Kafka, Spark, Flink, Storm, MapReduce, Hadoop) Experience in distributed databases, NoSQL databases, full text-search engines (e.g. Elasticsearch, MongoDB, Solr) Experience in designing enterprise APIs Experience in
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Yelp USA
services that power features like Review Insights, Business Summaries, and Conversational Search. Build real-time and batch data processing pipelines using technologies like Kafka, Flink, and Spark. Ensure high availability, performance, and reliability of backend systems in a production environment. Contribute to the team's engineering best practices and
Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features including multithreading and Executor Services to ensure optimal performance. Apply functional programming paradigms in Java to write clean, efficient
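The posting above asks for Java ExecutorService experience, i.e. submitting work to a managed thread pool instead of spawning raw threads. The same pattern exists in Python's concurrent.futures, shown here as an illustrative analogue (the enrich function is a made-up stand-in for an I/O-bound call such as an HTTP lookup):

```python
from concurrent.futures import ThreadPoolExecutor

def enrich(record):
    """Stand-in for an I/O-bound enrichment call (e.g. a remote lookup)."""
    return {**record, "enriched": True}

records = [{"id": i} for i in range(4)]

# The pool owns the worker threads; map() fans records out and
# returns results in input order, like ExecutorService + invokeAll in Java.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(enrich, records))
print(results)
```

The design point is the same in both languages: the pool bounds concurrency and reuses threads, so callers reason about tasks rather than thread lifecycles.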
London, South East England, United Kingdom Hybrid / WFH Options
Solytics Partners
Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features including multithreading and Executor Services to ensure optimal performance. Apply functional programming paradigms in Java to write clean, efficient
AND EXPERIENCE The successful Senior Data Platform Engineer will have the following skills and experience: Java Spring Boot Cloud (ideally AWS) Kafka (Kinesis or Flink will be considered) BENEFITS The successful Senior Data Platform Engineer will receive the following benefits: Salary up to £85,000 – depending on experience Semi
Grow with us. We are looking for a Machine Learning Engineer to work along the end-to-end ML lifecycle, alongside our existing Product & Engineering team. About Trudenty: The Trudenty Trust Network provides personalised consumer fraud risk intelligence for fraud
insurance industry. We are developing a modern real-time ML platform using technologies like Python, PyTorch, Ray, k8s (helm + flux), Terraform, Postgres and Flink on AWS. We are very big fans of Infrastructure-as-Code and enjoy Agile practices. As a team, we're driven by a relentless … CI/CD pipelines for efficient software delivery. Nice to have: Coding skills in Python Knowledge of other areas of our tech stack (GitLab, Flink, Helm, FluxCD etc.) Knowledge of enterprise security best practices Proven experience in leading successful technical projects with an infrastructure/platform focus. Ability to