Kinesis). Knowledge of IaC (Terraform, CloudFormation) and containerisation (Docker, Kubernetes). Nice to have: experience with dbt, feature stores, or ML pipeline tooling; familiarity with Elasticsearch or real-time analytics (Flink, Materialize); exposure to eCommerce, marketplace, or transactional environments
and guide implementation teams
• Deep understanding of Kafka internals, KRaft architecture, and Confluent components
• Experience with Confluent Cloud, Stream Governance, Data Lineage, and RBAC
• Expertise in stream processing (Apache Flink, Kafka Streams, ksqlDB) and event-driven architecture
• Strong proficiency in Java, Python, or Scala
• Proven ability to integrate Kafka with enterprise systems (databases, APIs, microservices)
• Hands-on experience with …
Azure and distributed systems.
Preferred Skills
Kubernetes & Helm: Deploying and managing containerized applications at scale with reliability and fault tolerance.
Kafka (Confluent): Familiarity with event-driven architectures; experience with Flink or KSQL is a plus.
Airflow: Experience configuring, maintaining, and optimizing DAGs.
Energy or commodity trading: Understanding the data challenges and workflows in this sector.
Trading domain knowledge: Awareness …
observability frameworks, including lineage tracking, SLAs, and data quality monitoring. Familiarity with modern data lake table formats such as Delta Lake, Iceberg, or Hudi. Background in stream processing (Kafka, Flink, or similar ecosystems). Exposure to containerisation and orchestration technologies such as Docker and Kubernetes.
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks.
contract definition, clean code, CI/CD, path to production
Worked with AWS as a cloud platform
Extensive hands-on experience with modern data technologies: ETL tools (e.g. Kafka, Flink, dbt), data storage (e.g. Snowflake, Redshift) and IaC (e.g. Terraform, CloudFormation)
Software development experience with one or more languages (e.g. Python, Java, Scala, Go)
Pragmatic approach …
City Of London, England, United Kingdom Hybrid/Remote Options
Bondaval
or similar) from a good university highly desirable. Nice to Have: Familiarity with message brokers (Kafka, SQS/SNS, RabbitMQ). Knowledge of real-time streaming (Kafka Streams, Apache Flink, etc.). Exposure to big-data or machine-learning frameworks (TensorFlow, PyTorch, Hugging Face, LangChain). Experience with real-time streaming technologies (Kafka, Apache Storm). Understanding of infrastructure …
City of London, London, United Kingdom Hybrid/Remote Options
Advanced Resource Managers
knowledge; Redshift and Snowflake preferred
Working with IaC – Terraform and CloudFormation
Working understanding of scripting languages including Python and Shell
Experience working with streaming technologies incl. Kafka and Apache Flink
Experience working with ETL environments
Experience working with the Confluent Cloud platform
for high availability and reliability
What We’re Looking For
Experience: 4+ years in a senior technical role, ideally within data engineering
AWS stack: S3, EMR, EC2, Kinesis, Firehose, Flink/MSF
Python & SQL
Security and risk management frameworks
Excellent communication and collaboration skills
Desirable: Event streaming and event streaming analytics in real-world applications, or some exposure to …
Java, data structures and concurrency, rather than relying on frameworks such as Spring. You have built event-driven applications using Kafka and solutions with event-streaming frameworks at scale (Flink/Kafka Streams/Spark) that go beyond basic ETL pipelines. You know how to orchestrate the deployment of applications on Kubernetes, including defining services, deployments, stateful sets, etc.
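For illustration only, a minimal sketch of the kind of event-driven Kafka Streams application described above, written in Java; the application id, broker address, topic names, and JSON payload handling are all hypothetical placeholders, not taken from any listing.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class PaymentEventsApp {
    public static void main(String[] args) {
        // Basic configuration; application id and broker address are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-events-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, drop heartbeats, and re-key by customer id so
        // downstream consumers can aggregate per customer.
        KStream<String, String> events = builder.stream("payments-raw");
        events.filter((key, value) -> value != null && !value.contains("\"type\":\"heartbeat\""))
              .selectKey((key, value) -> extractCustomerId(value))
              .to("payments-by-customer");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the streams instance cleanly when the process is stopped.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

    // Naive key extraction for the sketch; a real application would deserialize
    // the payload with a schema (e.g. Avro via Schema Registry).
    private static String extractCustomerId(String json) {
        int idx = json.indexOf("\"customerId\":\"");
        if (idx < 0) return "unknown";
        int start = idx + "\"customerId\":\"".length();
        return json.substring(start, json.indexOf('"', start));
    }
}
```

On Kubernetes, the shutdown hook is what lets a pod drain and close its streams instance cleanly when a deployment is rolled.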
will be given preference:
• Experience with Kafka Connect, Kafka Streams, KSQL, Schema Registry, REST Proxy, Confluent Control Center
• Hands-on with Confluent Cloud services, including ksqlDB Cloud and Apache Flink
• Familiarity with Stream Governance, Data Lineage, Stream Catalog, Audit Logs, RBAC
• Confluent certifications (Developer, Administrator, or Flink Developer)
• Experience with Confluent Platform, Confluent Cloud managed services, multi-cloud …
Greater London, England, United Kingdom Hybrid/Remote Options
Quant Capital
Java – if you haven’t been siloed in a big firm then don’t worry. Additional exposure to the following is desired – the tech stack you will learn: Hadoop and Flink; Rust, JavaScript, React, Redux, Flow; Linux, Jenkins; Kafka, Avro, Kubernetes, Puppet. Involvement in the Java community. My client is based in London. Home working is encouraged but you will need …
required in the role; we are happy to support your learning on the job, but prior experience is a plus: Experience with large-scale data processing frameworks (e.g., Spark, Flink). Experience with time series analysis, anomaly detection, or graph analytics in a security context. Proficiency in data visualization tools and techniques to effectively communicate complex findings. A basic …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
make sure the company’s AI models are learning and improving in production.
🧠 What You’ll Be Doing
Architect and maintain real-time streaming data systems (Kafka, Kinesis, or Flink)
Build robust feature pipelines using Airflow, Prefect, or Dagster
Manage and optimise data storage solutions (Snowflake, BigQuery, Redshift, or Delta Lake)
Automate and scale model training pipelines in close … dbt
Champion data reliability, scalability, and performance across the platform
🧩 The Tech Environment
You’ll likely be working with some combination of:
Languages: Python, Scala, Go
Streaming: Kafka/Flink/Spark Structured Streaming
Workflow orchestration: Airflow/Prefect/Dagster
Data storage & processing: Snowflake/Databricks/BigQuery/Redshift
Infrastructure: Docker/Kubernetes/Terraform/dbt
to cross-functional teams, ensuring best practices in data architecture, security and cloud computing
Proficiency in data modelling, ETL processes, data warehousing, distributed systems and metadata systems
Utilise Apache Flink and other streaming technologies to build real-time data processing systems that handle large-scale, high-throughput data
Ensure all data solutions comply with industry standards and government regulations
… not limited to EC2, S3, RDS, Lambda and Redshift. Experience with other cloud providers (e.g., Azure, GCP) is a plus
In-depth knowledge and hands-on experience with Apache Flink for real-time data processing
Proven experience in mentoring and managing teams, with a focus on developing talent and fostering a collaborative work environment
Strong ability to engage with …
at scale. This is a hands-on engineering role that blends software craftsmanship with data architecture expertise.
Key responsibilities:
Design and implement high-throughput data streaming solutions using Kafka, Flink, or Confluent.
Build and maintain scalable backend systems in Python or Scala, following clean code and testing principles.
Develop tools and frameworks for data governance, privacy, and quality monitoring … data use cases.
Contribute to an engineering culture that values testing, peer reviews, and automation-first principles.
What You'll Bring
Strong experience in streaming technologies such as Kafka, Flink, or Confluent.
Advanced proficiency in Python or Scala, with a solid grasp of software engineering fundamentals.
Proven ability to design, deploy, and scale production-grade data platforms and backend …
Responsibilities:
Develop and maintain high-performance, low-latency Java-based systems for front office trading or pricing platforms.
Build reactive systems using Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink.
Utilize multithreading, concurrency models, and Executor Services to optimize system performance and throughput.
Write clean, efficient, and maintainable code using functional programming paradigms in Java.
Follow and promote … of hands-on Java development experience, preferably in front office systems (e.g., trading platforms, pricing engines, market data systems).
Proven expertise in reactive programming (Kafka Streams, Akka, Vert.x, Flink).
Solid understanding of multithreading and Executor Services in Java.
Strong background in functional programming and Java 8+ features.
Adherence to robust engineering practices: SOLID principles, unit testing, TDD …
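To illustrate the multithreading and Executor Services point above, a small, hedged Java sketch that prices a batch of requests on a bounded thread pool; the QuoteRequest type, instruments, and price() stub are placeholders, not anything from the posting.

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class PricingExecutorExample {

    // Hypothetical pricing task; a real engine would call a model or market-data service.
    record QuoteRequest(String instrument, double notional) {}

    public static void main(String[] args) throws Exception {
        List<QuoteRequest> requests = List.of(
                new QuoteRequest("EURUSD", 1_000_000),
                new QuoteRequest("GBPUSD", 500_000),
                new QuoteRequest("USDJPY", 2_000_000));

        // Fixed-size pool so throughput is bounded and threads are reused across requests.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            // Submit each pricing task; invokeAll blocks until all have completed.
            List<Callable<Double>> tasks = requests.stream()
                    .map(r -> (Callable<Double>) () -> price(r))
                    .toList();
            List<Future<Double>> results = pool.invokeAll(tasks);

            for (int i = 0; i < requests.size(); i++) {
                System.out.printf("%s -> %.5f%n", requests.get(i).instrument(), results.get(i).get());
            }
        } finally {
            // Orderly shutdown: stop accepting work, then wait briefly for in-flight tasks.
            pool.shutdown();
            pool.awaitTermination(5, TimeUnit.SECONDS);
        }
    }

    // Placeholder pricing function standing in for a real model.
    private static double price(QuoteRequest r) {
        return 1.0 + (r.instrument().hashCode() % 1000) / 1_000_000.0;
    }
}
```

A bounded pool with an explicit shutdown is the usual trade-off here: it caps concurrency under load while keeping task submission simple; reactive frameworks such as Kafka Streams or Vert.x move the same scheduling concerns into the framework itself.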