maintaining data pipelines. Proficiency in JVM-based languages (Java, Kotlin), ideally combined with Python and experience in Spring Boot. Solid understanding of data engineering tools and frameworks such as Spark, Flink, Kafka, dbt, Trino, and Airflow. Hands-on experience with cloud environments (AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL and …
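For context, a minimal sketch of the kind of Airflow orchestration this role describes, chaining a Python extract task into a dbt build. The DAG id, task names, and dbt command are illustrative assumptions rather than details from the listing.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder extract step; a real task might pull from Kafka or an upstream API.
    print("extracting orders for", context["ds"])


with DAG(
    dag_id="orders_pipeline",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    dbt_build = BashOperator(task_id="dbt_build", bash_command="dbt build")  # hypothetical dbt project
    extract >> dbt_build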
e.g., Hadoop, Spark). Strong knowledge of data workflow solutions such as Azure Data Factory, Apache NiFi, and Apache Airflow. Good knowledge of stream and batch processing solutions such as Apache Flink and Apache Kafka. Good knowledge of log management, monitoring, and analytics solutions such as Splunk, Elastic Stack, and New Relic. Given that this is just a short snapshot of the role …
Luton, England, United Kingdom Hybrid / WFH Options
easyJet
CloudFormation. Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or with the Lakehouse architecture. Experience with data quality and/or data lineage frameworks such as Great Expectations …
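As a rough illustration of the Spark experience asked for above, a minimal PySpark aggregation job. The input path, column names, and output location are assumptions made purely for illustration.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_booking_agg").getOrCreate()

# Hypothetical input path and column names.
bookings = spark.read.parquet("s3://example-bucket/bookings/")

daily = (
    bookings.groupBy("flight_date")
    .agg(F.count("*").alias("bookings"), F.sum("revenue").alias("revenue"))
)

daily.write.mode("overwrite").parquet("s3://example-bucket/bookings_daily/")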
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
pipelines, data modeling, and data warehousing. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and …
Bristol, South West England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
solutions in an Agile environment. Technical Proficiency: Deep technical expertise in software and data engineering and in programming languages (Python, Java, etc.). Understanding of orchestration (Composer, DAGs), data processing (Kafka, Flink, Dataflow, dbt), and database capabilities (e.g. BigQuery, Cloud SQL, Bigtable). Container technologies (Docker, Kubernetes), IaC (Terraform), and experience with cloud platforms such as GCP. CI/CD: Detailed understanding …
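A short sketch of the BigQuery capability mentioned above, using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical.

from google.cloud import bigquery

client = bigquery.Client()  # relies on application-default credentials

query = """
    SELECT customer_id, COUNT(*) AS events
    FROM `example-project.analytics.events`   -- hypothetical dataset and table
    GROUP BY customer_id
    ORDER BY events DESC
    LIMIT 10
"""

# Run the query and print the top customers by event count.
for row in client.query(query).result():
    print(row.customer_id, row.events)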
London (City of London), South East England, United Kingdom
Infosys
and guide implementation teams
• Deep understanding of Kafka internals, KRaft architecture, and Confluent components
• Experience with Confluent Cloud, Stream Governance, Data Lineage, and RBAC
• Expertise in stream processing (Apache Flink, Kafka Streams, ksqlDB) and event-driven architecture
• Strong proficiency in Java, Python, or Scala
• Proven ability to integrate Kafka with enterprise systems (databases, APIs, microservices)
• Hands-on experience with …
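A minimal sketch of the Kafka integration side of this role, using the confluent-kafka Python client (Kafka Streams and ksqlDB run on the JVM/Confluent side and are not shown). The broker address, topic, and group id are illustrative assumptions.

import json

from confluent_kafka import Consumer, Producer

BOOTSTRAP = "localhost:9092"  # hypothetical broker address

# Produce one event onto a hypothetical "orders" topic.
producer = Producer({"bootstrap.servers": BOOTSTRAP})
producer.produce("orders", key="order-1", value=json.dumps({"amount": 42.0}))
producer.flush()

# Consume it back with a simple consumer group.
consumer = Consumer({
    "bootstrap.servers": BOOTSTRAP,
    "group.id": "orders-reader",        # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])
msg = consumer.poll(10.0)
if msg is not None and msg.error() is None:
    print("consumed:", msg.key(), json.loads(msg.value()))
consumer.close()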
observability frameworks, including lineage tracking, SLAs, and data quality monitoring. Familiarity with modern data lake table formats such as Delta Lake, Iceberg, or Hudi. Background in stream processing (Kafka, Flink, or similar ecosystems). Exposure to containerisation and orchestration technologies such as Docker and Kubernetes.
London, England, United Kingdom Hybrid / WFH Options
Harnham
and observability stacks (lineage, data contracts, quality monitoring). Knowledge of data lake formats (Delta Lake, Parquet, Iceberg, Hudi). Familiarity with containerisation and streaming technologies (Docker, Kubernetes, Kafka, Flink). Exposure to lakehouse or medallion architectures within Databricks.
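As a loose illustration of the stream-processing background mentioned above, a tiny PyFlink DataStream word count run locally. The job name and the in-memory source are assumptions; a production job would typically read from Kafka instead.

from pyflink.common.typeinfo import Types
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
env.set_parallelism(1)

# In-memory source standing in for a real Kafka topic.
words = env.from_collection(["kafka", "flink", "kafka"], type_info=Types.STRING())

counts = (
    words
    .map(lambda w: (w, 1), output_type=Types.TUPLE([Types.STRING(), Types.INT()]))
    .key_by(lambda t: t[0])
    .reduce(lambda a, b: (a[0], a[1] + b[1]))
)

counts.print()
env.execute("word_count_sketch")  # hypothetical job name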
Nice to haves: Experience with NoSQL databases (MongoDB, Cassandra, Redis). Familiarity with message brokers (Kafka, SQS/SNS, RabbitMQ). Knowledge of real-time streaming (Kafka Streams, Apache Flink, etc.). Exposure to big-data or machine-learning frameworks (TensorFlow, PyTorch, Hugging Face, LangChain). Experience working with AI-driven development tools such as Cursor, Copilot, or Replit …
knowledge, Redshift and Snowflake preferred. Working with IaC (Terraform and CloudFormation). Working understanding of scripting languages including Python and Shell. Experience working with streaming technologies including Kafka and Apache Flink. Experience working with ETL environments. Experience working with the Confluent Cloud platform. Disclaimer: This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT …
City Of London, England, United Kingdom Hybrid / WFH Options
Bondaval
or similar) from a good University highly desirable. Nice to Have: Familiarity with message brokers (Kafka, SQS/SNS, RabbitMQ). Knowledge of real-time streaming (Kafka Streams, Apache Flink, etc.). Exposure to big-data or machine-learning frameworks (TensorFlow, PyTorch, Hugging Face, LangChain). Understanding of infrastructure and DevOps (Terraform, Ansible, AWS, Kubernetes). Exposure to fintech …
London, South East England, United Kingdom Hybrid / WFH Options
BondAval
on when needed. Nice to haves: Experience leading technical discovery or architecture definition in a scaling SaaS or fintech environment. Familiarity with event-driven or streaming architectures (Kafka, Apache Flink, etc.). Practical exposure to AI/LLM orchestration frameworks or fine-tuning workflows. Experience designing developer tools, data platforms, or intelligent systems. Interest in or experience mentoring engineers …
for high availability and reliability. What We’re Looking For: 4+ years’ experience in a senior technical role, ideally within data engineering; the AWS stack (S3, EMR, EC2, Kinesis, Firehose, Flink/MSF); Python & SQL; security and risk management frameworks; excellent communication and collaboration skills. Desirable: Event streaming and event streaming analytics in real-world applications, or some exposure to …
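A small sketch touching the AWS streaming stack listed above: putting a single record onto a Kinesis data stream with boto3. The stream name, region, and payload are assumptions for illustration.

import json

import boto3

# Hypothetical stream name and region.
kinesis = boto3.client("kinesis", region_name="eu-west-2")

kinesis.put_record(
    StreamName="example-events",
    Data=json.dumps({"event": "page_view", "user_id": 123}).encode("utf-8"),
    PartitionKey="user-123",
)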