non-technical stakeholders. A strong grasp of Agile practices. Familiarity with NoSQL databases, big data technologies, and integrating diverse data formats (e.g., JSON, Parquet, Avro). Experience with data cataloguing, metadata management, and data lineage tools. Strong troubleshooting and problem-solving skills for complex data challenges. Desirable Experience & Knowledge …
Birmingham, England, United Kingdom Hybrid / WFH Options
QAD Inc
patterns, such as “expand and contract” using go-migrate Writing observable and testable code using libraries such as testify and mockgen Publishing and consuming Avro-formatted Kafka messages CI/CD GitHub Actions Trunk Based Development & Continuous Delivery Good collaboration skills at all levels with cross-functional teams Highly …
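The “expand and contract” pattern mentioned above keeps old and new readers working across a schema change: first add the new column alongside the old one (expand), backfill and dual-write, and only later drop the old column (contract). A minimal sqlite3 sketch of the expand-and-backfill phases; table and column names are hypothetical, and go-migrate would ship these statements as versioned SQL migrations rather than inline Python:

```python
import sqlite3

# Expand-and-contract sketch: renaming `fullname` to `display_name` without downtime.
# Phase 1 (expand): add the new column alongside the old one.
# Phase 2 (migrate): backfill, and dual-write from application code.
# Phase 3 (contract): drop the old column once no reader depends on it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, fullname TEXT)")
conn.execute("INSERT INTO users (fullname) VALUES ('Ada Lovelace')")

# Expand: the new column exists while old readers still use `fullname`.
conn.execute("ALTER TABLE users ADD COLUMN display_name TEXT")
# Backfill: copy data so new readers see consistent values.
conn.execute("UPDATE users SET display_name = fullname WHERE display_name IS NULL")

row = conn.execute("SELECT fullname, display_name FROM users").fetchone()
print(row)  # ('Ada Lovelace', 'Ada Lovelace')
```

The contract phase is deliberately deferred: it only runs after every deployed reader has moved to the new column.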
experience with SQL, Data Pipelines, Data Orchestration and Integration Tools Experience in data platforms on premises/cloud using technologies such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop … Spark, Redshift, Snowflake, GCP BigQuery) Expertise in building data architectures that support batch and streaming paradigms Experience with standards such as JSON, XML, YAML, Avro, Parquet Strong communication skills Open to learning new technologies, methodologies, and skills As the successful Data Engineering Manager you will be responsible for: Building …
London (City of London), South East England, United Kingdom
Pixelated People
ecosystem, including modules for Security, Web, and Data Expertise in building event-driven systems that deliver high reliability and performance Familiarity with message schemas (Avro, Protobuf) and strategies for evolving events Experience integrating with identity systems like Keycloak Strong grasp of system-level concerns like observability, availability, and resilience …
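The “strategies for evolving events” mentioned above usually come down to backward-compatible schema changes: a reader must be able to decode events written before a field existed. Avro formalises this with per-field `default` values during schema resolution; here is a minimal stand-alone sketch of that mechanism (the event and field names are hypothetical):

```python
# Sketch of backward-compatible event evolution: the reader applies defaults
# for fields added after an event was written — the strategy Avro formalises
# with per-field "default" values in the reader schema.
SCHEMA_V2_DEFAULTS = {"currency": "GBP"}  # field added in v2 of the event

def decode(event: dict, defaults: dict) -> dict:
    """Fill in v2 fields missing from v1 events; explicit values win."""
    return {**defaults, **event}

v1_event = {"order_id": 42, "amount": 9.99}  # written before v2 existed
decoded = decode(v1_event, SCHEMA_V2_DEFAULTS)
print(decoded)  # {'currency': 'GBP', 'order_id': 42, 'amount': 9.99}
```

The same idea extends to removing fields (writers stop producing them, readers ignore them), which is why additive-with-defaults changes are the safe evolution path.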
Bradford, Yorkshire and the Humber, United Kingdom
Vallum Associates
Duration: 06+ Months Job Summary: We are looking for an experienced Kafka Architect to design and implement scalable, high-throughput data streaming solutions using Apache Kafka on Cloudera (on-premises). The ideal candidate will have a strong background in distributed systems, data pipelines, and real-time data processing. … using tools like Ansible, Terraform, or Cloudera Manager. -Provide technical leadership and mentorship to junior team members. Required Skills: -Strong hands-on experience with Apache Kafka (including Kafka Connect, Kafka Streams). -Experience with Cloudera distribution for Kafka on on-premises environments. -Proficiency in designing high-volume, low-latency … data pipelines. -Solid knowledge of Kafka internals – topics, partitions, consumer groups, offset management, etc. -Experience with data serialization formats like Avro, JSON, Protobuf. -Proficient in Java, Scala, or Python for Kafka-based development. -Familiarity with monitoring tools (Prometheus, Grafana, Confluent Control Center, etc.). -Understanding of networking, security (SSL …
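The “Kafka internals” bullet above (topics, partitions, consumer groups) rests on one mechanism: records with the same key are hashed to the same partition, which is what gives Kafka per-key ordering. A minimal sketch of that idea; Kafka's default partitioner actually uses murmur2 on the key bytes, so md5 here is only a stand-in to show the behaviour:

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Sketch of key-based partitioning: hash the key, modulo the partition
    count. Kafka's default partitioner uses murmur2; md5 stands in here."""
    digest = int.from_bytes(hashlib.md5(key).digest()[:4], "big")
    return digest % num_partitions

# Records with the same key land on the same partition, preserving per-key order.
p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2
```

This is also why repartitioning a topic is disruptive: changing `num_partitions` changes where existing keys hash, breaking the per-key ordering guarantee across the resize.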
Salford, England, United Kingdom Hybrid / WFH Options
QinetiQ Limited
worked on recently that might give you a better sense of what you’ll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data … into a common data format using Avro schemas and JOLT Designing a proxy in front of our Kubernetes cluster egress to whitelist traffic and mitigate any security risks Implementing Access/Role-Based Access Control in Elasticsearch Writing a React UI using an ATDD approach with Cypress Right now …
Manchester, England, United Kingdom Hybrid / WFH Options
QinetiQ
worked on recently that might give you a better sense of what you’ll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data … into a common data format using Avro schemas and JOLT Designing a proxy in front of our Kubernetes cluster egress to whitelist traffic and mitigate any security risks Implementing Access/Role-Based Access Control in Elasticsearch Writing a React UI using an ATDD approach with Cypress Right now …
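The “common data format using Avro schemas and JOLT” work described above is essentially declarative field mapping: incoming records in diverse shapes are shifted onto one target schema. JOLT itself is a Java JSON-transform library driven by a JSON spec, but its core “shift” operation can be sketched in a few lines of Python (the field names and spec here are hypothetical):

```python
# Minimal sketch of a JOLT-style "shift" transform: map dotted source paths
# of an incoming record onto the fields of a common target schema.
SHIFT_SPEC = {
    "user.name": "fullName",
    "user.id": "userId",
    "ts": "eventTime",
}

def get_path(record: dict, dotted: str):
    """Walk a dotted path like 'user.name' into a nested dict."""
    for part in dotted.split("."):
        record = record[part]
    return record

def shift(record: dict, spec: dict) -> dict:
    """Build the common-format record by pulling each source path."""
    return {target: get_path(record, source) for source, target in spec.items()}

raw = {"user": {"name": "Ada", "id": 7}, "ts": "2024-01-01T00:00:00Z"}
common = shift(raw, SHIFT_SPEC)
print(common)  # {'fullName': 'Ada', 'userId': 7, 'eventTime': '2024-01-01T00:00:00Z'}
```

In the pipeline described, the shifted output would then be validated against the common Avro schema before being passed downstream.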
Science, Data Engineering, or a related technical discipline 5+ years of hands-on experience building large-scale, production-grade data pipelines Strong knowledge of Apache Beam, Dataflow, Pub/Sub, or similar streaming and batch processing technologies Skilled in Python, Java, or Scala. Experience with SQL, dbt, or BigQuery … is a plus Solid foundation in data modeling, schema design, and data lifecycle management Comfortable working with a variety of data formats, including JSON, Avro, and Protobuf Strong analytical thinking, attention to detail, and an ability to collaborate across engineering, ML, and product teams Nice to Have Background in …
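The streaming side of the Apache Beam/Dataflow experience above rests on windowing: assigning each timestamped event to a bounded group so an unbounded stream can be aggregated. A stdlib-only sketch of fixed (tumbling) windows; a real Beam pipeline would express this with something like `beam.WindowInto(beam.window.FixedWindows(60))` rather than by hand:

```python
# Sketch of fixed-window batching, the basic grouping primitive behind
# streaming frameworks like Apache Beam. Timestamps are in seconds.
from collections import defaultdict

def fixed_windows(events, width):
    """Assign each (timestamp, value) event to the window containing it,
    keyed by the window's start time."""
    windows = defaultdict(list)
    for ts, value in events:
        window_start = ts - (ts % width)
        windows[window_start].append(value)
    return dict(windows)

events = [(3, "a"), (61, "b"), (65, "c"), (130, "d")]
print(fixed_windows(events, 60))  # {0: ['a'], 60: ['b', 'c'], 120: ['d']}
```

Real engines add the parts this sketch omits: watermarks to decide when a window is complete, and triggers for emitting early or late results.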
datasets. Implement and manage Lake Formation and AWS Security Lake, ensuring data governance, access control, and security compliance. Optimise file formats (e.g., Parquet, ORC, Avro) for S3 storage, ensuring efficient querying and cost-effectiveness. Automate infrastructure deployment using Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation. … Data Engineering – Proficiency in building and optimising data pipelines and working with large-scale datasets. File Formats & Storage – Hands-on experience with Parquet, ORC, Avro, and efficient S3 storage solutions. DevOps & Automation – Experience with Terraform, CloudFormation, or CDK to automate infrastructure deployment. Security & Compliance – Familiarity with AWS Security Lake …
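The Parquet-on-S3 optimisation described above typically pairs columnar file formats with Hive-style partition paths, so query engines such as Athena can prune whole prefixes instead of scanning every object. A small sketch of that layout convention; the bucket and table names are placeholders:

```python
from datetime import date

def s3_partition_prefix(bucket: str, table: str, d: date) -> str:
    """Sketch of the Hive-style partition layout commonly used for Parquet
    on S3: key=value path segments that engines can prune at query time."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={d.year}/month={d.month:02d}/day={d.day:02d}/"
    )

print(s3_partition_prefix("analytics-lake", "events", date(2024, 3, 9)))
# s3://analytics-lake/events/year=2024/month=03/day=09/
```

A query filtered on `year`/`month`/`day` then only reads the matching prefixes, which is where most of the querying and cost efficiency comes from.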