within transformation pipelines. Data Processing & Optimization: Build efficient, high-performance systems by leveraging techniques like data denormalization, partitioning, caching, and parallel processing. Develop stream-processing applications using Apache Kafka and optimize performance for large-scale datasets. Enable data enrichment and correlation across primary, secondary, and tertiary sources. Cloud, Infrastructure, and Platform Engineering: Develop and deploy data workflows … Python and Linux shell scripting for data manipulation and automation. Strong expertise in SQL/NoSQL databases such as PostgreSQL and MongoDB. Experience building stream-processing systems using Apache Kafka. Proficiency with Docker and Kubernetes for deploying containerized data workflows. Good understanding of cloud services (AWS or Azure). Hands-on experience with the ELK stack (Elasticsearch, Logstash, Kibana) …
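As context for the stream-processing and enrichment skills this listing names, here is a minimal sketch of a Kafka consume-enrich-produce loop in Python. The broker address, topic names, and the lookup table are hypothetical, and the kafka-python client is assumed as one of several possible choices:

```python
# Minimal consume-enrich-produce loop with kafka-python.
# Broker address, topic names, and the lookup table are illustrative only.
import json

from kafka import KafkaConsumer, KafkaProducer

BROKERS = ["localhost:9092"]           # hypothetical broker
consumer = KafkaConsumer(
    "raw-events",                      # hypothetical source topic
    bootstrap_servers=BROKERS,
    group_id="enrichment-service",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

# Stand-in for a secondary data source used for enrichment/correlation.
REGION_BY_SITE = {"site-a": "emea", "site-b": "apac"}

for message in consumer:
    event = message.value
    # Correlate the event with the secondary source before re-publishing.
    event["region"] = REGION_BY_SITE.get(event.get("site"), "unknown")
    producer.send("enriched-events", event)   # hypothetical sink topic
```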
Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. Excellent consulting experience and the ability to design and build solutions and actively contribute to RFP responses. Ability to be a SPOC for all technical … Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Designing Databricks-based solutions for Azure/AWS; Jenkins, Terraform, Stackdriver or any other DevOps tools. Senior Delivery Consultant Office: 02033759240 Email: psharma@vallumassociates.com …
London, England, United Kingdom Hybrid / WFH Options
VIOOH
Developing streaming pipelines and services for near real-time reporting, including streaming joins. Building high-throughput services to push events to third parties. Managing key infrastructure components such as Kafka, Aurora, and DynamoDB, handling around 3 billion Kafka messages per day. Leading infrastructure migration and upgrades. Writing software using Java, Python, or Scala. Qualifications and Skills Essential: Ability to … design services with knowledge of distributed systems. Experience building REST APIs. Strong knowledge of streaming technologies, especially Kafka, both as a user and in terms of its inner workings. Experience managing AWS or GCP cloud environments. Experience with monitoring tools such as Datadog, Kibana, Grafana, or Prometheus. Proficiency with Terraform, Docker, and Kubernetes. Software development experience in Java, Scala, or Python.
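For a sense of the throughput tuning that a workload of billions of messages per day implies, a minimal sketch of a batched Kafka producer in Python. The broker, topic, and tuning values are hypothetical placeholders, not recommendations, and kafka-python is assumed as the client:

```python
# High-throughput publishing: batch and compress records rather than
# sending one network request per message. Values are illustrative.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],  # hypothetical broker
    acks=1,                    # trade some durability for throughput
    linger_ms=50,              # wait up to 50 ms to fill a batch
    batch_size=64 * 1024,      # 64 KiB batches
    compression_type="gzip",   # built in; lz4/snappy need extra packages
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

for i in range(1_000_000):
    producer.send("outbound-events", {"id": i})  # hypothetical topic
producer.flush()               # drain buffered batches before exit
```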
London, England, United Kingdom Hybrid / WFH Options
Paymentology
Policies. Work with Google Anthos, Kubernetes, and container-based architectures. Optimise GCP cost and performance, ensuring efficiency across environments. (Optional) Provide expertise in AWS, Azure, and OCI when needed. Kafka and Kubernetes Platform Management: Design, deploy, and maintain scalable Kafka and Kubernetes clusters to support development and production environments. Implement best practices for Kafka and Kubernetes operations … ensuring high availability, performance, and security. Monitor, troubleshoot, and optimize Kafka and Kubernetes infrastructure to meet development team needs. Implementation: Implement cloud infrastructure components, including compute, storage, networking, and security services. Ensure seamless integration with existing systems across multi-cloud platforms. Migration and Deployment: Assess and migrate on-premises applications to Google Cloud. Develop and execute cloud migration … Expertise: Deep hands-on experience in Google Cloud (GCP). Basic knowledge of, or willingness to learn, AWS. Proven experience in designing and implementing cloud architectures and solutions. Experience with Apache Kafka, including setup, configuration, monitoring, and troubleshooting. Proficiency in Kubernetes (GKE), including experience with container orchestration, Helm charts, and Kubernetes operators. Hands-on experience with DevOps tools (e.g., GitLab, Jenkins …
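By way of illustration for the Kafka monitoring and troubleshooting duties above, a minimal health-check sketch using the confluent-kafka Python AdminClient; the broker address is a hypothetical placeholder:

```python
# Quick health probe for a Kafka cluster: flag partitions whose
# in-sync replica set has shrunk below the assigned replica set.
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "kafka-0.internal:9092"})  # hypothetical
metadata = admin.list_topics(timeout=10)

print(f"brokers online: {len(metadata.brokers)}")
for topic in metadata.topics.values():
    for partition in topic.partitions.values():
        # A partition is under-replicated when its in-sync replicas
        # (ISR) are fewer than its assigned replicas.
        if len(partition.isrs) < len(partition.replicas):
            print(f"under-replicated: {topic.topic}[{partition.id}] "
                  f"isr={partition.isrs} replicas={partition.replicas}")
```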
s degree. Nice If You Have: Experience developing ETL and ELT pipelines with Apache NiFi and Databricks. Ability to perform message queuing and real-time streaming with Apache Kafka. Ability to program in Scala. Clearance: Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; Top Secret …
experts and mentor junior team members. Leverage full-stack technologies including Java, JavaScript, TypeScript, React, APIs, MongoDB, Elasticsearch, DMN, BPMN, and Kubernetes. Utilize data-streaming technologies such as Kafka CDC, Kafka topics, EMS, and Apache Flink. Innovate and incubate new ideas. Work on a broad range of problems involving large data sets, real-time processing, messaging, workflows …
Kubernetes/Helm. Experience with SQL and stored procedures; Oracle database preferred. Experience with the following technologies: Spring 3.x, JPA, REST/web services, Maven, XML, Apache Tomcat, Swagger, Kafka, Reactor/RxJava. Understanding of the Angular framework and TypeScript is nice-to-have. Experience building and deploying cloud-enabled applications using 12-factor design. Experience building and deploying applications …
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
complex data pipeline issues. Desirable: Experience working in government, defence, or highly regulated industries with knowledge of relevant standards. Experience with additional data processing and ETL tools like Apache Kafka, Spark, or Hadoop. Familiarity with containerization and orchestration tools such as Docker and Kubernetes. Experience with monitoring and alerting tools such as Prometheus, Grafana, or ELK for data infrastructure.
London, England, United Kingdom Hybrid / WFH Options
Circadia Health
into secure, scalable infrastructure designs. Build and maintain Kubernetes clusters, Terraform IaC, CI/CD pipelines, and observability tooling (Prometheus, Grafana). Optimise real-time data pipelines using Apache Kafka, Snowflake, and Postgres, ensuring low-latency, high-reliability ingestion from IoT sensors and EHR integrations. Collaborate with our Security & Compliance team to uphold SOC 2, ISO 27001, HIPAA/…/Infrastructure Engineering, delivering mission-critical systems in production. Deep expertise with AWS or Azure, Kubernetes, Docker/OCI, and Terraform. Hands-on experience with event-streaming platforms (Kafka, Kinesis) and relational/analytical databases. Demonstrated customer-facing skills: able to explain complex infrastructure decisions to clinical and IT stakeholders. UK work authorisation and willingness to work hybrid …
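To ground the ingestion work this listing describes, a minimal sketch of a Kafka-to-Postgres sink in Python. The topic, table schema, and connection details are hypothetical, and kafka-python plus psycopg2 are assumed as clients:

```python
# Minimal Kafka-to-Postgres ingestion loop; connection details,
# topic, and table schema are illustrative placeholders.
import json

import psycopg2
from kafka import KafkaConsumer

conn = psycopg2.connect(
    host="db.internal", dbname="telemetry",   # hypothetical database
    user="ingest", password="change-me",
)
consumer = KafkaConsumer(
    "sensor-readings",                        # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="pg-sink",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    enable_auto_commit=False,                 # commit only after the DB write
)

for message in consumer:
    reading = message.value
    with conn, conn.cursor() as cur:          # each write in its own transaction
        cur.execute(
            "INSERT INTO readings (device_id, ts, value) VALUES (%s, %s, %s)",
            (reading["device_id"], reading["ts"], reading["value"]),
        )
    consumer.commit()                         # at-least-once delivery
```

Committing the Kafka offset only after the database transaction succeeds gives at-least-once semantics; duplicates on retry would need to be handled downstream, for example with an idempotent insert.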
to-end solutions, learn from experts, leverage full-stack technologies including Java, JavaScript, TypeScript, React, APIs, MongoDB, Elasticsearch, DMN, BPMN, and Kubernetes, utilize data-streaming technologies such as Kafka CDC, Kafka topics, EMS, and Apache Flink, innovate and incubate new ideas, work on a broad range of problems, often involving large data sets, including real-time processing …
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
complex data pipeline issues. Desirable: Experience working in government, defence, or highly regulated industries with knowledge of relevant standards. Experience with additional data processing and ETL tools like Apache Kafka, Spark, or Hadoop. Familiarity with containerization and orchestration tools such as Docker and Kubernetes. Experience with monitoring and alerting tools such as Prometheus, Grafana, or ELK for data infrastructure.
SAR phenomenology. Experience with Maven, Rancher, SonarQube. Experience with Git, Ruby, ArgoCD. Experience with Terraform. AWS cloud computing skills: AWS CloudWatch, Elasticsearch, Prometheus, Grafana. Familiarity with NiFi, Apache Kafka. Kalman filtering. Education/Experience Requirements, SME Level: Expert consultant to top management, typically with an advanced degree and 13+ years' experience or a bachelor's with 15+ years …
and mentor and coach the junior members, leverage full-stack technologies including Java, JavaScript, TypeScript, React, APIs, MongoDB, Elasticsearch, DMN, BPMN and Kubernetes, leverage data-streaming technologies including Kafka CDC, Kafka topics and related technologies, EMS, and Apache Flink, be able to innovate and incubate new ideas, have an opportunity to work on a broad range of problems …
technologies: Angular, TypeScript, HTML, CSS, JavaScript. Experience in developing and deploying microservices architectures using Spring Boot, Spring Batch, Spring Data or similar frameworks. Knowledge of JIRA, JUnit and Apache Kafka is preferred. Hands-on experience with NoSQL databases (Cosmos DB, MongoDB). In-depth knowledge of REST API principles and the HTTP protocol. Familiarity with containerization tools and cloud platforms. Solid grasp …
hands-on background in designing, building, and deploying microservices (ideally with Java, Spring Boot, Node.js, or Python). Expertise in designing and implementing event-driven systems using tools like Apache Kafka, RabbitMQ, Amazon EventBridge, Azure Event Grid, or Google Pub/Sub. Cloud Hyperscalers: Practical experience with AWS, Azure, and GCP. Leading iPaaS Tools: Proficiency in Boomi, SnapLogic, Workato, or …
requirements. Beneficial to have: Bachelor's degree in Cybersecurity, Computer Science, Information Systems, Mathematics, Engineering or a related field. Industry-related certifications. Experience with data processing technologies like Apache Kafka and Elasticsearch. Experience with multi-cloud or hybrid cloud architectures. Where it's done: Remote (Herndon, VA).
orchestration technologies (e.g., Docker, Kubernetes). Monitoring and Logging: Experience with monitoring and logging tools like Datadog, Prometheus, or Grafana. Data Engineering Skills: Knowledge of event-streaming platforms (e.g., Apache Kafka) and SQL database management. Strong Communication and Collaboration: Excellent communication skills and the ability to work effectively in a remote, collaborative environment. Nice to Haves: Experience with infrastructure-as …
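As a small illustration of the monitoring skills mentioned above, a sketch that exposes application metrics for Prometheus to scrape, using the prometheus_client library; metric names and the port are hypothetical:

```python
# Expose custom application metrics on an HTTP endpoint that a
# Prometheus server can scrape. Names and port are placeholders.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

EVENTS_TOTAL = Counter("events_processed_total",
                       "Number of events processed")
LATENCY = Histogram("event_processing_seconds",
                    "Time spent processing one event")

def process_event() -> None:
    # Stand-in for real work; Histogram.time() records the duration.
    with LATENCY.time():
        time.sleep(random.uniform(0.01, 0.05))
    EVENTS_TOTAL.inc()

if __name__ == "__main__":
    start_http_server(8000)     # metrics served at :8000/metrics
    while True:
        process_event()
```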
Java. 2+ years of experience in deploying and integrating containerized software applications using container orchestration platforms, including Kubernetes. 2+ years of experience implementing event-driven or streaming architectures leveraging Kafka, Amazon SNS, Redpanda, or Apache Flink. 2+ years of experience with running, troubleshooting, and debugging applications on Linux systems. 1+ year of experience with building or maintaining production-grade …
you have what it takes? Skills: These skills are essential to be successful in this role. Hands-on programming experience - Go; experience with Rust is beneficial. Technologies & Tools - Apache Kafka, Kubernetes, Docker, GitHub; Azure and AWS or GCP, and data processing in the cloud. Database and SQL development experience, especially PostgreSQL. Collaborative team player with strong (written and verbal) communication …
/s Strong data visualization skills to convey information and results clearly. Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc. Experience with event messaging frameworks like Apache Kafka. The hiring range for this position in Santa Monica, California is $136,038 to $182,490 per year; in Glendale, California it is $136,038 to $182,490 per year …
London, England, United Kingdom Hybrid / WFH Options
Applicable Limited
roles, delivering custom data architecture solutions across various industries. Architectural Expertise: Strong expertise in designing and overseeing the delivery of data streaming and event-driven architectures, with a focus on Kafka and Confluent platforms. In-depth knowledge of architecting and implementing data lakes and lakehouse platforms, including experience with Databricks and Unity Catalog. Proficiency in conceptualising and applying Data Mesh …