Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Experience building real-time, large-scale data pipelines Working knowledge of cloud platforms … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine …
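To make the Elastic Stack ingestion and search tasks described above concrete, the sketch below indexes a couple of documents and queries them back using the official Python client. It assumes a locally reachable Elasticsearch cluster; the index name and example documents are invented for illustration and are not taken from the listing.

```python
# Minimal sketch: index a few event documents into Elasticsearch and query them back.
# Assumes a cluster reachable at http://localhost:9200 and the official client
# (pip install elasticsearch); index name and documents are illustrative only.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

events = [
    {"source": "sensor-a", "level": "INFO", "message": "heartbeat ok"},
    {"source": "sensor-b", "level": "WARN", "message": "late arrival"},
]

for i, event in enumerate(events):
    # Index each event; Elasticsearch creates the index on first write.
    es.index(index="demo-events", id=i, document=event)

# Make the new documents visible to search before querying.
es.indices.refresh(index="demo-events")

# Pull back only the WARN-level events.
response = es.search(index="demo-events", query={"match": {"level": "WARN"}})
for hit in response["hits"]["hits"]:
    print(hit["_source"])
```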
Ansible, PowerShell, and Python. Build, configure, and maintain Kubernetes clusters, ensuring high availability, scalability, and security. Deploy and support key applications such as Apache NiFi, Elastic ECK, and JFrog Artifactory. Develop and manage Helm Charts to standardise application deployment. Integrate with and manage HashiCorp Vault for … configuration, and operations in production environments. Experience managing applications with Helm Charts and supporting microservices-based architecture. Applications & Services: Direct experience deploying and supporting Apache NiFi, Elastic ECK (ElasticSearch on Kubernetes), and Artifactory. Familiarity with container image management and application lifecycle support. Security & Infrastructure: Use of HashiCorp …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse … Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), and ETL processing languages such as Groovy, Python or Java To be Considered: Please …
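The ETL work this listing describes usually boils down to pulling records from a stream, enriching them, and landing them in a store. The sketch below shows that pattern in Python, assuming a local Kafka broker and MongoDB instance plus the kafka-python and pymongo packages; the topic, database, and field names are invented for illustration rather than taken from the role.

```python
# Minimal ETL sketch: consume JSON events from Kafka, enrich them, and load them into MongoDB.
# Assumes Kafka at localhost:9092 and MongoDB at localhost:27017, with kafka-python
# and pymongo installed; topic and collection names are placeholders.
import json
from datetime import datetime, timezone

from kafka import KafkaConsumer
from pymongo import MongoClient

consumer = KafkaConsumer(
    "raw-events",  # example source topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

collection = MongoClient("mongodb://localhost:27017")["demo"]["enriched_events"]

for message in consumer:
    event = message.value
    # Simple enrichment: stamp the ingest time and normalise the source field.
    event["ingested_at"] = datetime.now(timezone.utc).isoformat()
    event["source"] = str(event.get("source", "unknown")).lower()
    collection.insert_one(event)
```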
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a … of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization Strong experience with Apache NiFi for building and managing complex data flows and integration processes Knowledge of security practices for handling sensitive data, including encryption, anonymization, and …
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Gemba Advantage
and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies, such as NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, such as infrastructure as code and GitOps Database technologies, e.g. …
is required. Application Development and Deployment: Utilize NodeJS, ReactJS, and RESTful APIs to develop and deploy scalable applications. Ensure seamless integration with Elasticsearch and Apache NiFi. CI/CD Pipelines: Design and implement CI/CD pipelines using Jenkins, Kubernetes, and OpenShift. Automate testing, building, and deployment processes to … for orchestration. Solid understanding of CI/CD principles and experience with Jenkins. Strong scripting skills (Bash, Python, Ruby). Familiarity with Elasticsearch and Apache NiFi is desirable. Desirable Certifications AWS Certifications (preferably AWS Certified DevOps Engineer - Professional or higher). Red Hat Certified Engineer (RHCE) or equivalent …
CD: Jenkins, GitLab CI/CD, Terraform Cloud & Infrastructure: AWS Testing & Quality: Cucumber, SonarQube Monitoring & Logging: ELK Stack (Elasticsearch, Logstash, Kibana), Grafana Dataflow & Integration: Apache NiFi Experience across multiple areas is desirable; we don't expect you to know everything but a willingness to learn and contribute across …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
MongoDB, and React Familiarity with cloud platforms (especially AWS), microservices, and containerisation DV clearance (or eligibility to obtain it) Nice to Have: Experience with Apache NiFi, JSF, Hibernate, Elasticsearch, Kibana, or AWS services like EC2, Lambda, EKS CI/CD pipeline expertise using GitLab Knowledge of secure, scalable …
Cheltenham, England, United Kingdom Hybrid / WFH Options
Babcock
with frameworks like JUnit or Jest and using Git for version control. Experience with CI/CD tools like GitLab, Jenkins, Concourse, or even Apache NiFi, Elasticsearch, or Kibana. Agile development experience (SCRUM, Kanban). Building software for the cloud (we're especially keen if you know …
to obtain DV level security clearance *** Responsibilities: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Familiarity with containerization and orchestration tools such as Docker and Kubernetes. Experience …
Agile (Scrum) methodologies Database experience with Oracle and/or MongoDB Experience using the Atlassian suite: Bitbucket, Jira, and Confluence Desirable Skills Knowledge of Apache NiFi Front-end development with React (JavaScript/TypeScript) Working knowledge of Elasticsearch and Kibana Experience developing for cloud environments, particularly AWS (EC2 …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Omega Resource Group
development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting …
stack Ability to pro-actively manage own tasking and react to higher priority tasks Experience of Linux server administration Experience in technologies such as Apache NiFi, MinIO/AWS S3 Experience managing and patching Java and Python applications Experience with containerisation technologies such as Docker/Podman and …
Jenkins, Bamboo, Concourse etc. Monitoring utilising products such as: Prometheus, Grafana, ELK, Filebeat etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Edge technologies e.g. NGINX, HAProxy etc. Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would … for Data DevOps Engineer: Jupyter Hub Awareness MinIO or similar S3 storage technology Trino/Presto RabbitMQ or other common queue technology e.g. ActiveMQ NiFi Rego Familiarity with code development, shell-scripting in Python, Bash etc. As a Cloud Technical Architect, you will be responsible for: Support the design …
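For the MinIO/S3-compatible storage mentioned above, the object store is driven through the standard S3 API, so a generic S3 client works against it. Below is a minimal Python sketch using boto3; the endpoint, credentials, bucket, and object key are placeholders rather than details from the role.

```python
# Minimal sketch: write and list objects in MinIO via its S3-compatible API.
# Assumes a MinIO endpoint at http://localhost:9000 with the default demo credentials
# and boto3 installed; bucket and key names are illustrative only.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",  # MinIO serves the S3 API on its own endpoint
    aws_access_key_id="minioadmin",
    aws_secret_access_key="minioadmin",
)

bucket = "pipeline-artefacts"
try:
    s3.create_bucket(Bucket=bucket)
except ClientError:
    pass  # bucket already exists

s3.put_object(Bucket=bucket, Key="reports/daily.json", Body=b'{"status": "ok"}')

for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
    print(obj["Key"], obj["Size"])
```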
Splunk, etc.). Strong and demonstrable experience writing regular expressions and/or JSON parsing, etc. Strong experience in log processing (Cribl, Splunk, Elastic, Apache NiFi etc.). Expertise in the production of dashboard/insight delivery. Be able to demonstrate a reasonable level of security awareness (An …
and a commitment to delivering high-quality solutions. -Familiarity with Hyper-converged Infrastructure. Desired Competencies: -Knowledge of cross-domain technologies (Tiger Traps, Garrison, OpsWat, NiFi). -Familiarity with DevOps tools (GitLab, Harbor, FluxCD, Kubernetes-based containerisation). -Experience with Elastic for logging and analytics. This is an exciting opportunity …
Extract, Load, Transform (ELT) workflows to move data from source systems to data stores. You will have used one or more supporting technologies, e.g. Apache Kafka, NiFi, Spark, Flink, or Airflow. A history of working with SQL and NoSQL type databases (PostgreSQL, Mongo, ElasticSearch, Accumulo, or Neo4j).
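When Airflow is the orchestrator, an ELT workflow like the one described above is typically expressed as a DAG of dependent tasks. The sketch below is a minimal, illustrative DAG assuming Apache Airflow 2.x (2.4+ for the schedule argument); the task bodies are stubs standing in for real extract, load, and transform logic.

```python
# Minimal ELT-style Airflow DAG: extract, load, then transform, run daily.
# Assumes Apache Airflow 2.x is installed; DAG id, schedule, and task bodies
# are illustrative placeholders, not a real pipeline definition.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull new records from the source system")


def load():
    print("land the raw records in the target data store")


def transform():
    print("apply transformations inside the target store (the T in ELT)")


with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run the three stages strictly in order.
    extract_task >> load_task >> transform_task
```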
Microsoft, Security). -Strong communication and teamwork skills. -Familiarity with Hyper-converged Infrastructure. Desired Competencies: -Understanding of cross-domain technologies (Tiger Traps, Garrison, OpsWat, NiFi). -Exposure to DevOps tools (GitLab, Harbor, FluxCD, Kubernetes). -Familiarity with Elastic for monitoring and analytics. This is an exciting opportunity to work …
Microsoft, Security). Strong communication and teamwork skills. Familiarity with Hyper-converged Infrastructure. Desired Competencies Understanding of cross-domain technologies (Tiger Traps, Garrison, OpsWat, NiFi). Exposure to DevOps tools (GitLab, Harbor, FluxCD, Kubernetes). Familiarity with Elastic for monitoring and analytics. This is an exciting opportunity to work …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable: Jupyter Hub Awareness RabbitMQ … or other common queue technology e.g. ActiveMQ NiFi Rego Familiarity with code development, shell-scripting in Python, Bash etc. To apply for this DV Cleared DevOps Engineer job, please click the button below and submit your latest CV. Curo Services endeavour to respond to all applications. However, this may …
Gloucester, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse … DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE
InfoBlox experience is highly desirable). - Excellent problem-solving, analytical, and communication skills. Desired Competencies: -Understanding of cross-domain technologies (Tiger Traps, Garrison, OpsWat, NiFi). -Familiarity with VMware infrastructure technologies. -Experience using Elastic for monitoring and analytics. This is an exciting opportunity to work with cutting-edge network …
Bristol, England, United Kingdom Hybrid / WFH Options
Ripjar
for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & NiFi are used to co-ordinate the processing of data, while Jenkins, Jira, Confluence and GitHub are used as support tools. We use Ansible to … using Hadoop stack technologies such as HDFS and HBase Experience using MongoDB and Elasticsearch for indexing smaller datasets would be beneficial Experience using Airflow & NiFi to co-ordinate the processing of data would be beneficial You will be using Ansible to manage configuration and deployments Salary and Benefits: Salary …