Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
exciting, and critical challenges to the UK's digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours.

The role will include:
- Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi
- Implement data ingestion, transformation, and integration processes, ensuring data quality and security
- Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards
- Manage and … experience working as a Data Engineer in secure or regulated environments
- Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization
- Strong experience with Apache NiFi for building and managing complex data flows and integration processes
- Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control
- Familiarity with data …
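As a rough illustration of the ingest-and-index work this listing describes, here is a minimal Python sketch using the official elasticsearch client (v8.x). The endpoint, index name, and record fields are invented for the example and are not taken from the role.

```python
# Minimal sketch of an ingest-and-index step, assuming the official
# elasticsearch Python client (v8.x). Endpoint, index name, and fields
# are hypothetical.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

# A secure deployment would use TLS and API-key auth instead of plain HTTP.
es = Elasticsearch("http://localhost:9200")

def index_event(raw: dict) -> None:
    """Apply a trivial transformation, then index the record."""
    doc = {
        "message": raw.get("message", "").strip(),
        "source": raw.get("source", "unknown"),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
    es.index(index="secure-events", document=doc)  # index name is invented

index_event({"message": " sensor heartbeat ", "source": "gateway-01"})
```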
systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc.
- Monitoring utilising products such as: Prometheus, Grafana, ELK, Filebeat etc.
- Observability - SRE
- Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem
- Edge technologies e.g. NGINX, HAProxy etc.
- Excellent knowledge of YAML or similar languages

The following Technical Skills & Experience would be desirable for Data DevOps … Engineer:
- Jupyter Hub awareness
- Minio or similar S3 storage technology
- Trino/Presto
- RabbitMQ or other common queue technology e.g. ActiveMQ
- NiFi
- Rego
- Familiarity with code development, shell-scripting in Python, Bash etc.

As a Cloud Technical Architect, you will be responsible for: Support the design and development of new capabilities, preparing solution options, investigating technology, designing and running …
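Where the listing mentions monitoring with Prometheus and Grafana, the sketch below shows the usual pattern of exposing custom metrics from a Python service for Prometheus to scrape. The metric names and the simulated reading are illustrative assumptions.

```python
# Hypothetical sketch: exposing pipeline metrics for Prometheus to scrape.
# Metric names and the simulated queue reading are illustrative only.
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

RECORDS_PROCESSED = Counter("records_processed_total", "Records processed")
QUEUE_DEPTH = Gauge("ingest_queue_depth", "Items waiting in the ingest queue")

if __name__ == "__main__":
    start_http_server(8000)  # metrics become available at :8000/metrics
    while True:
        RECORDS_PROCESSED.inc()                 # one record handled
        QUEUE_DEPTH.set(random.randint(0, 50))  # stand-in for a real probe
        time.sleep(1)
```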
Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
- large-scale data pipelines in secure or regulated environments
- Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana)
- Build and maintain robust data flows with Apache NiFi
- Implement best practices for handling sensitive data, including encryption, anonymisation, and access control
- Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability
- Write …

Skills and Experience:
- 3+ years' experience as a Data Engineer in secure, regulated, or mission-critical environments
- Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana)
- Solid experience with Apache NiFi
- Strong understanding of data security, governance, and compliance requirements
- Experience building real-time, large-scale data pipelines
- Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly … stakeholder management skills
- Detail-oriented with a strong focus on data accuracy, quality, and reliability

Desirable (Nice to Have):
- Background in defence, government, or highly regulated sectors
- Familiarity with Apache Kafka, Spark, or Hadoop
- Experience with Docker and Kubernetes
- Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK
- Understanding of machine learning algorithms and data science …
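The encryption and anonymisation duties named above typically combine one-way hashing for identifiers with reversible encryption for recoverable fields. A hedged sketch, with salt and key handling deliberately simplified for illustration:

```python
# Illustrative sketch of two controls the listing names: irreversible
# anonymisation (salted hashing) and reversible encryption. Real systems
# would source the salt and key from a secrets manager with rotation.
import hashlib

from cryptography.fernet import Fernet  # pip install cryptography

SALT = b"example-salt"  # hypothetical; never hard-code in practice
fernet = Fernet(Fernet.generate_key())

def anonymise(value: str) -> str:
    """One-way pseudonymisation, e.g. for join keys."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()

def encrypt(value: str) -> bytes:
    """Reversible encryption for fields that must be recoverable."""
    return fernet.encrypt(value.encode())

print(anonymise("jane.doe@example.com"))
print(fernet.decrypt(encrypt("Jane Doe")).decode())
```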
Bath, South West England, United Kingdom Hybrid / WFH Options
LHH
- large-scale data pipelines in secure or regulated environments
- Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana)
- Build and maintain robust data flows with Apache NiFi
- Implement best practices for handling sensitive data, including encryption, anonymisation, and access control
- Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability
- Write …

Skills and Experience:
- 3+ years' experience as a Data Engineer in secure, regulated, or mission-critical environments
- Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana)
- Solid experience with Apache NiFi
- Strong understanding of data security, governance, and compliance requirements
- Experience building real-time, large-scale data pipelines
- Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly … stakeholder management skills
- Detail-oriented with a strong focus on data accuracy, quality, and reliability

Desirable (Nice to Have):
- Background in defence, government, or highly regulated sectors
- Familiarity with Apache Kafka, Spark, or Hadoop
- Experience with Docker and Kubernetes
- Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK
- Understanding of machine learning algorithms and data science …
Bradley Stoke, South West England, United Kingdom Hybrid / WFH Options
LHH
- large-scale data pipelines in secure or regulated environments
- Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana)
- Build and maintain robust data flows with Apache NiFi
- Implement best practices for handling sensitive data, including encryption, anonymisation, and access control
- Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability
- Write …

Skills and Experience:
- 3+ years' experience as a Data Engineer in secure, regulated, or mission-critical environments
- Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana)
- Solid experience with Apache NiFi
- Strong understanding of data security, governance, and compliance requirements
- Experience building real-time, large-scale data pipelines
- Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly … stakeholder management skills
- Detail-oriented with a strong focus on data accuracy, quality, and reliability

Desirable (Nice to Have):
- Background in defence, government, or highly regulated sectors
- Familiarity with Apache Kafka, Spark, or Hadoop
- Experience with Docker and Kubernetes
- Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK
- Understanding of machine learning algorithms and data science …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
etc.
- Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc.
- Observability - SRE
- Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem
- Excellent knowledge of YAML or similar languages

The following Technical Skills & Experience would be desirable:
- Jupyter Hub awareness
- RabbitMQ or other common queue technology … e.g. ActiveMQ
- NiFi
- Rego
- Familiarity with code development, shell-scripting in Python, Bash etc.

To apply for this DV Cleared DevOps Engineer job, please click the button below and submit your latest CV. Curo Services endeavour to respond to all applications. However, this may not always be possible during periods of high volume. Thank you for your patience.
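Since RabbitMQ (or ActiveMQ) appears among the desirable skills, a minimal publish/consume round trip with the pika client may help orient readers. The queue name and broker host are assumptions.

```python
# Hedged sketch of a RabbitMQ round trip with the pika client. The queue
# name and broker host are assumptions for the example.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="ingest", durable=True)  # hypothetical queue

# Publish a message to the default exchange, routed by queue name.
channel.basic_publish(exchange="", routing_key="ingest", body=b"hello")

def on_message(ch, method, properties, body):
    print(f"received: {body!r}")
    ch.basic_ack(delivery_tag=method.delivery_tag)  # ack after handling

channel.basic_consume(queue="ingest", on_message_callback=on_message)
channel.start_consuming()  # blocks until interrupted
```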
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Gemba Advantage
such as Java, TypeScript, Python, and Go
- Web libraries and frameworks such as React and Angular
- Designing, building, and maintaining CI/CD pipelines
- Big data technologies, such as NiFi, Hadoop, Spark
- Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker
- DevOps methodologies, such as infrastructure as code and GitOps
- Database technologies, e.g. relational databases, Elasticsearch, Mongo

Why …
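For the big-data items above (NiFi, Hadoop, Spark), a short PySpark batch step gives a flavour of the work. The HDFS paths and the "timestamp"/"source" fields are invented for illustration.

```python
# A short PySpark batch step in the spirit of the stack above. Paths and
# field names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-counts").getOrCreate()

events = spark.read.json("hdfs:///data/events/")  # hypothetical input path

daily = (
    events
    .withColumn("day", F.to_date("timestamp"))  # truncate to calendar day
    .groupBy("day", "source")
    .agg(F.count("*").alias("event_count"))
)

daily.write.mode("overwrite").parquet("hdfs:///data/daily_counts/")
spark.stop()
```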
and Ansible for infrastructure automation is required.
- Application Development and Deployment: Utilize NodeJS, ReactJS, and RESTful APIs to develop and deploy scalable applications. Ensure seamless integration with Elasticsearch and Apache NiFi.
- CI/CD Pipelines: Design and implement CI/CD pipelines using Jenkins, Kubernetes, and OpenShift. Automate testing, building, and deployment processes to ensure efficiency and quality.
- Containerization …
- Proficiency in Kubernetes and OpenShift for orchestration.
- Solid understanding of CI/CD principles and experience with Jenkins.
- Strong scripting skills (Bash, Python, Ruby).
- Familiarity with Elasticsearch and Apache NiFi is desirable.

Desirable Certifications:
- AWS Certifications (preferably AWS Certified DevOps Engineer - Professional or higher).
- Red Hat Certified Engineer (RHCE) or equivalent certification.
- Excellent problem-solving abilities …
such as GitLab CI, Terraform, Ansible, PowerShell, and Python.
- Build, configure, and maintain Kubernetes clusters, ensuring high availability, scalability, and security.
- Deploy and support key applications such as Apache NiFi, Elastic ECK, and JFrog Artifactory.
- Develop and manage Helm Charts to standardise application deployment.
- Integrate with and manage HashiCorp Vault for secure secrets management and token …
- Strong understanding of Kubernetes design, configuration, and operations in production environments.
- Experience managing applications with Helm Charts and supporting microservices-based architecture.
- Applications & Services: Direct experience deploying and supporting Apache NiFi, Elastic ECK (ElasticSearch on Kubernetes), and Artifactory. Familiarity with container image management and application lifecycle support.
- Security & Infrastructure: Use of HashiCorp Vault for managing secrets, certificates …
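The Vault integration described above usually reduces to reading managed secrets at deploy or run time. A sketch with the hvac client, where the address, mount point, path, and key are assumptions:

```python
# Sketch of reading a secret from HashiCorp Vault with the hvac client.
# The address, token injection, mount point, path, and key are assumptions.
import hvac

client = hvac.Client(url="https://vault.example.internal:8200")
client.token = "s.example-token"  # normally injected, never hard-coded

assert client.is_authenticated()

# KV v2 read: the payload sits under data -> data.
secret = client.secrets.kv.v2.read_secret_version(
    mount_point="secret", path="pipelines/elastic"
)
password = secret["data"]["data"]["password"]
```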
Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to move data from source systems to data stores. You will have used one or more supporting technologies, e.g. Apache Kafka, NiFi, Spark, Flink, or Airflow. A history of working with SQL and NoSQL type databases (PostgreSQL, Mongo, ElasticSearch, Accumulo, or Neo4j). You will be able to …
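As a hedged illustration of the ETL workflows this role describes (extract from a SQL store, transform, load into a NoSQL store), here is a minimal Python sketch. Connection strings, table, and collection names are invented.

```python
# Minimal ETL sketch: extract rows from PostgreSQL, apply a trivial
# transform, load into MongoDB. All names are hypothetical.
import psycopg2                  # pip install psycopg2-binary
from pymongo import MongoClient  # pip install pymongo

pg = psycopg2.connect("dbname=source user=etl host=localhost")
mongo = MongoClient("mongodb://localhost:27017")
target = mongo["warehouse"]["events"]

with pg.cursor() as cur:
    cur.execute("SELECT id, payload, created_at FROM raw_events")
    for row_id, payload, created_at in cur:
        target.insert_one({
            "_id": row_id,
            "payload": payload.strip(),            # transform: tidy a text column
            "created_at": created_at.isoformat(),
        })

pg.close()
mongo.close()
```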
Bash, Ansible
- DevOps & CI/CD: Jenkins, GitLab CI/CD, Terraform
- Cloud & Infrastructure: AWS
- Testing & Quality: Cucumber, SonarQube
- Monitoring & Logging: ELK Stack (Elasticsearch, Logstash, Kibana), Grafana
- Dataflow & Integration: Apache NiFi

Experience across multiple areas is desirable; we don't expect you to know everything but a willingness to learn and contribute across the stack is key.
Gloucester, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
hybrid working when possible
- Must hold active Enhanced DV Clearance (West)
- Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus
- Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL

Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data.

Responsibilities include:
- Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi).
- Ensure data consistency, quality, and security across all processes.
- Create and maintain database schemas and data models.
- Integrate and enrich data from diverse sources, maintaining data integrity.
- Maintain … hearing from you.

KEY SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE

Seniority level: Not Applicable. Employment type: Full-time.
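For orientation, the Kafka ingestion requirement named above often looks like the following consumer loop in Python (kafka-python client). The topic, consumer group, and broker address are assumptions.

```python
# Minimal Kafka consumer loop with the kafka-python client. The topic,
# group, and broker address are assumptions for the example.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "raw-events",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="ingest-pipeline",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # Enrichment and quality checks would happen here before persistence.
    print(record.get("id"), record.get("source"))
```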
Cheltenham, South West England, United Kingdom Hybrid / WFH Options
Anson McCade
tools like JUnit, Git, Jira, MongoDB, and React
- Familiarity with cloud platforms (especially AWS), microservices, and containerisation
- DV clearance (or eligibility to obtain it)

Nice to Have:
- Experience with Apache NiFi, JSF, Hibernate, Elasticsearch, Kibana, or AWS services like EC2, Lambda, EKS
- CI/CD pipeline expertise using GitLab
- Knowledge of secure, scalable architectures for cloud deployments

O.K. …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Anson McCade
tools like JUnit, Git, Jira, MongoDB, and React
- Familiarity with cloud platforms (especially AWS), microservices, and containerisation
- DV clearance (or eligibility to obtain it)

Nice to Have:
- Experience with Apache NiFi, JSF, Hibernate, Elasticsearch, Kibana, or AWS services like EC2, Lambda, EKS
- CI/CD pipeline expertise using GitLab
- Knowledge of secure, scalable architectures for cloud deployments

O.K. …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hybrid working when possible
- Must hold active Enhanced DV Clearance (West)
- Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus
- Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL

Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data.

Responsibilities include:
- Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi).
- Ensure data consistency, quality, and security across all processes.
- Create and maintain database schemas and data models.
- Integrate and enrich data from diverse sources, maintaining data integrity.
- Maintain … maintain optimal operation.

The Data Engineer Should Have:
- Active eDV clearance (West)
- Willingness to work full time on site in Gloucester when required.
- Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java

To be Considered: Please either apply by clicking online …
Gloucester, South West England, United Kingdom Hybrid / WFH Options
Searchability NS&D
hybrid working when possible
- Must hold active Enhanced DV Clearance (West)
- Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus
- Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL

Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data.

Responsibilities include:
- Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi).
- Ensure data consistency, quality, and security across all processes.
- Create and maintain database schemas and data models.
- Integrate and enrich data from diverse sources, maintaining data integrity.
- Maintain … maintain optimal operation.

The Data Engineer Should Have:
- Active eDV clearance (West)
- Willingness to work full-time on-site in Gloucester when required.
- Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java

To be Considered: Please either apply by clicking online …
Cheltenham, South West England, United Kingdom Hybrid / WFH Options
Searchability NS&D
hybrid working when possible
- Must hold active Enhanced DV Clearance (West)
- Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus
- Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL

Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data.

Responsibilities include:
- Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi).
- Ensure data consistency, quality, and security across all processes.
- Create and maintain database schemas and data models.
- Integrate and enrich data from diverse sources, maintaining data integrity.
- Maintain … maintain optimal operation.

The Data Engineer Should Have:
- Active eDV clearance (West)
- Willingness to work full-time on-site in Gloucester when required.
- Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java

To be Considered: Please either apply by clicking online …
Cheltenham, England, United Kingdom Hybrid / WFH Options
Babcock
MongoDB. Building and testing with frameworks like JUnit or Jest and using Git for version control. Experience with CI/CD tools like GitLab, Jenkins, Concourse, or even Apache NiFi, Elasticsearch, or Kibana. Agile development experience (SCRUM, Kanban). Building software for the cloud (we're especially keen if you know AWS, or have worked with …
must hold or be eligible to obtain DV level security clearance ***

Responsibilities:
- Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
- Implement data ingestion, transformation, and integration processes, ensuring data quality and security.
- Collaborate with data architects and security teams to ensure compliance with security policies and data governance …
- Experience working as a Data Engineer in secure or regulated environments.
- Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization.
- Strong experience with Apache NiFi for building and managing complex data flows and integration processes.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
- Experience with monitoring and alerting tools …
pipelines
- Hands-on experience with Agile (Scrum) methodologies
- Database experience with Oracle and/or MongoDB
- Experience using the Atlassian suite: Bitbucket, Jira, and Confluence

Desirable Skills:
- Knowledge of Apache NiFi
- Front-end development with React (JavaScript/TypeScript)
- Working knowledge of Elasticsearch and Kibana
- Experience developing for cloud environments, particularly AWS (EC2, EKS, Fargate, IAM, S3, Lambda …
Bristol, England, United Kingdom Hybrid / WFH Options
Ripjar
Python (specifically pyspark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & NiFi are used to co-ordinate the processing of data, while Jenkins, Jira, Confluence and GitHub are used as support tools. We use Ansible to manage configuration and deployments.

Most … processing data
- You will be using Hadoop stack technologies such as HDFS and HBase
- Experience using MongoDB and Elasticsearch for indexing smaller datasets would be beneficial
- Experience using Airflow & NiFi to co-ordinate the processing of data would be beneficial
- You will be using Ansible to manage configuration and deployments

Salary And Benefits: Salary DOE, 25 days annual leave …
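Where the stack above uses Airflow to co-ordinate processing, a minimal DAG sketch looks like the following. The DAG id, schedule, and task body are illustrative, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# Minimal Airflow DAG sketch; all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def process_batch():
    # Stand-in for the real work, e.g. submitting a pyspark job.
    print("processing batch...")

with DAG(
    dag_id="nightly_processing",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="process_batch", python_callable=process_batch)
```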
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Omega Resource Group
GitLab)
- Contributing across the software development lifecycle from requirements to deployment

Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite

What's on Offer:
- Hybrid working and flexible schedules (4xFlex)
- Ongoing training and career development
- Exciting projects within the UK's …