London, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies … such as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and … Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like …
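The workflow automation this posting describes centres on tools such as Apache Airflow. Purely as a hedged sketch of that kind of orchestration (the DAG id, schedule, and task logic below are illustrative assumptions, not details from the listing), a minimal Airflow DAG chaining an extract step to a load step might look like this:

```python
# Minimal Apache Airflow DAG sketch: two chained tasks on a daily schedule.
# The dag_id, task ids and the extract/load logic are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull records from a source system (e.g. an API or database).
    return [{"id": 1, "value": "example"}]


def load(**context):
    # Placeholder: read the upstream task's output and write it to a target store.
    records = context["ti"].xcom_pull(task_ids="extract")
    print(f"Loaded {len(records)} records")


with DAG(
    dag_id="example_hybrid_dataflow",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```

In practice the ingestion step would more likely hand off to NiFi or Kafka rather than pass data through XCom, but the dependency chaining shown here is the core Airflow pattern the role refers to.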
reliable and secure network traffic distribution. •Troubleshoot and resolve network-related issues promptly. Collaborate with network teams to ensure optimal network performance and security. Apache NiFi Management: •Manage and maintain Apache NiFi data flow management systems. •Ensure the reliability, performance, and security of data flows within … systems. •Strong understanding of container orchestration platforms (K8S, K3S, EKS, AKS, GKE). •Experience with network management, particularly using F5 load balancers. •Familiarity with Apache NiFi and data flow management. •Strong understanding of networking, storage and virtualisation technologies. You will hold active HMG HLC Clearance and must be …
Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data Ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning …
Hounslow, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine …
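As a small, hedged sketch of the ingest-and-index work this listing describes (the host, credentials, index name and document fields are assumptions, not details from the posting), the official Elasticsearch Python client can be used to index and query a document:

```python
# Sketch: index a document into Elasticsearch and run a simple search.
# Host, credentials, index name and document fields are illustrative assumptions.
from elasticsearch import Elasticsearch

# In a secured cluster you would normally supply TLS settings and an API key
# or basic auth rather than connecting anonymously.
es = Elasticsearch("https://localhost:9200", api_key="REPLACE_ME", verify_certs=True)

doc = {
    "source": "sensor-42",
    "message": "example event",
    "@timestamp": "2024-01-01T00:00:00Z",
}

# Index the document; Elasticsearch assigns an _id automatically.
es.index(index="events", document=doc)

# Refresh so the document is searchable immediately (convenient for a demo,
# not something you would do on every write in production).
es.indices.refresh(index="events")

# Simple match query against the message field.
resp = es.search(index="events", query={"match": {"message": "example"}})
print(resp["hits"]["total"])
```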
London, England, United Kingdom Hybrid / WFH Options
LHH
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine …
Ansible, PowerShell, and Python. Build, configure, and maintain Kubernetes clusters, ensuring high availability, scalability, and security. Deploy and support key applications such as Apache NiFi, Elastic ECK, and JFrog Artifactory. Develop and manage Helm Charts to standardise application deployment. Integrate with and manage HashiCorp Vault for … configuration, and operations in production environments. Experience managing applications with Helm Charts and supporting microservices-based architecture. Applications & Services: Direct experience deploying and supporting Apache NiFi, Elastic ECK (ElasticSearch on Kubernetes), and Artifactory. Familiarity with container image management and application lifecycle support. Security & Infrastructure: Use of HashiCorp …
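For the HashiCorp Vault integration mentioned above, a minimal hedged sketch using the hvac Python client might read a secret at deploy time. The Vault address, mount point, secret path and key names here are assumptions for illustration only:

```python
# Sketch: read a secret from HashiCorp Vault's KV v2 engine with hvac.
# The Vault address, token, mount point and secret path are illustrative assumptions.
import os

import hvac

client = hvac.Client(
    url=os.environ.get("VAULT_ADDR", "https://vault.example.internal:8200"),
    token=os.environ["VAULT_TOKEN"],
)

if not client.is_authenticated():
    raise SystemExit("Vault authentication failed")

# Read a versioned secret, e.g. credentials an application needs at deploy time.
secret = client.secrets.kv.v2.read_secret_version(
    path="apps/nifi/registry-credentials",
    mount_point="secret",
)
username = secret["data"]["data"]["username"]
password = secret["data"]["data"]["password"]
print(f"Fetched credentials for {username}")
```

In a Kubernetes deployment of the kind described, the same secrets would more often be injected via the Vault Agent sidecar or CSI provider; the direct client call above is just the simplest way to show the lookup.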
and implementing CI/CD pipelines. Required Technical Skills Proficiency with AWS, RHEL, Terraform, and Ansible Experience with NodeJS, ReactJS, RESTful APIs, Elasticsearch, and Apache NiFi Design and implementation of CI/CD pipelines with Jenkins, Kubernetes, and OpenShift Containerization skills using Docker, Podman, and Buildah Additional Requirements … s degree or equivalent experience Experience as a DevOps Engineer focused on AWS Strong scripting skills (Bash, Python, Ruby) Desirable: AWS Certifications, RHCE, Elasticsearch, Apache NiFi familiarity Job Details Seniority level: Mid-Senior level Employment type: Full-time Job function: Engineering and IT Industries: IT Services and Consulting …
mentoring your team and shaping best practices. Role Lead Technical Execution & Delivery - Design, build, and optimise data pipelines and data infrastructure using Snowflake, Hadoop, Apache NiFi, Spark, Python, and other technologies. - Break down business requirements into technical solutions and delivery plans. - Lead technical decisions, ensuring alignment with data … You Technical & Engineering Skills - Extensive demonstrable experience in data engineering, with expertise in building scalable data pipelines and infrastructure. - Deep understanding of Snowflake, Hadoop, Apache NiFi, Spark, Python, and other data technologies. - Strong experience with ETL/ELT processes and data transformation. - Proficiency in SQL, NoSQL, and data …
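As a brief, hedged sketch of the pipeline-building work this lead role describes (the file paths, column names and aggregation are invented for the example, not taken from the listing), a PySpark batch job that reads raw data, cleanses it and writes a curated output could look like this:

```python
# Sketch of a simple batch transformation in PySpark.
# Input/output paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_curation_job").getOrCreate()

# Read raw CSV data with a header row and inferred schema.
raw = spark.read.csv("s3a://example-bucket/raw/events/", header=True, inferSchema=True)

# Basic cleansing and a daily aggregate per source system.
curated = (
    raw.dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("source_system", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write the curated table as partitioned Parquet.
curated.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/daily_event_counts/"
)

spark.stop()
```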
London, England, United Kingdom Hybrid / WFH Options
Searchability
active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience required in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse … Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka Apache NiFi SQL and NoSQL databases (e.g. MongoDB) ETL processing languages such as Groovy, Python or Java To be Considered: Please …
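The ingestion and enrichment duties above combine Kafka, ETL code in Groovy/Python/Java, and stores such as MongoDB. A minimal hedged sketch in Python (the broker addresses, topic, database and field names are assumptions, not details from the advert) of that consume-enrich-store pattern:

```python
# Sketch: consume JSON events from Kafka, enrich them, and write to MongoDB.
# Broker addresses, topic, database and field names are illustrative assumptions.
import json

from kafka import KafkaConsumer
from pymongo import MongoClient

consumer = KafkaConsumer(
    "raw-events",
    bootstrap_servers=["localhost:9092"],
    group_id="etl-example",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

mongo = MongoClient("mongodb://localhost:27017")
collection = mongo["analytics"]["enriched_events"]

for message in consumer:
    event = message.value
    # Simple enrichment step: tag each record with its Kafka origin.
    event["_kafka"] = {
        "topic": message.topic,
        "partition": message.partition,
        "offset": message.offset,
    }
    collection.insert_one(event)
```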
London, England, United Kingdom Hybrid / WFH Options
Leonardo
the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a … of compressed hours. The Role Will Include Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and …
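On the operational side of maintaining NiFi data flows, status can be polled over the NiFi REST API. The sketch below is hedged: the host, the token handling, and the exact fields read from the response are assumptions based on NiFi's standard API rather than anything stated in this listing:

```python
# Sketch: poll the Apache NiFi REST API for overall flow status.
# Host, credentials and the response fields used are illustrative assumptions.
import requests

NIFI_API = "https://nifi.example.internal:8443/nifi-api"

# In a secured deployment this would use a bearer token (e.g. obtained from
# the access endpoint) or mutual TLS; a placeholder token is shown here.
headers = {"Authorization": "Bearer REPLACE_ME"}

resp = requests.get(f"{NIFI_API}/flow/status", headers=headers, timeout=10, verify=True)
resp.raise_for_status()

status = resp.json().get("controllerStatus", {})
print("Active threads:", status.get("activeThreadCount"))
print("Queued:", status.get("queued"))
```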
Python, and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies like NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, including infrastructure as code and GitOps Database technologies, e.g., relational …
London, England, United Kingdom Hybrid / WFH Options
Gemba Advantage
and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies, such as NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, such as infrastructure as code and GitOps Database technologies, e.g. …
BE NICE FOR THE BIG DATA ENGINEER TO HAVE.... Cloud based architectures Microservice architecture or server-less architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ Experience of DevSecOps automated deployment tools such as Jenkins, Ansible, Docker TO BE CONSIDERED.... Please either apply by clicking online … DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTIC SEARCH …
Spark, Kafka). Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). Expertise in ETL tools and processes (e.g., Apache NiFi, Talend, Informatica). Proficiency in data integration tools and technologies. Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) is …
British-born sole UK National with active SC or DV Clearance • Strong Java skills, familiarity with Python • Experience in Linux, Git, CI/CD, Apache NiFi • Knowledge of Oracle, MongoDB, React, Elasticsearch • Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) Active DV Clearance If you do not meet …
london (city of london), south east england, united kingdom
Anson McCade
British-born sole UK National with active SC or DV Clearance • Strong Java skills, familiarity with Python • Experience in Linux, Git, CI/CD, Apache NiFi • Knowledge of Oracle, MongoDB, React, Elasticsearch • Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) Active DV Clearance If you do not meet …
British-born sole UK National with active SC or DV Clearance Strong Java skills, familiarity with Python Experience in Linux, Git, CI/CD, Apache NiFi Knowledge of Oracle, MongoDB, React, Elasticsearch Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) If you do not meet all requirements still …
and Kafka. Technical Skills: Proficiency in data modelling (ERD, normalization) and data warehousing concepts. Strong understanding of ETL frameworks and tools (e.g., Talend, Informatica, Apache NiFi). Knowledge of programming languages such as SQL, Python, or Java. Experience with BI tools (e.g., Power BI, Tableau) and data visualisation …
WOULD BE NICE FOR THE DATA ENGINEER TO HAVE.... Cloud based architectures Microservice architecture or server-less architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to dominic.barbet@searchability.com. For further information please call … DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTIC SEARCH/LEAD BIG DATA ENGINEER/LEAD BIG DATA DEVELOPER …
NICE FOR THE LEAD BIG DATA ENGINEER TO HAVE... Cloud based architectures Microservice architecture or server-less architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ TO BE CONSIDERED... Please either apply by clicking online or emailing me directly at dominic.barbet@searchability.com. For further information, please …