London, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies … such as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and … Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like …
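The event-streaming pattern this listing asks for (handlers reacting to published events) can be illustrated with a minimal in-memory sketch. Real deployments would use a broker such as Kafka; the `InMemoryBus` class, topic name, and event shape below are invented for illustration only.

```python
from collections import defaultdict


class InMemoryBus:
    """A toy stand-in for an event broker: handlers subscribe to a
    topic, and every published event is delivered to each of them."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to every subscriber of this topic.
        for handler in self.handlers[topic]:
            handler(event)


bus = InMemoryBus()
seen = []
bus.subscribe("orders", seen.append)
bus.publish("orders", {"id": 1, "amount": 9.99})
```

Unlike a real broker, this sketch has no persistence, partitioning, or delivery guarantees; it only shows the publish/subscribe shape that tools like Kafka provide at scale.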
Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data Ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning …
quickly and apply new skills. Desirable: Solid understanding of microservices development. Working knowledge of SQL and NoSQL databases. Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, Serverless Computing, GraphQL, APIs, APIM. Good skills working with JSON, XML, YAML files. Knowledge in …
Bath, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine …
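The anonymisation requirement above can be sketched in miniature: one common technique is keyed pseudonymisation, where sensitive field values are replaced with HMAC digests before indexing. The field names, record shape, and hard-coded key below are illustrative assumptions, not part of any specific listing's stack.

```python
import hashlib
import hmac

# Hypothetical secret used to key the pseudonymisation. In a real
# deployment this would come from a secrets manager, never source code.
SECRET_KEY = b"example-only-key"


def pseudonymise(record, sensitive_fields):
    """Replace sensitive field values with keyed HMAC-SHA256 digests.

    The digest is deterministic, so the same input always maps to the
    same token: joins across records still work, but the raw value
    never reaches the search index.
    """
    out = {}
    for key, value in record.items():
        if key in sensitive_fields:
            digest = hmac.new(SECRET_KEY, str(value).encode(), hashlib.sha256)
            out[key] = digest.hexdigest()
        else:
            out[key] = value
    return out


doc = {"user_email": "alice@example.com", "event": "login", "status": "ok"}
safe_doc = pseudonymise(doc, {"user_email"})
```

Deterministic tokens preserve joinability but remain guessable if the key leaks, so key management matters as much as the hashing itself.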
Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Experience building real-time, large-scale data pipelines Working knowledge of cloud platforms … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine …
and implementing CI/CD pipelines. Required Technical Skills Proficiency with AWS, RHEL, Terraform, and Ansible Experience with NodeJS, ReactJS, RESTful APIs, Elasticsearch, and Apache NiFi Design and implementation of CI/CD pipelines with Jenkins, Kubernetes, and OpenShift Containerization skills using Docker, Podman, and Buildah Additional Requirements … s degree or equivalent experience Experience as a DevOps Engineer focused on AWS Strong scripting skills (Bash, Python, Ruby) Desirable: AWS Certifications, RHCE, Elasticsearch, Apache NiFi familiarity Job Details Seniority level: Mid-Senior level Employment type: Full-time Job function: Engineering and IT Industries: IT Services and Consulting …
Ansible, PowerShell, and Python. Build, configure, and maintain Kubernetes clusters, ensuring high availability, scalability, and security. Deploy and support key applications such as Apache NiFi, Elastic ECK, and JFrog Artifactory. Develop and manage Helm Charts to standardise application deployment. Integrate with and manage HashiCorp Vault for … configuration, and operations in production environments. Experience managing applications with Helm Charts and supporting microservices-based architecture. Applications & Services: Direct experience deploying and supporting Apache NiFi, Elastic ECK (Elasticsearch on Kubernetes), and Artifactory. Familiarity with container image management and application lifecycle support. Security & Infrastructure: Use of HashiCorp …
mentoring your team and shaping best practices. Role Lead Technical Execution & Delivery - Design, build, and optimise data pipelines and data infrastructure using Snowflake, Hadoop, Apache NiFi, Spark, Python, and other technologies. - Break down business requirements into technical solutions and delivery plans. - Lead technical decisions, ensuring alignment with data … You Technical & Engineering Skills - Extensive demonstrable experience in data engineering, with expertise in building scalable data pipelines and infrastructure. - Deep understanding of Snowflake, Hadoop, Apache NiFi, Spark, Python, and other data technologies. - Strong experience with ETL/ELT processes and data transformation. - Proficiency in SQL, NoSQL, and data …
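The ETL/ELT distinction this listing mentions can be sketched with a toy example: in ELT, raw data is loaded first and transformed inside the warehouse with SQL. Here the standard-library `sqlite3` stands in for a warehouse such as Snowflake, and the table and column names are invented for illustration.

```python
import sqlite3

# An in-memory SQLite database stands in for the warehouse.
conn = sqlite3.connect(":memory:")

# "L" step: load raw data as-is, with no cleaning on the way in.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_pence INTEGER)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [(1, 1250), (2, 399), (3, 10000)],
)

# "T" step: the transformation runs inside the database, deriving a
# cleaned table with amounts converted from pence to pounds.
conn.execute(
    """
    CREATE TABLE orders AS
    SELECT id, amount_pence / 100.0 AS amount_gbp
    FROM raw_orders
    """
)
rows = conn.execute("SELECT id, amount_gbp FROM orders ORDER BY id").fetchall()
```

A classic ETL pipeline would instead transform the records in application code (Python, Spark) before loading; ELT defers that work to the warehouse's SQL engine.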
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse … Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java. To be Considered: Please …
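The "integrate and enrich data from diverse sources" responsibility above can be sketched in miniature: parse raw events as they arrive, then join each one against reference data from another source. The record shapes, sensor IDs, and lookup table below are illustrative assumptions rather than any specific pipeline.

```python
import json

# Raw events as they might arrive from a message stream (e.g. Kafka);
# here they are plain JSON strings.
raw_events = [
    '{"sensor_id": "s1", "reading": 21.5}',
    '{"sensor_id": "s2", "reading": 19.0}',
]

# Reference data from a second source, used to enrich each event.
sensor_locations = {"s1": "Gloucester", "s2": "Bristol"}


def ingest_and_enrich(lines, locations):
    """Parse each JSON event and attach its sensor's location,
    falling back to "unknown" when no reference entry exists."""
    enriched = []
    for line in lines:
        event = json.loads(line)
        event["location"] = locations.get(event["sensor_id"], "unknown")
        enriched.append(event)
    return enriched


events = ingest_and_enrich(raw_events, sensor_locations)
```

In a production flow this parse-and-lookup step would typically live inside a NiFi processor or a stream-processing job rather than a plain loop, but the enrichment logic is the same shape.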
London, England, United Kingdom Hybrid / WFH Options
Searchability
active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience required in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse … Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java. To be Considered: Please …
Dental, Vision, 401k with company matching, and life insurance. Rate: $80 - $86/hr W2 Responsibilities: Develop, optimize, and maintain data ingestion flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL. Develop within AWS cloud services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena. Coordinate … learning techniques. Proficiency in programming languages such as Python, R, and Java. Experience in building modern data pipelines and ETL processes with tools like Apache Kafka and Apache NiFi. Proficiency in Java, Scala, or Python programming. Experience managing or testing API Gateway tools and REST APIs. Knowledge of …
clean, secure code following a test-driven approach. Monitor and maintain - Monitor data systems for performance issues and make any necessary updates. Required Skills: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java. Built on over a …
London, England, United Kingdom Hybrid / WFH Options
Leonardo
the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a … of compressed hours. The Role Will Include Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and …
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a … of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization Strong experience with Apache NiFi for building and managing complex data flows and integration processes Knowledge of security practices for handling sensitive data, including encryption, anonymization, and …
Manchester, England, United Kingdom Hybrid / WFH Options
Gaming Innovation Group
Object-oriented programming (Java) Data modeling using various database technologies ETL processes (transferring data in-memory, moving away from traditional ETLs) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerized applications Using distributed version control systems Being an …
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Gaming Innovation Group
at: Object-oriented programming (Java) Data modelling using any database technologies ETL processes (ETLs are old school, we transfer in memory now) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerised applications Used distributed version control systems Excellent team player …
processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. PostgreSQL … SQL Server) Data Integration Tools: Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g. Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience …
Python, and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies like NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, including infrastructure as code and GitOps Database technologies, e.g., relational …
London, England, United Kingdom Hybrid / WFH Options
Gemba Advantage
and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies, such as NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, such as infrastructure as code and GitOps Database technologies, e.g. …