Apache NiFi Jobs in the UK

1 to 25 of 97 Apache NiFi Jobs in the UK

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Leonardo
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data More ❯

Senior Data Engineer

West Bromwich, England, United Kingdom
Hybrid / WFH Options
Leonardo UK Ltd
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data More ❯

Senior Data Engineer

Bristol, England, United Kingdom
Hybrid / WFH Options
Leonardo
exciting, and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization Strong experience with Apache NiFi for building and managing complex data flows and integration processes Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control Familiarity with data More ❯

Head of Data & Analytics Architecture and AI

London, United Kingdom
pladis Foods Limited
Data Storage & Databases: SQL & NoSQL Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data Ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning: Frameworks: TensorFlow, PyTorch, Scikit-learn More ❯
Employment Type: Permanent
Salary: GBP Annual

Cloud Technical Architect / Data DevOps Engineer

Bristol, United Kingdom
Hewlett Packard Enterprise Development LP
systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Monitoring utilising products such as: Prometheus, Grafana, ELK, Filebeat etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Edge technologies e.g. NGINX, HAProxy etc. Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable for Data DevOps … Engineer: Jupyter Hub Awareness MinIO or similar S3 storage technology Trino/Presto RabbitMQ or other common queue technology e.g. ActiveMQ NiFi Rego Familiarity with code development, shell-scripting in Python, Bash etc. As a Cloud Technical Architect, you will be responsible for: Support the design and development of new capabilities, preparing solution options, investigating technology, designing and running More ❯
Employment Type: Permanent
Salary: GBP Annual

Data Engineer London, UK

London, United Kingdom
Galytix Limited
in at least one of the big 3 cloud ML stacks (AWS, Azure, GCP). Hands-on experience with open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/Big Data technologies, such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka. Experience with workflow orchestration tools like Apache Airflow. Experience More ❯
Employment Type: Permanent
Salary: GBP Annual

Data Engineer ( DV Cleared )

Greater Bristol Area, United Kingdom
Hybrid / WFH Options
LHH
large-scale data pipelines in secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability Write … Skills and Experience: 3+ years’ experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Experience building real-time, large-scale data pipelines Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly … stakeholder management skills Detail-oriented with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning algorithms and data science More ❯

Data Engineer ( DV Cleared )

Newport, Wales, United Kingdom
Hybrid / WFH Options
JR United Kingdom
large-scale data pipelines in secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability Write … Skills and Experience: 3+ years’ experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments Experience using Infrastructure as Code … stakeholder management skills Detail-oriented with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning algorithms and data science More ❯

Data Engineer ( DV Cleared )

London, England, United Kingdom
Hybrid / WFH Options
LHH
large-scale data pipelines in secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability Write … Skills and Experience: 3+ years’ experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments Experience using Infrastructure as Code … stakeholder management skills Detail-oriented with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning algorithms and data science More ❯

Core Data Engineer

Edinburgh, Scotland, United Kingdom
Optima Partners
documentation for all solutions and processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. PostgreSQL, SQL Server) Data Integration Tools … Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g. Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience 2+ years of experience in data engineering or a similar role. Proven track record of More ❯

Software Engineer

London, United Kingdom
Hybrid / WFH Options
Gemba Advantage
languages such as Java, TypeScript, Python, and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies like NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, including infrastructure as code and GitOps Database technologies, e.g., relational databases, Elasticsearch, MongoDB Why join More ❯
Employment Type: Permanent
Salary: GBP Annual

Software Engineer

London, England, United Kingdom
Hybrid / WFH Options
Gemba Advantage
such as Java, TypeScript, Python, and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies, such as NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, such as infrastructure as code and GitOps Database technologies, e.g. relational databases, Elasticsearch, Mongo Why More ❯

Senior DevOps Engineer (PA2025Q1JB012)

Basildon, Essex, United Kingdom
SS&C
as the ability to learn quickly and apply new skills Desirable Solid understanding of microservices development SQL and NoSQL databases working set Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, Serverless Computing, GraphQL, APIs, APIM Good skills working with JSON, XML, YAML files Knowledge in Python, Java, awk, sed, Ansible More ❯
Employment Type: Permanent
Salary: GBP Annual

Data Engineer

London, England, United Kingdom
Lumilinks Group Ltd
Please speak to us if you have the following professional aspirations: Skill Enhancement: Aspires to deepen technical expertise in data engineering practices, including mastering tools and technologies like Apache Spark, Kafka, cloud platforms (AWS, Azure, Google Cloud), and data warehousing solutions. Career Progression: Aims to advance to a senior data engineer or data architect role, with long-term … Redshift, Google BigQuery, Snowflake, or Azure Synapse Analytics, including data modelling and ETL processes. ETL Processes: Proficient in designing and implementing ETL (Extract, Transform, Load) processes using tools like Apache NiFi, Talend, or custom scripts. Familiarity with ELT (Extract, Load, Transform) processes is a plus. Big Data Technologies: Familiarity with big data frameworks such as Apache Hadoop … and Apache Spark, including experience with distributed computing and data processing. Cloud Platforms: Proficient in using cloud platforms (e.g., AWS, Google Cloud Platform, Microsoft Azure) for data storage, processing, and deployment of data solutions. Data Pipeline Orchestration: Experience with workflow orchestration tools such as Apache Airflow or Prefect to manage and schedule data pipelines. Data Modelling: Strong understanding More ❯

Core Data Engineer Engineering

Edinburgh, United Kingdom
Optima Partners
documentation for all solutions and processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. PostgreSQL, SQL Server) Data Integration Tools … Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g. Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience 2+ years of experience in data engineering or a similar role. Proven track record of More ❯
Employment Type: Permanent
Salary: GBP Annual

DV Cleared DevOps Engineer

Bristol, Gloucestershire, United Kingdom
Hybrid / WFH Options
Curo Resourcing Ltd
etc. Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable: Jupyter Hub Awareness RabbitMQ or other common queue technology … e.g. ActiveMQ NiFi Rego Familiarity with code development, shell-scripting in Python, Bash etc. To apply for this DV Cleared DevOps Engineer job, please click the button below and submit your latest CV. Curo Services endeavour to respond to all applications. However, this may not always be possible during periods of high volume. Thank you for your patience. Curo More ❯
Employment Type: Permanent
Salary: GBP Annual

Software Engineer

Cheltenham, Gloucestershire, United Kingdom
Hybrid / WFH Options
Gemba Advantage
such as Java, TypeScript, Python, and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies, such as NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, such as infrastructure as code and GitOps Database technologies, e.g. relational databases, Elasticsearch, Mongo Why More ❯
Employment Type: Permanent
Salary: GBP Annual

Senior Data Engineer

Bedford, England, United Kingdom
ZipRecruiter
the contract. Benefits include Medical, Dental, Vision, 401k with company matching, and life insurance. Rate: $80 - $86/hr W2 Responsibilities: Develop, optimize, and maintain data ingestion flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL. Develop within AWS cloud services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena. Coordinate with data owners to ensure … analysis, data visualization, and machine learning techniques. Proficiency in programming languages such as Python, R, and Java. Experience in building modern data pipelines and ETL processes with tools like Apache Kafka and Apache NiFi. Proficiency in Java, Scala, or Python programming. Experience managing or testing API Gateway tools and REST APIs. Knowledge of traditional databases like Oracle, MySQL More ❯

Senior AWS DevOps Engineer

Cheltenham, England, United Kingdom
IBM
and Ansible for infrastructure automation is required. Application Development and Deployment: Utilize NodeJS, ReactJS, and RESTful APIs to develop and deploy scalable applications. Ensure seamless integration with Elasticsearch and Apache NiFi. CI/CD Pipelines: Design and implement CI/CD pipelines using Jenkins, Kubernetes, and OpenShift. Automate testing, building, and deployment processes to ensure efficiency and quality. Containerization … Proficiency in Kubernetes and OpenShift for orchestration. Solid understanding of CI/CD principles and experience with Jenkins. Strong scripting skills (Bash, Python, Ruby). Familiarity with Elasticsearch and Apache NiFi is desirable. Desirable Certifications AWS Certifications (preferably AWS Certified DevOps Engineer - Professional or higher). Red Hat Certified Engineer (RHCE) or equivalent certification. Excellent problem-solving abilities More ❯

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Methods
Utilise Docker for containerisation and Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and enhance Elasticsearch setups for robust data searching and analytics capabilities in mixed infrastructure settings. … services. Containerisation and Orchestration Expertise: Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka, Apache NiFi More ❯

Senior AWS DevOps Engineer

London, England, United Kingdom
IBM
and RHEL, developing scalable applications, and implementing CI/CD pipelines. Required Technical Skills Proficiency with AWS, RHEL, Terraform, and Ansible Experience with NodeJS, ReactJS, RESTful APIs, Elasticsearch, and Apache NiFi Design and implementation of CI/CD pipelines with Jenkins, Kubernetes, and OpenShift Containerization skills using Docker, Podman, and Buildah Additional Requirements Eligibility to work in the … work and security clearance Bachelor's degree or equivalent experience Experience as a DevOps Engineer focused on AWS Strong scripting skills (Bash, Python, Ruby) Desirable: AWS Certifications, RHCE, Elasticsearch, Apache NiFi familiarity Job Details Seniority level: Mid-Senior level Employment type: Full-time Job function: Engineering and IT Industries: IT Services and Consulting #J-18808-Ljbffr More ❯

DevOps Engineer DV Cleared

Greater Bristol Area, United Kingdom
LHH
such as GitLab CI , Terraform , Ansible , PowerShell , and Python . Build, configure, and maintain Kubernetes clusters , ensuring high availability, scalability, and security. Deploy and support key applications such as Apache NiFi , Elastic ECK , and JFrog Artifactory . Develop and manage Helm Charts to standardise application deployment. Integrate with and manage HashiCorp Vault for secure secrets management and token … Strong understanding of Kubernetes design, configuration, and operations in production environments. Experience managing applications with Helm Charts and supporting microservices-based architecture. Applications & Services: Direct experience deploying and supporting Apache NiFi , Elastic ECK (ElasticSearch on Kubernetes) , and Artifactory . Familiarity with container image management and application lifecycle support. Security & Infrastructure: Use of HashiCorp Vault for managing secrets, certificates More ❯

DevOps Engineer DV Cleared

Bath, England, United Kingdom
JR United Kingdom
such as GitLab CI , Terraform , Ansible , PowerShell , and Python . Build, configure, and maintain Kubernetes clusters , ensuring high availability, scalability, and security. Deploy and support key applications such as Apache NiFi , Elastic ECK , and JFrog Artifactory . Develop and manage Helm Charts to standardise application deployment. Integrate with and manage HashiCorp Vault for secure secrets management and token … Strong understanding of Kubernetes design, configuration, and operations in production environments. Experience managing applications with Helm Charts and supporting microservices-based architecture. Applications & Services: Direct experience deploying and supporting Apache NiFi , Elastic ECK (ElasticSearch on Kubernetes) , and Artifactory . Familiarity with container image management and application lifecycle support. Security & Infrastructure: Use of HashiCorp Vault for managing secrets, certificates More ❯

Big Data Engineer - DV Cleared

London, England, United Kingdom
Babcock
integration of software IT WOULD BE NICE FOR THE BIG DATA ENGINEER TO HAVE.... Cloud based architectures Microservice architecture or server-less architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ Experience of DevSecOps automated deployment tools such as Jenkins, Ansible, Docker TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to … More ❯

Data Engineer

Gloucester, England, United Kingdom
Roke Manor Research Limited
Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to move data from source systems to data stores. You will have used one or more supporting technologies, e.g. Apache Kafka, NiFi, Spark, Flink, or Airflow. A history of working with SQL and NoSQL databases (PostgreSQL, MongoDB, Elasticsearch, Accumulo, or Neo4j). You will be able to More ❯
Apache NiFi salary percentiles (UK):
10th Percentile: £46,250
25th Percentile: £50,938
Median: £66,250
75th Percentile: £73,250
90th Percentile: £74,825