London, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and … Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms.
Responsibilities: Develop, optimize, and maintain data ingest flows using Apache Kafka, Apache NiFi and MySQL/PostgreSQL Develop within the AWS cloud platform using services such as RedShift, SageMaker, API Gateway, QuickSight, and Athena Communicate with data owners to set up and ensure configuration … of programming languages like Python, R, and Java Expertise in building modern data pipelines and ETL (extract, transform, load) processes using tools such as Apache Kafka and Apache NiFi Proficient in programming languages like Java, Scala, or Python Experience or expertise using, managing, and/or testing API Gateway tools and REST APIs
Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data Ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning …
quickly and apply new skills Desirable Solid understanding of microservices development Working knowledge of SQL and NoSQL databases Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, Serverless Computing, GraphQL, APIs, APIM Good skills working with JSON, XML, YAML files Knowledge in …
database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines using Python, Bash, and Terraform. Preferred candidates will have experience with Apache Spark, Apache NiFi, data governance, and ETL standardization. Familiarity with Glue, Hive, and Iceberg or similar technologies is a plus. Tasks Performed … queries. • Plan and execute large-scale data migrations. • Improve database performance through architecture and tuning. • Create and maintain data flows using ETL tools like Apache NiFi. • Manage infrastructure as code using Python, Bash, and Terraform. • Integrate security into development and deployment workflows. • Build and support automated CI/CD … Data Governance concepts and experience. (Preferred) • Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. (Preferred) • Demonstrated experience with Apache Spark. (Preferred) Other Job Requirements: • Active Top Secret/SCI w/ …
Newport, Wales, United Kingdom Hybrid / WFH Options
JR United Kingdom
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning
London, England, United Kingdom Hybrid / WFH Options
LHH
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning
Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure … experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Experience building real-time, large-scale data pipelines Working knowledge of cloud platforms … with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning
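The sensitive-data handling these listings describe (encryption, anonymisation, access control before indexing) is often implemented as keyed pseudonymisation applied ahead of the Elasticsearch load. A minimal sketch in Python; the field list and the in-source key are illustrative assumptions, and a real deployment would pull the key from a secrets manager:

```python
import hashlib
import hmac

# Hypothetical PII field list and key; in practice the key comes from a
# secrets manager and is rotated, never hard-coded.
PII_FIELDS = {"email", "name"}
KEY = b"rotate-me"

def pseudonymise(doc: dict) -> dict:
    """Replace PII fields with a keyed HMAC-SHA256 digest so documents can be
    indexed (e.g. into Elasticsearch) without exposing raw identifiers, while
    still permitting equality joins on the hashed values."""
    out = {}
    for field, value in doc.items():
        if field in PII_FIELDS:
            out[field] = hmac.new(KEY, str(value).encode(), hashlib.sha256).hexdigest()
        else:
            out[field] = value
    return out

doc = {"email": "a@example.com", "event": "login"}
print(pseudonymise(doc)["event"])  # login
```

Keyed hashing (rather than plain SHA-256) resists dictionary attacks on low-entropy identifiers, which is why HMAC is used here.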
design patterns to ensure the product's scalability, maintainability, and long-term success. Understand API-driven microservice design patterns, NoSQL databases, data ingest tools (Apache NiFi), and modern web frameworks. Maintain a team player mentality as a collaborative member of a fast-paced, structured 10-14 person team. … team environment Desired Skills: NoSQL DBs (Mongo, Elasticsearch, Redis, Graph DB, etc.) Data wrangling (Discovery, Mining, Cleaning, Exploration, Modeling, Structuring, Enriching, and Validating) with Apache NiFi or similar tools CI/CD (e.g., Jenkins), JUnit testing or similar DevOps experience (Packer, Terraform, Ansible) Containerization experience (Docker, Kubernetes, etc.)
and implementing CI/CD pipelines. Required Technical Skills Proficiency with AWS, RHEL, Terraform, and Ansible Experience with NodeJS, ReactJS, RESTful APIs, Elasticsearch, and Apache NiFi Design and implementation of CI/CD pipelines with Jenkins, Kubernetes, and OpenShift Containerization skills using Docker, Podman, and Buildah Additional Requirements … s degree or equivalent experience Experience as a DevOps Engineer focused on AWS Strong scripting skills (Bash, Python, Ruby) Desirable: AWS Certifications, RHCE, Elasticsearch, Apache NiFi familiarity Job Details Seniority level: Mid-Senior level Employment type: Full-time Job function: Engineering and IT Industries: IT Services and Consulting
Ansible, PowerShell, and Python. Build, configure, and maintain Kubernetes clusters, ensuring high availability, scalability, and security. Deploy and support key applications such as Apache NiFi, Elastic ECK, and JFrog Artifactory. Develop and manage Helm Charts to standardise application deployment. Integrate with and manage HashiCorp Vault for … configuration, and operations in production environments. Experience managing applications with Helm Charts and supporting microservices-based architecture. Applications & Services: Direct experience deploying and supporting Apache NiFi, Elastic ECK (Elasticsearch on Kubernetes), and Artifactory. Familiarity with container image management and application lifecycle support. Security & Infrastructure: Use of HashiCorp Vault
modular, reusable services and features within a modern service-oriented architecture Work independently with minimal supervision in an Agile team environment Deploy and maintain Apache NiFi clusters Develop ETL (Extract, Transform, Load) processes for mission-critical data systems Create and maintain Ansible playbooks and roles for software-driven deployment of NiFi and ZooKeeper clusters Use AWS EC2 instances for infrastructure needs Develop dashboard visualizations in Kibana based on data from Elasticsearch Integrate and interact with RESTful services via REST APIs Requirements: Active Full-Scope Polygraph Expertise with Apache NiFi and equivalent IC technologies General understanding …
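The Kibana-dashboard responsibility above implies loading documents into Elasticsearch, typically via the `_bulk` API, whose body is newline-delimited JSON alternating action lines and document sources. A small sketch of building that payload; the index name and documents are illustrative:

```python
import json

def to_bulk_ndjson(index: str, docs: list[dict]) -> str:
    """Build the newline-delimited body for Elasticsearch's _bulk API:
    one action line ({"index": ...}) followed by the document source,
    repeated per document."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    # the _bulk endpoint requires a trailing newline after the last line
    return "\n".join(lines) + "\n"

body = to_bulk_ndjson("events", [{"user": "a"}, {"user": "b"}])
print(body.count("\n"))  # 4
```

The resulting string would be POSTed to `<cluster>/_bulk` with the `application/x-ndjson` content type; Kibana visualisations then query the populated index.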
mentoring your team and shaping best practices. Role Lead Technical Execution & Delivery - Design, build, and optimise data pipelines and data infrastructure using Snowflake, Hadoop, Apache NiFi, Spark, Python, and other technologies. - Break down business requirements into technical solutions and delivery plans. - Lead technical decisions, ensuring alignment with data … You Technical & Engineering Skills - Extensive demonstrable experience in data engineering, with expertise in building scalable data pipelines and infrastructure. - Deep understanding of Snowflake, Hadoop, Apache NiFi, Spark, Python, and other data technologies. - Strong experience with ETL/ELT processes and data transformation. - Proficiency in SQL, NoSQL, and data …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse … Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), and ETL processing languages such as Groovy, Python or Java. To be Considered: Please …
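The "integrate and enrich data from diverse sources" responsibility is commonly realised in NiFi as a lookup/enrichment step: each ingested record is joined against a reference table, with unmatched keys tagged rather than dropped. A minimal Python sketch of that step; the field names, reference table, and default value are illustrative assumptions:

```python
def enrich(records, reference, key="source_id", default="unknown"):
    """Enrichment step in the style of a NiFi lookup processor: join each
    record against a reference mapping, tagging unmatched records with a
    default so nothing is silently lost downstream."""
    for rec in records:
        out = dict(rec)  # copy: don't mutate the caller's records
        out["site"] = reference.get(out.get(key), default)
        yield out

ref = {"s1": "Gloucester", "s2": "Bristol"}
rows = list(enrich([{"source_id": "s1"}, {"source_id": "s9"}], ref))
print(rows[0]["site"], rows[1]["site"])  # Gloucester unknown
```

Yielding rather than returning a list keeps the step streaming-friendly for large or unbounded inputs, matching the real-time requirement in the listing.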
London, England, United Kingdom Hybrid / WFH Options
Searchability
active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience required in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse … Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), and ETL processing languages such as Groovy, Python or Java. To be Considered: Please …
Dental, Vision, 401k with company matching, and life insurance. Rate: $80 - $86/hr W2 Responsibilities: Develop, optimize, and maintain data ingestion flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL. Develop within AWS cloud services such as RedShift, SageMaker, API Gateway, QuickSight, and Athena. Coordinate … learning techniques. Proficiency in programming languages such as Python, R, and Java. Experience in building modern data pipelines and ETL processes with tools like Apache Kafka and Apache NiFi. Proficiency in Java, Scala, or Python programming. Experience managing or testing API Gateway tools and REST APIs. Knowledge of …
design patterns to ensure the product's scalability, maintainability, and long-term success. • Understand API-driven microservice design patterns, NoSQL databases, data ingest tools (Apache NiFi), and modern web frameworks. • Maintain a team player mentality as a collaborative member of a fast-paced, structured 10-14 person team. … Nice to Haves: • NoSQL DBs (Mongo, Elasticsearch, Redis, Graph DB, etc.). • Data wrangling (Discovery, Mining, Cleaning, Exploration, Modeling, Structuring, Enriching, and Validating) with Apache NiFi or similar tools. • CI/CD (e.g., Jenkins), JUnit testing or similar. • Scripting with Bash, Python, and/or Groovy. YOE Requirement …
Cassandra, HBase). Familiarity with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Hands-on experience with ETL frameworks and tools (e.g., Apache NiFi, Talend, Informatica, Airflow). Knowledge of big data technologies (e.g., Hadoop, Apache Spark, Kafka). Experience with cloud platforms (AWS, Azure … and security protocols. Preferred Qualifications: Experience with machine learning and preparing data for AI/ML model training. Familiarity with stream processing frameworks (e.g., Apache Kafka, Apache Flink). Certification in cloud platforms (e.g., AWS Certified Big Data - Specialty, Google Cloud Professional Data Engineer). Experience with DevOps … practices and CI/CD pipelines for data systems. Experience with automation and orchestration tools (e.g., Apache Airflow, Luigi). Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) to support analytics teams. Work Environment: Collaborative and fast-paced work environment. Opportunity to work with state-of-the-art technologies.
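The stream-processing frameworks named above (Kafka, Flink) build on a small number of core ideas, the most basic being windowed aggregation: events are bucketed into fixed, non-overlapping (tumbling) time windows and aggregated per key. A self-contained Python sketch of that mechanic, with illustrative event tuples rather than a real stream:

```python
from collections import defaultdict

def tumbling_counts(events, window_s=60):
    """Assign each (timestamp, key) event to a tumbling window of window_s
    seconds and count occurrences per (window_start, key) - the elementary
    aggregation a stream processor such as Flink performs at scale."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_s) * window_s  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

print(tumbling_counts([(5, "a"), (30, "a"), (70, "b")]))
# {(0, 'a'): 2, (60, 'b'): 1}
```

Real engines add what this sketch omits: out-of-order event handling via watermarks, state checkpointing, and incremental emission instead of a final dict.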
speed disk, high density multi-CPU architectures, and extreme memory footprints. You will be working with the latest technologies including Java, Kafka, Kubernetes, Docker, Apache Accumulo, Apache Spark, Spring, Apache NiFi and more! We have multiple opportunities to build systems with capabilities that include machine learning.
and verification criteria. The role also involves the application of systems engineering principles in accordance with ISO/IEC 15288 process areas. Experience with Apache NiFi development and collaboration tools such as Jira, Confluence, and SharePoint is required. Tasks Performed: • Participate in an Integrated Product Team to design … experience. • Bachelor's degree in system engineering, computer science, information systems, engineering science, or engineering management with 7 years of relevant experience. • Experience developing Apache NiFi applications. • Experience applying systems engineering principles throughout the systems life cycle phases. • Experience interacting with the Government regarding Systems Engineering technical considerations … Process Area: Project Portfolio Management, Infrastructure Management, Lifecycle Model Management, Human Resource Management, Quality Management. -Agreement Process Area: Acquisition and Supply. • Experience as a NiFi developer. • Experience with Jira. • Experience with Confluence, SharePoint, or similar. Other Job Requirements: • Minimum Active Top Secret/SCI security clearance with a Full Scope Polygraph.
London, England, United Kingdom Hybrid / WFH Options
Leonardo
the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a … of compressed hours. The Role Will Include Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control.
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a … of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization Strong experience with Apache NiFi for building and managing complex data flows and integration processes Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control
learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - research and provide solutions to meet future … learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using … ETL and ELT technologies such as Apache Kafka and Apache NiFi Experience with data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with cloud technologies and cloud computing platforms Experience with security and …
Experience with data access control, specifically Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) Familiarity with data science platforms (Anaconda, Jupyter, NiFi) Experience with Python, C#, or similar languages will be beneficial Knowledge of modern web development frameworks (Node.js, React, Angular) Experience developing and maintaining complex …