robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes More ❯
Cheltenham, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
active (West) Globally leading defence/cyber security company Up to £65k DoE - plus benefits and bonuses Cheltenham location – hybrid working model Experience required in Splunk/ELK, Linux, Apache NiFi, Java/Python, Docker/Kubernetes Who Are We? We are recruiting a Senior Support Engineer to work with a multi-national, industry-leading cyber security/… with tools like Splunk or the ELK stack. Strong ability to manage tasks proactively while adapting to shifting priorities. Proficiency in Linux server administration. Experience with technologies such as Apache NiFi, MinIO, and AWS S3. Skilled in managing and patching Java and Python applications. Familiarity with containerization tools like Docker or Podman and deployment platforms such as Kubernetes … hearing from you. SENIOR SUPPORT ENGINEER KEY SKILLS: SUPPORT ENGINEER/LINUX/UNIX/AWS/DOCKER/KUBERNETES/PYTHON/ANSIBLE/JAVA/ELK/APACHE/SPLUNK/APACHE NIFI/DV CLEARED/DV CLEARANCE/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/CHELTENHAM/SECURITY More ❯
Farnborough, Hampshire, South East, United Kingdom
Peregrine
a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams. A strong command of Apache NiFi is essential for this role. You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation … and delivery. You should be adept at identifying and resolving issues within NiFi flows, managing performance bottlenecks, and implementing robust error handling strategies. You'll work closely with cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms. Your work will directly support batch processing, real … of use cases. We're looking for candidates with over 3 years of relevant experience in data engineering, platform engineering, or a related field, with demonstrated hands-on expertise in NiFi and data pipeline design in regulated environments. Responsibilities: Design, develop, and maintain robust and secure data pipelines using NiFi and related big data technologies. Troubleshoot and optimize NiFi … More ❯
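The transform-and-route pattern described in that role can be sketched in plain Python. This is a hypothetical illustration, not NiFi code: in NiFi itself the flow is configured graphically and failed flow files are routed to a "failure" relationship, but the error-handling idea is the same.

```python
import json

def transform_record(raw: str) -> dict:
    """Parse one ingested record and normalise its fields (hypothetical schema)."""
    record = json.loads(raw)
    if "id" not in record:
        raise ValueError("missing required field: id")
    return {"id": str(record["id"]), "value": record.get("value", 0)}

def run_flow(batch):
    """Route each record to success or failure instead of halting the flow."""
    success, failure = [], []
    for raw in batch:
        try:
            success.append(transform_record(raw))
        except ValueError as exc:  # JSONDecodeError subclasses ValueError
            failure.append({"raw": raw, "error": str(exc)})
    return success, failure

ok, bad = run_flow(['{"id": 1, "value": 7}', '{"value": 9}', 'not json'])
```

Keeping failures as first-class output, rather than raising, mirrors how NiFi flows stay resilient: one malformed record never stops ingestion of the rest.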
The role also involves optimizing database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines using Python, Bash, and Terraform. Preferred candidates will have experience with Apache Spark, Apache NiFi, data governance, and ETL standardization. Familiarity with Glue, Hive, and Iceberg or similar technologies is a plus. Tasks Performed: • Bridge communication between technical staff … data between systems, and optimize queries. • Plan and execute large-scale data migrations. • Improve database performance through architecture and tuning. • Create and maintain data flows using ETL tools like Apache NiFi. • Manage infrastructure as code using Python, Bash, and Terraform. • Integrate security into development and deployment workflows. • Build and support automated CI/CD pipelines. Education, Experience and Qualifications … experience with Data Quality and Data Governance concepts and experience. (Preferred) • Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. (Preferred) • Demonstrated experience with Apache Spark. (Preferred) Other Job Requirements: • Active Top Secret/SCI w/Full Scope Polygraph. • U.S. Citizenship More ❯
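The large-scale data migration task listed above usually comes down to batched copying, so memory stays bounded and progress is checkpointable. A minimal sketch, with stand-in functions in place of real database connections (all names here are hypothetical):

```python
def migrate(read_batch, write_batch, batch_size=1000):
    """Copy records batch-by-batch; returns total rows migrated.

    read_batch(offset, limit) and write_batch(rows) stand in for
    source/target database access in a real migration.
    """
    total = 0
    while True:
        rows = read_batch(total, batch_size)  # offset-based read
        if not rows:
            break
        write_batch(rows)
        total += len(rows)
    return total

# Toy source/target to exercise the loop:
source = list(range(2500))
def read_batch(offset, limit):
    return source[offset:offset + limit]
sink = []
moved = migrate(read_batch, sink.extend)
```

In production the offset would be persisted between runs so an interrupted migration can resume, and writes would go through batched inserts rather than a list append.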
as System engineers to support both data and application integrations using bespoke tools written in Python/Java, as well as tools such as Meltano, Airflow, Mulesoft/Snaplogic, Apache NiFi, and Kafka, ensuring a robust, well-modelled, and scalable data analytics infrastructure running on MySQL and Postgres style databases primarily. Requirements: Advanced SQL development and deep understanding … integration (REST/SOAP) Proficiency in at least 1 object/procedural/functional language (e.g. Java, PHP, Python) Familiarity with EAI tools such as MuleSoft/SnapLogic or Apache NiFi Experience with infrastructure-as-code tools such as Terraform and Ansible Experience with version control (e.g. Git, SVN) and CI/CD workflows for deployment Experience scraping More ❯
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hybrid working when possible Must hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse sources, maintaining data integrity. Maintain … maintain optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka Apache NiFi SQL and NoSQL databases (e.g. MongoDB) ETL processing languages such as Groovy, Python or Java To be Considered: Please either apply by clicking online More ❯
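The "integrate and enrich data from diverse sources, maintaining data integrity" responsibility above can be sketched as a keyed join where unmatched records are tracked rather than silently dropped. This is a hypothetical illustration in Python (one of the ETL languages the ad lists); field names are made up.

```python
def enrich(events, reference):
    """Join ingested events against a reference table keyed by user_id."""
    ref_by_id = {row["user_id"]: row for row in reference}
    enriched, unmatched = [], []
    for event in events:
        ref = ref_by_id.get(event["user_id"])
        if ref is None:
            unmatched.append(event)  # integrity check: no silent loss
            continue
        enriched.append({**event, "region": ref["region"]})
    return enriched, unmatched

events = [{"user_id": 1, "action": "login"},
          {"user_id": 2, "action": "login"}]
reference = [{"user_id": 1, "region": "UK"}]
good, missing = enrich(events, reference)
```

Surfacing the unmatched records is what makes the integrity guarantee auditable: the counts of enriched plus unmatched records should always reconcile with the ingested total.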
and tests. • Leverage development and design patterns to ensure the product's scalability, maintainability, and long-term success. • Understand API-driven microservice design patterns, NoSQL databases, data ingest tools (Apache NiFi), and modern web frameworks. • Maintain a team player mentality as a collaborative member of a fast-paced, structured 10-14 person team. Skills Requirements: • Proficiency in the … a 10+ person team environment. Nice to Haves: • NoSQL DBs (Mongo, ElasticSearch, Redis, Graph DB, etc.). • Data wrangling (Discovery, Mining, Cleaning, Exploration, Modeling, Structuring, Enriching, and Validating) with Apache NiFi or similar tools. • CI/CD (e.g., Jenkins), JUnit testing or similar. • Scripting with Bash, Python, and/or Groovy. YOE Requirement: 3 yrs., B.S. in a More ❯
of high velocity bandwidth, flash speed disk, high density multi-CPU architectures, and extreme memory footprints. You will be working with the latest technologies including Java, Kafka, Kubernetes, Docker, Apache Accumulo, Apache Spark, Spring, Apache NiFi and more! We have multiple opportunities to build systems with capabilities that include machine learning, processing intensive analytics, novel algorithm More ❯
systems meet rigorous information assurance and verification criteria. The role also involves the application of systems engineering principles in accordance with ISO/IEC 15288 process areas. Experience with Apache NiFi development and collaboration tools such as Jira, Confluence, and SharePoint are required. Tasks Performed: • Participate in an Integrated Product Team to design new capabilities based upon evaluation … least 12 years of relevant experience. • Bachelor's degree in systems engineering, computer science, information systems, engineering science, or engineering management with 7 years of relevant experience. • Experience developing Apache NiFi applications. • Experience applying systems engineering principles throughout the systems life cycle phases. • Experience interacting with the Government regarding Systems Engineering technical considerations and for associated problems, issues … Measurement. -Enterprise (Organizational Project-Enabling) Process Area: Project Portfolio Management, Infrastructure Management, Lifecycle Model Management, Human Resource Management, Quality Management. -Agreement Process Area: Acquisition and Supply. • Experience as a NiFi developer. • Experience with Jira. • Experience with Confluence, SharePoint, or similar. Other Job Requirements: • Minimum Active Top Secret/SCI security clearance with a Full Scope Polygraph. • U.S. Citizenship, and More ❯
Cloud usage VMWare usage Technical Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (scaled agile) Processes Data Integration Focused: Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Apache NiFi, Airbyte, and Singer Message Brokers and streaming data processors like Apache Kafka Object Storage solutions such as S3, MinIO, LakeFS CI/CD More ❯
data prep and labeling to enable data analytics. • Familiarity with various log formats such as JSON, XML, and others. • Experience with data flow, management, and storage solutions (e.g. Kafka, NiFi, and AWS S3 and SQS solutions) • Ability to decompose technical problems and troubleshoot both system and dataflow issues. • Must be certified DoD IAT II or higher (CompTIA Security+ highly … with Java, including unit and integration testing. • Python: Experience with Python is desired. • SQL: Familiarity with SQL schemas and statements. Tools and Technologies: • Data Flow Solutions: Experience with Kafka, NiFi, AWS S3, and SQS. • Version Control and Build Tools: Proficiency with Maven and GitLab. • Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. • Configuration Files: Experience using … YAML files for data model and schema configuration. • Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows. • AWS S3: bucket administration. • IDE: VSCode, IntelliJ/PyCharm, or other suitable Technical Expertise: • ETL creation and processing expertise. • Experience with code debugging concepts • Expertise in data modeling design, troubleshooting, and analysis from ingest to visualization. More ❯
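The "YAML files for data model and schema configuration" requirement above typically means validation logic is driven by a declarative config rather than hard-coded. A minimal sketch: the dict below stands in for a parsed YAML schema (in practice it would come from `yaml.safe_load` on a config file); field names and structure are hypothetical.

```python
SCHEMA = {  # stands in for the parsed YAML schema config
    "fields": {
        "timestamp": {"type": int, "required": True},
        "source":    {"type": str, "required": True},
        "level":     {"type": str, "required": False},
    }
}

def validate(record, schema=SCHEMA):
    """Return a list of violations; an empty list means the record conforms."""
    errors = []
    for name, spec in schema["fields"].items():
        if name not in record:
            if spec["required"]:
                errors.append(f"missing required field: {name}")
        elif not isinstance(record[name], spec["type"]):
            errors.append(f"wrong type for {name}")
    return errors

errs = validate({"timestamp": 1700000000, "source": "syslog"})
```

Keeping the schema in configuration means new data models can be onboarded by editing a YAML file instead of redeploying code, which is the usual motivation for this pattern in dataflow environments.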
source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - researches and provides solutions to meet future growth or to eliminate occurring … source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Experience in building and maintaining of an enterprise data model Experience in implementing data pipelines using ETL and ELT technologies such … as Apache Kafka and Apache NiFi Experienced in data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with cloud technologies and cloud computing platforms Experience with security and compliance Experience working in an Agile environment Qualifications: Must have an Active Secret clearance or More ❯
an active Secret clearance; will also accept TS/SCI or TS/SCI with CI Polygraph Desired Experience: Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Work could possibly require some on-call work. The Swift Group and Subsidiaries are an Equal Opportunity/Affirmative Action More ❯
Remote Desktop Protocol (RDP) technologies Experience with data access control, specifically Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) Familiarity with data science platforms (Anaconda, Jupyter, NiFi) Experience with Python, C#, or similar languages will be beneficial Knowledge of modern web development frameworks (Node.js, React, Angular) Experience developing and maintaining complex systems that involves integrating open More ❯
with Collaboration tools, such as Jira and Confluence Preferred Qualifications: - Working knowledge with software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi, - Working knowledge with public keys and digital certificates - Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG - Experience with DevOps environments - Expertise in More ❯
with Collaboration tools, such as Jira and Confluence Desired Qualifications: Working knowledge with software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi, Working knowledge with public keys and digital certificates Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG Experience with DevOps environments Expertise in More ❯
higher clearance such as Top Secret, TS/SCI, and/or TS/SCI with CI Polygraph Desired Experience: Experience with distributed data platforms and streaming tools (e.g., NiFi, Kafka) Hands-on experience with cloud platforms such as AWS or Azure Familiarity with containerization and orchestration (e.g., Docker, Kubernetes) Experience with NoSQL databases and full-text search engines More ❯
be willing/able to help open/close the workspace during regular business hours as needed. Preferred Requirements • Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. • Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Benefits $152,000-$198,000 salary per More ❯