robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch. Proficiency in cloud-native technologies such as containerization and Kubernetes
Cheltenham, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
active (West) Globally leading defence/cyber security company Up to £65k DoE - plus benefits and bonuses Cheltenham location – hybrid working model Experience required in Splunk/ELK, Linux, Apache NiFi, Java/Python, Docker/Kubernetes Who Are We? We are recruiting a Senior Support Engineer to work with a multi-national, industry-leading cyber security/… with tools like Splunk or the ELK stack. Strong ability to manage tasks proactively while adapting to shifting priorities. Proficiency in Linux server administration. Experience with technologies such as Apache NiFi, MinIO, and AWS S3. Skilled in managing and patching Java and Python applications. Familiarity with containerization tools like Docker or Podman and deployment platforms such as Kubernetes … hearing from you. SENIOR SUPPORT ENGINEER KEY SKILLS: SUPPORT ENGINEER/LINUX/UNIX/AWS/DOCKER/KUBERNETES/PYTHON/ANSIBLE/JAVA/ELK/APACHE/SPLUNK/APACHE NIFI/DV CLEARED/DV CLEARANCE/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/CHELTENHAM/SECURITY …
Farnborough, Hampshire, South East, United Kingdom
Peregrine
a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams. A strong command of Apache NiFi is essential for this role. You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation … and delivery. You should be adept at identifying and resolving issues within NiFi flows, managing performance bottlenecks, and implementing robust error-handling strategies. You'll work closely with cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms. Your work will directly support batch processing, real … of use cases. We're looking for candidates with over 3 years of relevant experience in data engineering, platform engineering, or a related field, with demonstrated hands-on expertise in NiFi and data pipeline design in regulated environments. Responsibilities: Design, develop, and maintain robust and secure data pipelines using NiFi and related big data technologies. Troubleshoot and optimize NiFi …
The role also involves optimizing database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines using Python, Bash, and Terraform. Preferred candidates will have experience with Apache Spark, Apache NiFi, data governance, and ETL standardization. Familiarity with Glue, Hive, and Iceberg or similar technologies is a plus. Tasks Performed: • Bridge communication between technical staff … data between systems, and optimize queries. • Plan and execute large-scale data migrations. • Improve database performance through architecture and tuning. • Create and maintain data flows using ETL tools like Apache NiFi. • Manage infrastructure as code using Python, Bash, and Terraform. • Integrate security into development and deployment workflows. • Build and support automated CI/CD pipelines. Education, Experience and Qualifications … experience with Data Quality and Data Governance concepts and experience. (Preferred) • Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. (Preferred) • Demonstrated experience with Apache Spark. (Preferred) Other Job Requirements: • Active Top Secret/SCI w/Full Scope Polygraph. • U.S. Citizenship …
Gloucester, Gloucestershire, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
hybrid working when possible Must hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL, Team Leadership Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the … connect operational systems with analytics and business intelligence platforms. Responsibilities include: Design, build, and maintain data pipelines, including ingestion, orchestration, and enrichment Develop data-streaming and ETL solutions (e.g. NiFi) Model databases and integrate data from diverse sources Ensure data quality, consistency, and security Monitor and optimise system performance Write clean, secure, reusable, test-driven code Apply systems integration … standards across government The Lead Data Engineer Should Have: Active eDV clearance (West) Willingness to work full-time on-site in Gloucester when required. Required experience in the following: Apache Kafka Apache NiFi SQL and NoSQL databases (e.g. MongoDB) ETL processing languages such as Groovy, Python or Java Understand and interpret technical and business stakeholder needs Manage …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hybrid working when possible Must hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse sources, maintaining data integrity. Maintain … maintain optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka Apache NiFi SQL and NoSQL databases (e.g. MongoDB) ETL processing languages such as Groovy, Python or Java To be Considered: Please either apply by clicking online …
and tests. • Leverage development and design patterns to ensure the product's scalability, maintainability, and long-term success. • Understand API-driven microservice design patterns, NoSQL databases, data ingest tools (Apache NiFi), and modern web frameworks. • Maintain a team player mentality as a collaborative member of a fast-paced, structured 10-14 person team. Skills Requirements: • Proficiency in the … a 10+ person team environment. Nice to Haves: • NoSQL DBs (Mongo, ElasticSearch, Redis, Graph DB, etc.). • Data wrangling (Discovery, Mining, Cleaning, Exploration, Modeling, Structuring, Enriching, and Validating) with Apache NiFi or similar tools. • CI/CD (e.g., Jenkins), JUnit testing or similar. • Scripting with Bash, Python, and/or Groovy. YOE Requirement: 3 yrs., B.S. in a …
of high velocity bandwidth, flash speed disk, high density multi-CPU architectures, and extreme memory footprints. You will be working with the latest technologies including Java, Kafka, Kubernetes, Docker, Apache Accumulo, Apache Spark, Spring, Apache NiFi and more! We have multiple opportunities to build systems with capabilities that include machine learning, processing intensive analytics, novel algorithm …
with libraries such as Pandas, NumPy, and FastAPI. Experience with weather and climate datasets and tooling (e.g., Copernicus, Xarray, Zarr, NetCDF). Experience with ETL tools and frameworks (e.g., Apache Airflow, Apache NiFi, Talend). Strong understanding of relational databases and SQL. Experience with cloud platforms (e.g., AWS, GCP, Azure) and their data services. Familiarity with data …
data prep and labeling to enable data analytics. • Familiarity with various log formats such as JSON, XML, and others. • Experience with data flow, management, and storage solutions (e.g., Kafka, NiFi, and AWS S3 and SQS solutions) • Ability to decompose technical problems and troubleshoot both system and dataflow issues. • Must be certified DoD IAT II or higher (CompTIA Security+ highly … with Java, including unit and integration testing. • Python: Experience with Python is desired. • SQL: Familiarity with SQL schemas and statements. Tools and Technologies: • Data Flow Solutions: Experience with Kafka, NiFi, AWS S3, and SQS. • Version Control and Build Tools: Proficiency with Maven and GitLab. • Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. • Configuration Files: Experience using … YAML files for data model and schema configuration. • Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows. • AWS S3: bucket administration. • IDE: VSCode, IntelliJ/PyCharm, or other suitable Technical Expertise: • ETL creation and processing expertise. • Experience with code debugging concepts • Expertise in data modeling design, troubleshooting, and analysis from ingest to visualization.
source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - research and provide solutions to meet future growth or to eliminate occurring … source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using ETL and ELT technologies such … as Apache Kafka and Apache NiFi Experience in data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with cloud technologies and cloud computing platforms Experience with security and compliance Experience working in an Agile environment Qualifications: Must have an Active Secret clearance or …
an active Secret clearance; will also accept TS/SCI or TS/SCI with CI Polygraph Desired Experience: Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. Work could possibly require some on-call work. The Swift Group and Subsidiaries are an Equal Opportunity/Affirmative Action …
with Collaboration tools, such as Jira and Confluence Preferred Qualifications: - Working knowledge with software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi - Working knowledge with public keys and digital certificates - Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG - Experience with DevOps environments - Expertise in …
higher clearance such as Top Secret, TS/SCI, and/or TS/SCI with CI Polygraph Desired Experience: Experience with distributed data platforms and streaming tools (e.g., NiFi, Kafka) Hands-on experience with cloud platforms such as AWS or Azure Familiarity with containerization and orchestration (e.g., Docker, Kubernetes) Experience with NoSQL databases and full-text search engines …
be willing/able to help open/close the workspace during regular business hours as needed. Preferred Requirements • Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill, Impala, Trino, Presto, etc. • Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Benefits $152,000-$198,000 salary per …
mentor engineers, lead code reviews, and communicate technical decisions clearly. Bachelor's in Computer Science, Engineering, or related field. U.S. Citizenship required for clearance eligibility. Preferred Skills/Experience: NiFi, Apache Airflow, or data pipeline tools. Observability tools like Kibana; test frameworks like Selenium. Experience in classified/cloud-isolated environments (C2S, AC2SP). Familiarity with Kafka, Spark …
NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience with Apache NiFi. Demonstrated experience with the Extract, Transform, and Load (ETL) processes. Demonstrated experience managing and mitigating IT security vulnerabilities using Plans of Actions and Milestones (POAMs). Demonstrated experience …