robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes
Cheltenham, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
active (West) Globally leading defence/cyber security company Up to £65k DoE - plus benefits and bonuses Cheltenham location – hybrid working model Experience required in Splunk/ELK, Linux, Apache NiFi, Java/Python, Docker/Kubernetes Who Are We? We are recruiting a Senior Support Engineer to work with a multi-national, industry-leading cyber security/… with tools like Splunk or the ELK stack. Strong ability to manage tasks proactively while adapting to shifting priorities. Proficiency in Linux server administration. Experience with technologies such as Apache NiFi, MinIO, and AWS S3. Skilled in managing and patching Java and Python applications. Familiarity with containerization tools like Docker or Podman and deployment platforms such as Kubernetes … hearing from you. SENIOR SUPPORT ENGINEER KEY SKILLS: SUPPORT ENGINEER/LINUX/UNIX/AWS/DOCKER/KUBERNETES/PYTHON/ANSIBLE/JAVA/ELK/APACHE/SPLUNK/APACHE NIFI/DV CLEARED/DV CLEARANCE/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/CHELTENHAM/SECURITY …
The role also involves optimizing database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines using Python, Bash, and Terraform. Preferred candidates will have experience with Apache Spark, Apache NiFi, data governance, and ETL standardization. Familiarity with Glue, Hive, and Iceberg or similar technologies is a plus. Tasks Performed: • Bridge communication between technical staff … data between systems, and optimize queries. • Plan and execute large-scale data migrations. • Improve database performance through architecture and tuning. • Create and maintain data flows using ETL tools like Apache NiFi. • Manage infrastructure as code using Python, Bash, and Terraform. • Integrate security into development and deployment workflows. • Build and support automated CI/CD pipelines. Education, Experience and Qualifications … experience with Data Quality and Data Governance concepts and experience. (Preferred) • Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. (Preferred) • Demonstrated experience with Apache Spark. (Preferred) Other Job Requirements: • Active Top Secret/SCI w/Full Scope Polygraph. • U.S. Citizenship
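As an illustration of the extract-transform-load pattern this listing describes (not the employer's actual stack), here is a minimal Python sketch using the standard-library sqlite3 module; the table and column names (raw_events, clean_events) are hypothetical placeholders.

```python
import sqlite3

def etl(source_db: str, target_db: str) -> int:
    """Copy rows from a source table into a cleaned target table."""
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(target_db)
    dst.execute(
        "CREATE TABLE IF NOT EXISTS clean_events (id INTEGER PRIMARY KEY, name TEXT)"
    )
    # Extract: pull raw rows from the source system.
    rows = src.execute("SELECT id, name FROM raw_events").fetchall()
    # Transform: normalise the name field.
    cleaned = [(rid, name.strip().lower()) for rid, name in rows]
    # Load: upsert into the target table.
    dst.executemany("INSERT OR REPLACE INTO clean_events VALUES (?, ?)", cleaned)
    dst.commit()
    src.close()
    dst.close()
    return len(cleaned)
```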
AWS OpenSearch Experience with AWS, Splunk, Databricks, and other Oracle/SQL based platforms. Experience with Python, Microsoft VBA, and Databricks Alteryx Designer Cloud (previously Trifacta) Microsoft PowerBI Apache Niagara Files (NiFi) Apache Tika Databricks and Lakehouse architecture ElasticSearch AWS SQS ElasticSearch and OpenSearch .NET, C#, JavaScript, and Java Terraform Experience developing and maintaining components in … AWS SpringBoot framework Experience with ELK stack, OpenSearch, SonarQube, Cypress, PowerShell, C#, and Databricks Experience with Docker, SQL, Angular, Spring Boot, NiFi, AWS, Python, Scala, shell scripting, and XML processing Experience in AWS solution architecture Experience in ElasticSearch, Vue.js, Java Spring Framework, and .NET Framework web services. Vue3, TypeScript, Jupyter Notebook, Scala, Databricks Node.js, Angular JS, HTML, CSS, Java … and other source control management systems Software development lifecycle (SDLC) methodologies Unit testing and test-driven development Frontend frameworks (Angular, React, Svelte) Data streaming and integration technologies such as Apache NiFi Infrastructure as Code (Terraform) GraphQL Microservices architecture Experience as a scrum participant and software release processes Available to work after hours when mission requires Communicate work using …
Shawnee Mission, Kansas, United States Hybrid / WFH Options
ECCO Select
mainly remote) Duration: Direct Hire Benefits: Medical/Dental/Vision/401k/PTO/Holidays Job Description: • Design, build, and maintain scalable data pipelines using tools like Apache NiFi, Airflow, or equivalent orchestration systems. • Work with structured and semi-structured data using SQL and NoSQL systems (e.g., PostgreSQL, MongoDB, Elasticsearch, Neo4j). • Develop services and integrations … data pipeline or ETL contexts; Python is a plus. • Proficiency with SQL and NoSQL databases, including query optimization and large dataset processing. • Familiarity with data integration tools such as Apache NiFi, Airflow, or comparable platforms. • Knowledge of RESTful API interactions, JSON parsing, and schema transformations. • Exposure to cloud environments (especially AWS: S3, EC2, Lambda) and distributed systems. • Comfortable …
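Where the listing asks for pipeline orchestration with Airflow or an equivalent system, a minimal DAG looks like the sketch below (assuming a recent Airflow 2.x; the dag_id and task bodies are placeholders, not the employer's pipeline).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull records from the source system")

def transform():
    print("clean and reshape the records")

def load():
    print("write the records to the target store")

# Three-step ETL DAG scheduled once a day; tasks run in sequence.
with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```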
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hybrid working when possible Must hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse sources, maintaining data integrity. Maintain … maintain optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), and ETL processing languages such as Groovy, Python or Java. To be Considered: Please either apply by clicking online …
location – full time on site when required Must hold active Enhanced DV Clearance (West) Circa £600 p/d inside IR35 Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: Design and develop data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi). Ensure data consistency, quality, and security across all processes. Create and maintain database schemas and data models. Integrate and enrich data from diverse sources, maintaining data integrity. Maintain … maintain optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full-time on-site in Gloucester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), and ETL processing languages such as Groovy, Python or Java. To be Considered: Please either apply by clicking online …
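For the Kafka ingestion experience both of the listings above call for, a minimal consumer sketch in Python might look like this (using the kafka-python client; the topic name and broker address are hypothetical placeholders).

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Consume JSON events from a topic and tag each one with its provenance.
consumer = KafkaConsumer(
    "raw-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    event["source_topic"] = message.topic  # simple enrichment step
    print(event)
```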
and tests. • Leverage development and design patterns to ensure the product's scalability, maintainability, and long-term success. • Understand API-driven microservice design patterns, NoSQL databases, data ingest tools (Apache NiFi), and modern web frameworks. • Maintain a team player mentality as a collaborative member of a fast-paced, structured 10-14 person team. Skills Requirements: • Proficiency in the … a 10+ person team environment. Nice to Haves: • NoSQL DBs (Mongo, ElasticSearch, Redis, Graph DB, etc.). • Data wrangling (Discovery, Mining, Cleaning, Exploration, Modeling, Structuring, Enriching, and Validating) with Apache NiFi or similar tools. • CI/CD (e.g., Jenkins), JUnit testing or similar. • Scripting with Bash, Python, and/or Groovy. YOE Requirement: 3 yrs., B.S. in a …
of high velocity bandwidth, flash speed disk, high density multi-CPU architectures, and extreme memory footprints. You will be working with the latest technologies including Java, Kafka, Kubernetes, Docker, Apache Accumulo, Apache Spark, Spring, Apache NiFi and more! We have multiple opportunities to build systems with capabilities that include machine learning, processing intensive analytics, novel algorithm …
CERTIFICATIONS: Experience with Atlassian Software products (JIRA, Confluence, Service Desk, etc.) Experience with ELK stack, OpenSearch, SonarQube, Cypress, PowerShell, C#, and Databricks Experience with Docker, SQL, Angular, Spring Boot, NiFi, AWS, Python, Scala, shell scripting, and XML processing Experience in AWS solution architecture Maintaining the Apache Hadoop Ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux … shell scripting, Perl, Python, and Apache Airflow. AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS, SQS, SNS, and Systems Manager. Experience in mobile platform development, such as web mobile, Android, microservices Working knowledge using proxies and proxy creation Vue.js, ASP.NET (C#), Node.js, React, JavaScript, HTML, CSS, PostgreSQL, Liquibase, Elasticsearch, and Git. Vue.js, .NET, Postgres, Oracle DB AWS CloudFormation Maintenance … modeling, and advanced analytics Databricks and Lakehouse architectures AWS OpenSearch Experience with AWS, Splunk, Databricks, and other Oracle/SQL based platforms. Experience with Python, Microsoft VBA, and Databricks Apache Niagara Files (NiFi) Apache Tika Databricks and Lakehouse architecture ElasticSearch AWS SQS Informatica and custom software components ElasticSearch and OpenSearch .NET, C#, JavaScript, and Java, Python Terraform …
and Agile Release Train (ART) activities Required Technical Skills Expert-level proficiency with: • Python for data processing and ETL workflows • SQL (MySQL, PostgreSQL, Microsoft SQL) • Elasticsearch • Data pipeline technologies (Apache Kafka, Apache NiFi, Cribl) • Splunk for data analysis and visualization • Containerization technologies (Docker, Kubernetes) • Infrastructure as Code (Terraform) • Cloud platforms (AWS GovCloud, SC2S, C2S) • Windows and Linux …
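As a sketch of the Python-to-Elasticsearch workflow this listing implies (assuming the elasticsearch-py 8.x client; the host, index name, and document fields are hypothetical):

```python
from elasticsearch import Elasticsearch  # pip install elasticsearch

# Index one processed record so it can be searched and visualised.
es = Elasticsearch("http://localhost:9200")
doc = {"pipeline": "ingest", "status": "processed", "records": 42}
resp = es.index(index="pipeline-metrics", document=doc)
print(resp["result"])  # "created" on first insert
```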
from an accredited college in a related discipline YEARS OF EXPERIENCE: 8 Years Minimum SKILLS/CERTIFICATIONS: Experience with Atlassian Software products (JIRA, Confluence, Service Desk, etc.) Maintaining the Apache Hadoop Ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS … SQS, SNS, and Systems Manager. Vue.js, ASP.NET (C#), Node.js, React, JavaScript, HTML, CSS, PostgreSQL, Liquibase, Elasticsearch, and Git. Ansible Apache Niagara Files (NiFi) Apache Tika Databricks and Lakehouse architecture ElasticSearch, Splunk AWS SQS Ability to deliver an advanced visual analytic application to include developing data analytics for desktop and web-developed visual analytic software; facilitating the bulk analysis …
data prep and labeling to enable data analytics. • Familiarity with various log formats such as JSON, XML, and others. • Experience with data flow, management, and storage solutions (e.g. Kafka, NiFi, and AWS S3 and SQS solutions) • Ability to decompose technical problems and troubleshoot both system and dataflow issues. • Must be certified DoD IAT II or higher (CompTIA Security+ highly … with Java, including unit and integration testing. • Python: Experience with Python is desired. • SQL: Familiarity with SQL schemas and statements. Tools and Technologies: • Data Flow Solutions: Experience with Kafka, NiFi, AWS S3, and SQS. • Version Control and Build Tools: Proficiency with Maven and GitLab. • Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. • Configuration Files: Experience using … YAML files for data model and schema configuration. • Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows. • AWS S3: bucket administration. • IDE: VSCode, IntelliJ/PyCharm, or other suitable Technical Expertise: • ETL creation and processing expertise. • Experience with code debugging concepts • Expertise in data modeling design, troubleshooting, and analysis from ingest to visualization.
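This listing mentions YAML files for data model and schema configuration; here is a minimal sketch with PyYAML (the schema layout below is a hypothetical example, not the project's real format).

```python
import yaml  # pip install pyyaml

SCHEMA_YAML = """
model: event
fields:
  - {name: id, type: string, required: true}
  - {name: timestamp, type: datetime, required: true}
  - {name: payload, type: json, required: false}
"""

# Parse the schema and derive the list of required field names.
schema = yaml.safe_load(SCHEMA_YAML)
required = [f["name"] for f in schema["fields"] if f["required"]]
print(f"model={schema['model']} required={required}")
```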
source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - researches and provides solutions to meet future growth or to eliminate occurring … source tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using ETL and ELT technologies such … as Apache Kafka and Apache NiFi Experienced in data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with cloud technologies and cloud computing platforms Experience with security and compliance Experience working in an Agile environment Qualifications: Must have an Active Secret clearance or …
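On the ingest side of the Kafka-based ETL/ELT pipelines this listing names, a minimal producer sketch complements the consumer shown earlier (kafka-python again; the topic and broker are placeholders).

```python
import json

from kafka import KafkaProducer  # pip install kafka-python

# Publish a raw record for downstream transformation.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)
producer.send("raw-records", {"id": 1, "payload": "example"})
producer.flush()  # block until the record is delivered
```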
McLean, Virginia, United States Hybrid / WFH Options
ECCO Select
interdisciplinary teams. Awareness of ethical considerations and responsible AI practices. Excellent problem-solving skills, attention to detail, and ability to thrive in a fast-paced, collaborative environment. Familiarity with NiFi, Airflow, or other data pipeline orchestration tools (not required, nice to have). Exposure to Kubernetes, IAM/security models, or Spark/Kafka environments (not required, nice to …
with Collaboration tools, such as Jira and Confluence Preferred Qualifications: Working knowledge with software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi, Working knowledge with public keys and digital certificates Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG Experience with DevOps environments Expertise in …
Washington, Washington DC, United States Hybrid / WFH Options
Gridiron IT Solutions
with Collaboration tools, such as Jira and Confluence Preferred Qualifications: Working knowledge with software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi, Working knowledge with public keys and digital certificates Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG Experience with DevOps environments Expertise in …
Washington, Washington DC, United States Hybrid / WFH Options
Gridiron IT Solutions
/C2S). Familiar with Amazon Web Services (AWS) managed services. Working knowledge with software platforms and services, such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar. Proficient experience utilizing JavaScript, Elasticsearch, JSON, SQL, XML. Working knowledge with datastores MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis. Familiar with Linux …
Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience with Apache NiFi. Demonstrated experience with the Extract, Transform, and Load (ETL) processes. Demonstrated experience managing and mitigating IT security vulnerabilities using Plans of Actions and Milestones (POAMs). Demonstrated experience …
data solutions. • Work with relational, NoSQL, and cloud-based data platforms (PostgreSQL, MySQL, MongoDB, Elasticsearch, AWS/GCP data services). • Support data integration and transformation using modern tools (Apache NiFi, Kafka, ETL pipelines). • Contribute to DevOps and test automation processes by supporting CI/CD pipelines, version control (Git), and containerized/cloud environments. • Perform data …