automated testing tools and processes using Jenkins, Cypress, GitHub, and Jira. Knowledge of and experience with design paradigms such as microservice architecture, GraphQL, RESTful design, and stream processing (NiFi, Kinesis, etc.). Experience working with SQL databases (e.g., PostgreSQL, SQL Server) and writing stored procedures, functions, and triggers. Experience troubleshooting existing code to resolve application bugs and address technical …
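The listing above asks for stored procedures, functions, and triggers in PostgreSQL or SQL Server. As a hedged, minimal illustration of the trigger-and-function pattern, the sketch below uses Python's built-in SQLite (a lightweight stand-in, not either of those engines; the table, column, and function names are invented for the example):

```python
import sqlite3

# Illustrative schema only -- not from any listing. SQLite stands in for
# PostgreSQL/SQL Server here; real stored-procedure syntax differs per engine.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL NOT NULL);
CREATE TABLE audit_log (order_id INTEGER, amount REAL, note TEXT);

-- Trigger: record every insert into an audit table.
CREATE TRIGGER orders_audit AFTER INSERT ON orders
BEGIN
    INSERT INTO audit_log VALUES (NEW.id, NEW.amount, 'inserted');
END;
""")

# A user-defined scalar function callable from SQL (SQLite's rough analogue
# of a database function).
conn.create_function("with_vat", 1, lambda amount: round(amount * 1.2, 2))

conn.execute("INSERT INTO orders (amount) VALUES (100.0)")
total, = conn.execute("SELECT with_vat(amount) FROM orders").fetchone()
audit_rows = conn.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0]
print(total)       # 120.0
print(audit_rows)  # 1 -- the trigger fired on insert
```

The same shape carries over to PostgreSQL (`CREATE FUNCTION … LANGUAGE plpgsql` plus `CREATE TRIGGER`), though the syntax there is engine-specific.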
and stakeholders. Nice-to-have: familiarity with agile development methodologies like Scrum is beneficial; experience with Elastic Stack solutions, including indexing, searching, and managing data; familiarity with Niagara Files (NiFi) management; Ansible scripting. Some of your day-to-day activities include, but are not limited to: develops, maintains, and enhances complex and diverse big-data cloud systems based upon documented …
an equivalent combination of education and experience. Experience interfacing with databases using SQL or PL/SQL. Strong experience with Go or Java; basic C++ and Python. Experience with Kafka, NiFi, Redis, or RabbitMQ. Experience with GitLab, GitLab Runner, CI/CD, Docker, and Kubernetes. Experience developing RESTful APIs for complex datasets. Experience leading projects. AWS cloud experience. Understanding of microservices, client …
Role, You'll: • Design, build, and maintain secure, scalable data pipelines and services • Ingest, transform, and model structured and unstructured data for analytics and ML • Work with technologies like Apache Spark, Apache Iceberg, Trino, NiFi, OpenSearch, and AWS EMR • Ensure data integrity, lineage, and security across the entire lifecycle • Collaborate with DevOps to deploy containerized data solutions … premises environments (AWS GovCloud preferred) • Proven ability to process varied data types and formats with agility • Strong documentation and communication skills Bonus Points If You Have: • Experience with Elasticsearch, NiFi, and data mesh principles • Familiarity with data cataloging and metadata management • Background supporting intelligence or national security programs • A bachelor's degree in Computer Science or a related field (or …
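The "ingest, transform, and model structured and unstructured data" duty above can be sketched at toy scale in pure Python. This is a hedged stand-in for what a Spark or NiFi pipeline would do; the record formats and field names are invented for illustration:

```python
import json

# Two invented input shapes: a structured JSON record and a pipe-delimited
# text record, normalized into one common schema (ingest -> transform -> model).
raw_records = [
    '{"user": "alice", "event": "login", "ts": 1700000000}',
    'bob|logout|1700000100',
]

def parse(record):
    """Normalize heterogeneous inputs into one schema: user, event, ts."""
    if record.lstrip().startswith("{"):
        doc = json.loads(record)
        return {"user": doc["user"], "event": doc["event"], "ts": int(doc["ts"])}
    user, event, ts = record.split("|")
    return {"user": user, "event": event, "ts": int(ts)}

# Order the unified records by timestamp, as a modeling/analytics step might.
rows = sorted((parse(r) for r in raw_records), key=lambda r: r["ts"])
print([r["event"] for r in rows])  # ['login', 'logout']
```

A real pipeline would swap the in-memory list for a Spark DataFrame or a NiFi flow, but the normalize-then-order shape is the same.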
of hybrid environments utilizing multiple operating systems (GNU/Linux RHEL-based/Debian-based, Windows Server, BSD-based) - Develop and maintain dataflows and systems utilizing the technology stack of Apache NiFi, SQL, NoSQL, REST APIs, LDAP, and S3-compatible object storage - Develop and maintain backup systems utilizing S3-compatible object storage What Required Skills You'll Bring: - Active …
paced, fluid, and dynamic environment, working with cross-functional teams. 6. Demonstrated experience ensuring system performance, scalability, and reliability under high-demand conditions. Optional Requirements: 1. Demonstrated experience with Apache NiFi. 2. Demonstrated experience monitoring system performance and troubleshooting. 3. Demonstrated experience installing, configuring, testing, and maintaining operating systems, application software, and system management tools. 4. Demonstrated experience …
orchestration tools Nice-to-have: - Familiarity with agile development methodologies like Scrum is beneficial - Experience with Elastic Stack solutions, including indexing, searching, and managing data - Familiarity with Niagara Files (NiFi) management - Ansible scripting Security Clearance Required: TS/SCI with Poly About Avid Technology Professionals Avid Technology Professionals, LLC (ATP) is a premier provider of software and systems engineering …
or another language such as Python. Good knowledge of developing in a Linux environment. Working knowledge of Git version control and GitLab CI/CD pipelines. Experience working with Apache NiFi. Some exposure to front-end elements like JavaScript, TypeScript, or React. Some data interrogation with Elasticsearch and Kibana. Exposure to working with Atlassian products. Looking for a …
from an accredited college or university in a technical discipline. 4 additional years of experience may be substituted for a degree. Data Flow and Processing: hands-on experience with Apache NiFi for building and managing data flows. Search and Analytics: experience with Elasticsearch for powerful searching and data analytics. Containerization: proficiency with Docker for developing, shipping, and running …
technologies, including compute, database, storage, network virtualization, and security services. Working understanding of data Extraction, Transformation, and Loading (ETL) techniques and tools such as Azure Data Factory, AWS Glue, Apache NiFi/CFM, etc. Working understanding of network infrastructure, concepts of operation, and protocols such as TCP/IP, DNS, DHCP, SNMP, Syslog, etc. Demonstrated time management and …
Security+ or equivalent certification and willingness to serve as a privileged user supporting deployed systems testing • 2 years' experience with Kubernetes (Red Hat OpenShift or another variant) • Experience with NiFi • Requirements management • Maintaining systems documentation • Experience with cross-domain systems Benefits: complete insurance coverage (Blue Cross medical, Delta Dental, vision, life), 401(k) with company contribution, tuition reimbursement, generous paid …
City of London, London, United Kingdom Hybrid / WFH Options
Fortice Ltd
integrations between the data warehouse and other systems. Create deployable data pipelines that are tested and robust, using a variety of technologies and techniques depending on the available technologies (NiFi, Spark). Build analytics tools that utilise the data pipeline to provide actionable insights into client requirements, operational efficiency, and other key business performance metrics. Complete onsite client visits and …
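"Deployable data pipelines that are tested and robust" usually means pure transformation steps with an explicit contract, unit-tested in CI before deployment. A minimal sketch of that idea, with invented function and field names (a real stage might run in NiFi or Spark instead):

```python
# Illustrative pipeline stage: collapse duplicate records, keeping only the
# highest-version row per key. Pure function => trivially unit-testable.
def deduplicate_latest(rows, key, version):
    """Keep the highest-`version` row per `key`, in order of first appearance."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[version] > latest[k][version]:
            latest[k] = row
    return list(latest.values())

# The kind of unit test that would gate deployment in CI.
sample = [
    {"id": 1, "v": 1, "name": "old"},
    {"id": 1, "v": 2, "name": "new"},
    {"id": 2, "v": 1, "name": "only"},
]
result = deduplicate_latest(sample, key="id", version="v")
assert [r["name"] for r in result] == ["new", "only"]
print(len(result))  # 2
```

Keeping the transformation free of I/O is the design choice that makes a pipeline stage testable without standing up the warehouse.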
VMware General/Usage Technical Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (Scaled Agile) processes Data Integration Focused Data pipeline orchestration and ELT tooling, such as Apache Airflow, Spark, NiFi, Airbyte, and Singer. Message brokers and streaming data processors, such as Apache Kafka. Object storage, such as S3, MinIO, LakeFS. CI/CD pipeline, integration …
candidate will be well-versed in DevSecOps methodologies, have hands-on experience with GitLab (GTLB), and possess in-depth knowledge of cloud and automation tools including AWS, Elastic, and Apache NiFi. Familiarity with the Risk Management Framework (RMF) is required, and prior intelligence community (IC) experience is highly desirable. This position is ideal for a technical leader who thrives … and secure code scanning. Deploy, configure, and monitor systems and microservices within AWS environments. Integrate the Elastic Stack for logging, metrics, and operational observability. Build and manage data flows using Apache NiFi in support of secure data processing pipelines. Collaborate with security and compliance teams to align development practices with RMF requirements. Conduct security assessments and implement automated security … DoD environments. Strong proficiency in GitLab (GTLB) for CI/CD pipelines and code repository management. Hands-on experience with AWS, including deployment, monitoring, and security controls. Familiarity with Apache NiFi. Working knowledge of the Risk Management Framework (RMF) and how it applies to software development. Prior experience supporting DoD programs or projects. Excellent communication and documentation skills with …
improve existing operational data flow processing, distribution, and reliability. Requirements Position Required Skills: Experience using the Linux CLI. Experience creating, managing, and troubleshooting complex operational data flows. Experience using the Apache NiFi canvas to process and distribute data. Experience with Corporate data flow processes and tools. Experience with Corporate data security and compliance procedures and policies. Experience with the …
patterns. Experience working with cloud-based MDM solutions and SaaS platforms. Knowledge of data modeling and relational databases (SQL). Hands-on experience with ETL tools (e.g., Informatica, Talend, Apache NiFi) and data pipeline creation. Familiarity with security, privacy, and compliance requirements for master data. Strong communication and collaboration skills. Preferred Qualifications: Reltio Certification (e.g., Reltio Certified Professional …
software development experience with a focus on backend or data-oriented applications. • Proficiency in Python development. • Experience integrating new technology stacks into existing software environments. • Hands-on experience with Apache NiFi, Kafka, or Logstash. (Preferred) • Familiarity with SQL and relational database environments. (Preferred) • Experience building modular, scalable software systems. (Preferred) • Understanding of cybersecurity principles and secure software practices. …
substituted for a degree) • 15+ years of relevant experience in software development, ranging from work in a DevOps environment to full-stack engineering • Proficiency in the following technologies: • Java • Apache NiFi workflow configuration and deployment • Databases such as PostgreSQL and MongoDB • Python and machine learning • Docker • Kubernetes • Cloud-like infrastructure • Experience with Jenkins for pipeline integration and deployment …
Leverage AWS best practices to optimize code for cloud deployment, with cost-conscious development and deployment strategies. Requirements: Minimum of 3-5 years' experience with Java Spring Boot, management of NiFi workflows and clusters, Python regex and parsing, data engineering and ETL, AWS (Lambdas, EC2, EKS, RDS), and developing software within Agile methodologies. Preferred: Experience developing web APIs to interface with cloud applications …
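The "Python regex and parsing" requirement above typically means extracting structured fields from semi-structured text such as log lines. A minimal sketch with named groups (the log format and function name are invented for the example):

```python
import re

# Illustrative log format: "YYYY-MM-DD HH:MM:SS LEVEL message". Named groups
# make the extracted fields self-documenting.
LOG_PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) "
    r"(?P<message>.*)"
)

def parse_log_line(line):
    """Return a dict of named fields, or None if the line does not match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

record = parse_log_line("2024-05-01 12:00:00 ERROR connection refused")
print(record["level"])    # ERROR
print(record["message"])  # connection refused
```

In an ETL context the non-matching (`None`) branch would usually route the raw line to a dead-letter store rather than silently dropping it.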
experience in Data/ML engineering • Strong experience with ETL, data labeling, and data preparation • 5+ years of experience with one or more of the following technologies: Cloudera, Databricks, Snowflake, NiFi, Python, SQL, Kafka • Strong experience designing, developing, and deploying data solutions in classified AWS cloud environments up to IL6+ • Strong experience designing, implementing, and maintaining data architecture and services …
CERTIFICATIONS: Experience with Atlassian software products (Jira, Confluence, Service Desk, etc.) Experience with the ELK stack, OpenSearch, SonarQube, Cypress, PowerShell, C#, and Databricks Experience with Docker, SQL, Angular, Spring Boot, NiFi, AWS, Python, Scala, shell scripting, and XML processing Experience in AWS solution architecture SharePoint Microsoft Office Suite (Word, PowerPoint, Excel) Documentation tools and content management systems Technical writing standards …
experience briefing technical projects to peers and senior management. Demonstrated experience creating, building, and maintaining containers. Demonstrated experience deploying and maintaining Amazon Web Services. Demonstrated experience developing and maintaining NiFi clusters. Desirable Skills: Demonstrated experience providing consulting and advisory services for senior IT stakeholders. Demonstrated experience implementing and maintaining Elasticsearch environments. Demonstrated experience building, maintaining, and operating Terraform for …
Trip to either Nassau, Bahamas; Singer Island, Florida; Paradise Island, Bahamas; or the Cambridge Hyatt Resort Desired Skills: • Proficient in Java • Comfortable working in a Linux environment • Experience with open-source Apache Hadoop, Accumulo, and NiFi • Familiarity with context chaining and graph theory • Experience with containerization (Docker, Kubernetes) • Experience with enabling …