InfoBlox experience is highly desirable). Excellent problem-solving, analytical, and communication skills. Desired Competencies: Understanding of cross-domain technologies (Tiger Traps, Garrison, OpsWat, NiFi). Familiarity with VMware infrastructure technologies. Experience using Elastic for monitoring and analytics. This is an exciting opportunity to work with cutting-edge network …
to Twenty-five (25)+ years of relevant experience. Experience with Linux and Ciena. Experience with cables and collection systems. Understanding of data flow; NiFi, Jira, and Confluence. Clearance Required: Must have an active TS/SCI with Polygraph. At Leidos, the opportunities are boundless. We challenge our staff …
Experience with configuration management workflows. Experience supporting integration testing. Must currently be DoD 8570 compliant IAW IAT Level II. Desired: Dataflow skills/experience - NiFi, ActiveMQ. Experience with SQL and NoSQL databases. Networking skills that include WANs, LANs, routers, and switches for troubleshooting. Hands-on experience of …
Washington, Washington DC, United States Hybrid / WFH Options
M9 Solutions
M9 Solutions is dedicated to providing IT services and solutions to the Federal Government by mobilizing the right people, skills, clearance levels, and technologies to help organizations that desire improved performance and modern, sustainable change. M9 …
/GCP), DWH services (Redshift, Databricks, etc.), cloud storage (Azure Storage, S3); Strong experience with ETL/ELT/orchestration tools (Airflow, ADF, Glue, NiFi); Drive the design and implementation of data warehouses and data lakes; Proficient in code versioning (Git) and building CI/CD for data projects … Experience with requirement gathering and documentation; Nice to have: Experience with NoSQL; Experience with stream processing (e.g., Apache Kafka, Apache Flink, Spark Structured Streaming). Responsibilities: Collaborate with business stakeholders and technical teams to understand and analyze data requirements; Lead the design and implementation of data models …
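Several of these listings name Airflow as the orchestration layer. As a point of reference, here is a minimal sketch of the extract-then-load pattern such roles describe, written against the Airflow 2.x Python API; the DAG id, task names, and callables are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_staging(**context):
    # Placeholder extract step: pull one day's partition from the source
    # system and land it in cloud storage (S3, Azure Storage, etc.).
    print(f"extracting partition for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder load step: copy the staged partition into the warehouse
    # (Redshift, Databricks, etc.).
    print(f"loading partition for {context['ds']}")


with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # older Airflow versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_staging)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load  # load runs only after a successful extract
```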
map scalable ELT/ETL flows (batch and streaming) Interface between Data Analysts and Data Scientists Enrich data and load into a big data environment (Apache Hadoop on Cloudera) Profile Technical education (Computer Science HTL, Computer Science degree, Data Science, etc.) Experience in data modeling (relational databases and Apache Hadoop) Know-how of core technologies such as SQL (MS SQL Server), Apache Hadoop (Kafka, NiFi, Flink, Scala and/or Java, Python) and Linux Interest in working with high-frequency data processing in a near-real-time environment We offer State-of-the-art technologies We …
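The near-real-time work this posting points at usually begins with a Kafka consumer. A minimal Python sketch using the kafka-python client; the topic, broker address, and enrichment step are illustrative rather than taken from the posting:

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Subscribe to a hypothetical topic of high-frequency events.
consumer = KafkaConsumer(
    "sensor-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
)

for message in consumer:
    event = message.value
    # Enrich each record before it is loaded into the Hadoop environment,
    # e.g. attach the broker-side timestamp (milliseconds since epoch).
    event["ingest_ts"] = message.timestamp
    print(event)
```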
Join us. The world can't wait. You Have: 7+ years of experience working on contracts for the federal government Experience with data flow, Apache NiFi, or mission systems Experience defining and applying data tagging and handling requirements Experience analyzing and defining data schemas and data processing requirements …
of wired and wireless computer network devices (routers, switches, and firewalls). 3+ years of experience with dataflow configuration and tools such as NiagaraFiles (NiFi) 3+ years of experience with Windows Active Directory, Windows Operating System (OS), Linux OS, Virtual Machine tools (VMware) 1+ year of experience with hardware …
Kaizen Approach is currently looking to hire an Applications Engineer (Senior) responsible for designing software tools and subsystems to support software reuse and domain analyses while managing their implementation. This role involves overseeing software development and support using formal specifications …
for each project, including ETL mappings, a code use guide, code location, and access instructions. Design and optimize data pipelines using tools such as Spark, Apache Iceberg, Trino, OpenSearch, EMR cloud services, NiFi, and Kubernetes containers Ensure the pedigree and provenance of the data is maintained such that the … data storage and processing solutions Python, SQL, Spark, and other data engineering programming COTS and open-source data engineering tools such as Elasticsearch and NiFi Processing data within the Agile lifecycle Benefits at Recro 100% paid medical, dental, and vision 401k - 6% matching and 401k profit sharing PTO …
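For a sense of what "design and optimize data pipelines using Spark and Apache Iceberg" looks like in practice, a minimal PySpark sketch; it assumes an Iceberg catalog named "lake" is already configured on the cluster, and the paths and table names are hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Read raw landed files, apply simple cleanup, and append the result to
# an existing Iceberg table via the DataFrameWriterV2 API (Spark 3+).
raw = spark.read.parquet("s3://landing-zone/events/")
cleaned = raw.dropDuplicates(["event_id"]).filter("event_ts IS NOT NULL")
cleaned.writeTo("lake.analytics.events").append()
```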
Hanover, Maryland, United States Hybrid / WFH Options
Lockheed Martin
or GICSP or GSEC or Security+ CE or CISSP or CND or SSCP) • Data flow experience (such as, but not limited to, NiFi) • Systems infrastructure knowledge of routers, firewalls, VLANs, servers, etc. • Tools: able to pick up new customized monitoring tools above and beyond ITOPS and …
such as PostgreSQL. Extensive experience using Java for data processing, manipulation, or querying (SQL or NoSQL). ETL/data integration experience using Spring, NiFi, Kafka, and Elasticsearch. Experience with development in commercial cloud platforms (e.g., AWS, Google Cloud, Azure). Master's with 15 or more years of relevant … the customer's systems. Understanding of the customer's system development policies. Additional certifications related to ETL and data integration tools. Extensive experience with NiFi and Elasticsearch. At Leidos, the opportunities are boundless. We challenge our staff with interesting assignments that allow them to thrive professionally and personally. For …
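The posting's stack is Java/Spring, but the Kafka-to-Elasticsearch integration pattern it names can be sketched in a few lines of Python; the topic, index name, host, and batch size are hypothetical:

```python
import json

from elasticsearch import Elasticsearch, helpers  # pip install elasticsearch
from kafka import KafkaConsumer                   # pip install kafka-python

es = Elasticsearch("http://localhost:9200")
consumer = KafkaConsumer(
    "documents",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append({"_index": "documents", "_source": message.value})
    if len(batch) >= 500:
        helpers.bulk(es, batch)  # bulk-index the batch into Elasticsearch
        batch = []
```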
including installation, configuration, and maintenance Troubleshooting Skills: Strong troubleshooting skills to diagnose and resolve issues within a Linux environment Desired: Git/GitLab experience Apache open-source products (e.g., NiFi, Kafka, ZooKeeper) DevSecOps Risk Management Framework implementation Software development experience (e.g., C++, Java, C) Basic Compensation: $136,000 …
billions of edges) using DynamoDB or new enhanced capabilities. • Experience developing and operating graph traversal capabilities using data graphing tool traversal capabilities built upon Apache Gremlin or new enhanced capabilities. • Experience developing and operating NoSQL solutions to complex big data applications. • Experience in data modeling for performance, partition sharding … Experience building high-quality user interfaces/user experiences with the React framework and WebGL. • Experience designing and operating large-scale graph databases using Apache Cassandra. • Experience performing in-depth technical analysis of large-scale graph databases to develop implementation strategies for search optimizations. • Experience developing technical capabilities for … designing, building, and operating big data systems, such as persistence, partitioning, and indexing, at a scale of trillions of records/events. • Experience with Niagara Files (NiFi) applications or new enhanced capabilities. • Experience developing and operating Kubernetes infrastructure. • Experience supporting engineering efforts that will contribute to delivery of capabilities such as …
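As context for the Gremlin requirement above, a minimal graph-traversal sketch using the gremlinpython client; the endpoint, vertex label, and property names are hypothetical:

```python
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
from gremlin_python.process.anonymous_traversal import traversal
from gremlin_python.process.graph_traversal import __  # pip install gremlinpython

# The endpoint would point at whatever Gremlin-compatible server fronts
# the graph store.
conn = DriverRemoteConnection("ws://localhost:8182/gremlin", "g")
g = traversal().withRemote(conn)

# Two-hop traversal: names of accounts reachable within two
# "connected_to" edges of a starting vertex.
names = (
    g.V().has("account", "account_id", "A-1001")
    .repeat(__.out("connected_to")).times(2)
    .values("name")
    .dedup()
    .toList()
)
print(names)
conn.close()
```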
tools. Experience employing spreadsheets for data manipulation and visualization. Microsoft Certified Solutions Developer (MCSD)/Microsoft Certified Solutions Expert (MCSE)/Private Cloud/Certified Administrator for Apache Hadoop (CCAH) (Cloudera) Experience with programs and projects and other enterprise initiatives for efforts in the RDT&E phase of the acquisition life cycle … based collaboration tools/platforms for all engineering and business operations efforts in the RDT&E phase of the acquisition life cycle Experience with NiFi and Kafka Education and Certification Requirements: Bachelor's degree, STEM, required Relevant industry certification such as: CompTIA Cloud Essentials/Microsoft Technical Associate (MTA)/… of Cloud Security (CCSK)/CompTIA A+/CompTIA Security+/EMC Data Science Associate (EMCDSA)/Cloudera Certified Data Scientist (CCDH)/Certified Apache Hadoop Developer (HCAHD) (Hortonworks)/Certified Information Systems Security Professional (CISSP)/Certified Cloud Professional (CCP) (Cloudera)/Microsoft Certified Professional Developer (MCPD)/…
ll play a key role in building and optimising data pipelines to support trading operations. Key Responsibilities: Build scalable data pipelines with Python and Apache Airflow Develop data models and transformations using dbt Optimise complex SQL queries and performance-tune databases Work with MPP databases like Redshift, Greenplum, or … or trading, or a strong interest in the domain. Familiarity with AWS infrastructure (Athena, Parquet, Lakehouse architecture, etc.). Experience with Postgres, ClickHouse, and Apache NiFi. Knowledge of containerisation and building Docker-based solutions. Experience with monitoring tools like Prometheus, Grafana, or Zabbix. Package: €60,000 to …
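The Airflow-plus-dbt pairing this role centres on is commonly wired together by having Airflow shell out to the dbt CLI. A minimal sketch; the project path, model selector, and schedule are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run the models, then the data tests, each as a dbt CLI invocation.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/analytics/dbt_project && dbt run --select marts",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/analytics/dbt_project && dbt test --select marts",
    )
    dbt_run >> dbt_test
```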
Testing: Bamboo, Jenkins, GitLab CI/Pipelines Familiarity with microservices development techniques and container orchestration (e.g., Kubernetes) Desired Skills You'll Bring: Apache Hadoop, Accumulo, or NiFi One or more of the following certifications: Cloudera Certified Professional (CCP) Data Engineer, Elastic Certified Observability Engineer, Certified Kubernetes …
orchestration for the data ingest pipeline to perform API service development and updates. Shall use the following technologies: Relational Data Stores (e.g., Oracle 21c), NiFi, Kafka, Elastic MapReduce (EMR) HBase, Elastic, Splunk, Java, Python, and Spring to instrument and update the Data Catalog for data metrics, using Splunk and … and deploying code in a cloud-based environment. Experience using Java, Python, Spring, Kafka, Elastic, and EMR HBase DESIRED QUALIFICATIONS Experience using Jira, Elasticsearch, NiFi, MySQL, Oracle, Kubernetes or other container solution, or Apache Spark. Overview This role requires an active Top Secret/SCI + Poly …
Responsibilities & Qualifications RESPONSIBILITIES: Implement registry updates Configure Niagara Files (NiFi) processors/processor groups Route, transform, and enrich ad-hoc dataflows Manage NiFi clusters Reverse-engineer customer dataflows and troubleshoot Respond to customer inquiries Advise customers on strategies to more efficiently transport, store, and manage data Comply with … Codes Experience using Ubuntu Linux or CentOS to ssh, navigate, move files, find/filter file contents, and troubleshoot application logs Good understanding of NiFi flow file attributes, dataflow processes, and system-to-system connections Experience with FileZilla Familiarity with Pressurewave Experience with data formats such as XML, JSON …
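To ground the flow-file attribute work this posting describes, a minimal script body for NiFi's ExecuteScript processor written in Jython (Python syntax); the attribute names are hypothetical, while `session` and `REL_SUCCESS` are objects NiFi injects into the script:

```python
# Read one flow file, tag it based on an existing attribute, and route
# it to the processor's success relationship.
flowFile = session.get()
if flowFile is not None:
    source = flowFile.getAttribute("source.system") or "unknown"
    flowFile = session.putAttribute(flowFile, "route.hint", source)
    session.transfer(flowFile, REL_SUCCESS)
```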
Annapolis Junction, Maryland, United States Hybrid / WFH Options
Lockheed Martin
programming languages such as Java. TS/SCI with Poly Desired Skills: Data analytic development experience Agile development experience Familiarity with/interest in Apache Hadoop MapReduce Python experience AWS Lambda experience Jira experience Confluence experience GitLab experience Exposure or experience with NiFi Willingness/desire to work …