support the US military and space exploration. Hydra-70 illuminating flares • Used to illuminate the battlefield for target detection and identification • Fired from aircraft such as the US Army Apache and the US Air Force F-16 Fighting Falcon • The Hydra-70 rocket system has a MK66 rocket motor and various warheads • The Hydra-70 illumination rounds were first …
Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet …
properties, traversals, and relationship patterns essential for effective graph database design • Graph Query Language Proficiency: Demonstrated expertise in one or more graph query languages such as Cypher (Neo4j), Gremlin (Apache TinkerPop), or SPARQL, for complex data retrieval and manipulation • Advanced Graph Data Modeling: Experience translating business requirements into optimized graph schemas, including property graphs, RDF triples, or knowledge graph …
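The pattern-matching idea behind query languages like Cypher can be sketched without a database at all: a property graph is just labeled nodes plus typed edges, and a query is a filter over that structure. The sketch below is illustrative only — the labels, relationship types, and data (`Person`, `WORKS_AT`, etc.) are invented, not from the posting, and this is not the Neo4j API.

```python
# Minimal in-memory property graph: nodes carry a label and properties,
# edges carry a relationship type -- the concepts a Cypher pattern like
# MATCH (p:Person)-[:WORKS_AT]->(c:Company) operates on.
nodes = {
    1: {"label": "Person", "name": "Ada"},
    2: {"label": "Person", "name": "Grace"},
    3: {"label": "Company", "name": "Initech"},
}
edges = [(1, "WORKS_AT", 3), (2, "KNOWS", 1)]

def match(src_label, rel_type, dst_label):
    """Return (src, dst) property dicts for every edge matching the pattern."""
    return [
        (nodes[s], nodes[d])
        for s, rel, d in edges
        if rel == rel_type
        and nodes[s]["label"] == src_label
        and nodes[d]["label"] == dst_label
    ]

# Equivalent of: MATCH (p:Person)-[:WORKS_AT]->(c:Company) RETURN p.name, c.name
for person, company in match("Person", "WORKS_AT", "Company"):
    print(person["name"], "->", company["name"])  # Ada -> Initech
```

A real graph engine indexes these patterns rather than scanning every edge, but the data model is the same.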
Shawnee Mission, Kansas, United States Hybrid / WFH Options
ECCO Select
mainly remote) Duration: Direct Hire Benefits: Medical/Dental/Vision/401k/PTO/Holidays Job Description: • Design, build, and maintain scalable data pipelines using tools like Apache NiFi, Airflow, or equivalent orchestration systems. • Work with structured and semi-structured data using SQL and NoSQL systems (e.g., PostgreSQL, MongoDB, Elasticsearch, Neo4j). • Develop services and integrations using … data pipeline or ETL contexts; Python is a plus. • Proficiency with SQL and NoSQL databases, including query optimization and large dataset processing. • Familiarity with data integration tools such as Apache NiFi, Airflow, or comparable platforms. • Knowledge of RESTful API interactions, JSON parsing, and schema transformations. • Exposure to cloud environments (especially AWS: S3, EC2, Lambda) and distributed systems. • Comfortable with …
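The "JSON parsing and schema transformations" bullet describes a very common pipeline step: taking a semi-structured JSON record and flattening it into a fixed schema a SQL table can hold. A minimal stdlib-only sketch, with all field names invented for illustration:

```python
import json

# Example nested record of the kind an API or NoSQL store might emit.
raw = '{"id": 7, "user": {"name": "kim", "geo": {"city": "KC"}}, "tags": ["a", "b"]}'

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten nested dicts; lists are serialized as JSON strings."""
    flat = {}
    for key, value in record.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, full_key, sep))
        elif isinstance(value, list):
            flat[full_key] = json.dumps(value)
        else:
            flat[full_key] = value
    return flat

row = flatten(json.loads(raw))
print(row)  # {'id': 7, 'user_name': 'kim', 'user_geo_city': 'KC', 'tags': '["a", "b"]'}
```

In an orchestrated pipeline (NiFi, Airflow), a step like this typically sits between extraction and the load into PostgreSQL or a warehouse.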
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies
hybrid cloud/on-prem architecture, AWS, C2S, OpenStack, etc. Experience with some big data technologies such as Kubernetes, Spark, Hive, and/or Hadoop, Accumulo, Elasticsearch Experience with Apache NiFi, Apache Airflow, or Kafka An adaptable and solution-centric mindset that embraces technology enablers. Familiarity with common industry software tools, concepts, and DevSecOps Experience working with open …
and grow within a broad technology stack. Key Responsibilities Administer, manage, and optimise Linux servers (Ubuntu & CentOS) across physical, virtual, and cloud environments. Configure, maintain, and secure web servers (Apache and Nginx) hosting PHP applications. Implement and manage server security using firewalls (UFW, iptables) and tools such as Fail2Ban. Support and maintain database systems (MySQL and MariaDB). Provide …
system downtime • Participate in weekly status and technical meetings Hands-on expertise in performing the following administration functions: • Linux server administration • RSA configuration • Networking • Mid-level scripting Experience with Apache, Bitbucket, and macOS Experience with cloud architectures and technologies Excellent in-person and written communication to include providing executive-level summaries of technical descriptions, recommendations, and issues Forward …
Gerrards Cross, Buckinghamshire, England, United Kingdom
Ikhoi Recruitment
hands-on and enthusiastic person who is quick to learn You'll have at least 2 years 2nd line support experience Experience working with service desk ticketing tools (Jira) Apache or similar server experience Work effectively with a high degree of autonomy Excellent interpersonal and communication skills and enjoy working in a fast-paced environment You will be working …
the AH-64E IETM, AH-64E Operators Manual, AH-64E Maintenance Test Flight Manual, and AH-64E Check-List Manual. Required Skills: • Successfully Completed United States Army AH-64E, Apache Longbow helicopter Aircraft Qualification Course (AQC) or a Contractor equivalent training and qualification approved by the Government Flight Representative (GFR) • Possess a minimum of 1,000 flight hours as …
Test Flight Manual, and AH-64E Check-List Manual. Serve as Pilot in Command (PIC). Basic Qualifications (Required Skills/Experience): Successfully Completed United States Army AH-64E, Apache Longbow helicopter Aircraft Qualification Course (AQC) or a Contractor equivalent training and qualification approved by the Government Flight Representative (GFR) Successfully completed the AH-64 Aviation Maintenance Officers Course …
Dell machines/Windows 10 workstations - Ideally has understanding of ACAS scans and has done STIGs Nice to Haves - Have knowledge of corporate services including: DNS, SMTP, RHEV, Splunk, Apache - Demonstrated experience managing the installation and maintenance of IT infrastructure - Hardware experience with Dell systems is a plus - Experience working in an environment with rapidly changing job priorities - ServiceNow …
Washington, Washington DC, United States Hybrid / WFH Options
Gridiron IT Solutions
properties, traversals, and relationship patterns essential for effective graph database design Graph Query Language Proficiency: Demonstrated expertise in one or more graph query languages such as Cypher (Neo4j), Gremlin (Apache TinkerPop), SPARQL, or GraphQL for complex data retrieval and manipulation Advanced Graph Data Modeling: Experience translating business requirements into optimized graph schemas, including property graphs, RDF triples, or knowledge …
to see: Open Source Contributions Knowledge of DevOps, Continuous Integration, Automated Testing, and Deployment tools SRE Mindset (debugging, performance, low-latency) Experience working in a Linux environment Experience with Apache Solr or Apache Spark Experience with system design, especially systems that process data at scale Salary Range: $160,000 - $240,000 USD annually + Benefits + Bonus The referenced salary …
of high velocity bandwidth, flash speed disk, high density multi-CPU architectures, and extreme memory footprints. You will be working with the latest technologies including Java, Kafka, Kubernetes, Docker, Apache Accumulo, Apache Spark, Spring, Apache NiFi and more! We have multiple opportunities to build systems with capabilities that include machine learning, processing intensive analytics, novel algorithm development …
Title: GCP Data Engineer Location: Philadelphia, PA (candidates willing to relocate may be submitted) GCP Data Engineer - GCP Dataflow and Apache Beam (key skills) Primary Skills: PySpark, Spark, Python, Big Data, GCP, Apache Beam, Dataflow, Airflow, Kafka and BigQuery GFO, Google Analytics; JavaScript is a must Strong experience with Dataflow and BigQuery Should have experience leading the team … Platforms (preferably GCP) provided Big Data technologies Hands-on experience with real-time streaming processing as well as high-volume batch processing, and skilled in advanced SQL, GCP BigQuery, Apache Kafka, Data Lakes, etc. Hands-on experience in Big Data technologies - Hadoop, Hive, and Spark, and an enterprise-scale Customer Data Platform (CDP) Experience in at least one programming …
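The Dataflow/Beam skill set named above centers on composing a pipeline from transforms (read, flat-map, group, count). A dependency-free sketch of those same stages using plain Python — this mimics the *shape* of a Beam word-count pipeline, but it is not the Beam API, and the input lines are invented:

```python
from collections import Counter
from itertools import chain

# Toy input standing in for a PCollection read from a source.
lines = ["big data on gcp", "beam runs on dataflow", "big queries in bigquery"]

# Stage 1: split every line into words (roughly beam.FlatMap(str.split)).
words = chain.from_iterable(line.split() for line in lines)

# Stage 2: group and count per element (roughly beam.combiners.Count.PerElement()).
counts = Counter(words)

print(counts["big"])  # 2
```

In actual Beam the same stages are declared with `|` pipe syntax and handed to a runner (DirectRunner locally, Dataflow on GCP), which handles distribution and batch vs. streaming execution.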
systems, with a focus on data quality and reliability. Design and manage data storage solutions, including databases, warehouses, and lakes. Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and … ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical and non-technical teams. … Additional Strengths Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent company culture founded on Integrity, Innovation and …
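The "real-time data processing" strength mentioned above usually means windowed aggregation of the kind stream processors like Apache Flink perform. A hedged stdlib sketch of a tumbling window — events carry a timestamp and are counted into fixed 10-second buckets; the event values and window size are invented for illustration:

```python
from collections import defaultdict

WINDOW_SECONDS = 10
# (event_time_seconds, payload) pairs standing in for a live stream.
events = [(1, "click"), (4, "view"), (12, "click"), (13, "click"), (27, "view")]

windows = defaultdict(int)
for ts, _payload in events:
    # Assign each event to the tumbling window containing its timestamp.
    window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
    windows[window_start] += 1

print(dict(windows))  # {0: 2, 10: 2, 20: 1}
```

A production engine adds the hard parts this sketch omits: out-of-order events, watermarks, and state that survives restarts.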
The role also involves optimizing database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines using Python, Bash, and Terraform. Preferred candidates will have experience with Apache Spark, Apache NiFi, data governance, and ETL standardization. Familiarity with Glue, Hive, and Iceberg or similar technologies is a plus. Tasks Performed: • Bridge communication between technical staff and … data between systems, and optimize queries. • Plan and execute large-scale data migrations. • Improve database performance through architecture and tuning. • Create and maintain data flows using ETL tools like Apache NiFi. • Manage infrastructure as code using Python, Bash, and Terraform. • Integrate security into development and deployment workflows. • Build and support automated CI/CD pipelines. Education, Experience and Qualifications … experience with Data Quality and Data Governance concepts and experience. (Preferred) • Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. (Preferred) • Demonstrated experience with Apache Spark. (Preferred) Other Job Requirements: • Active Top Secret/SCI w/Full Scope Polygraph. • U.S. Citizenship and …
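The data-quality and ETL-standardization duties above often reduce to a row-level validation gate that runs before data is loaded. A minimal sketch, assuming invented column names and rules — real deployments would express these as NiFi processors or a framework like Great Expectations rather than hand-rolled lambdas:

```python
# Hypothetical rule set: column name -> predicate the value must satisfy.
rules = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def validate(rows):
    """Split rows into (valid, rejected) against the rule set."""
    valid, rejected = [], []
    for row in rows:
        ok = all(rule(row.get(col)) for col, rule in rules.items())
        (valid if ok else rejected).append(row)
    return valid, rejected

good, bad = validate([
    {"id": 1, "email": "a@x.io"},
    {"id": -2, "email": "broken"},
])
print(len(good), len(bad))  # 1 1
```

Rejected rows are typically routed to a quarantine table for review rather than dropped, which is where the governance requirements come in.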
implementation/installation and other applicable technologies. c. Cloud and Distributed Computing Information Retrieval: at least one or a combination of several of the following areas - HDFS, HBase, Apache Lucene, Apache Solr, MongoDB d. Ingesting, Parsing and Analysis of Disparate Data Sources and Formats: XML, JSON, CSV, binary formats, Sequence or Map Files, Avro and related …
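The information-retrieval area named above (Lucene, Solr) is built on one core structure: the inverted index, mapping each term to the documents containing it. A stdlib sketch with invented documents — real engines add tokenization, ranking (e.g. BM25), and on-disk segment files on top of this idea:

```python
from collections import defaultdict

# Toy corpus: document id -> text.
docs = {0: "distributed data retrieval", 1: "data ingest and parsing", 2: "binary formats"}

# Build the inverted index: term -> set of doc ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

# A single-term query is now a set lookup instead of a corpus scan.
print(sorted(index["data"]))  # [0, 1]
```

Multi-term queries become set intersections (AND) or unions (OR) over these posting sets, which is why lookups stay fast at scale.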
from an accredited college in a related discipline YEARS OF EXPERIENCE: 8 Years Minimum SKILLS/CERTIFICATIONS: Experience with Atlassian software products (JIRA, Confluence, Service Desk, etc.) Maintaining the Apache Hadoop ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS … SQS, SNS, and Systems Manager. Vue.js, ASP.NET (C#), Node.js, React, JavaScript, HTML, CSS, PostgreSQL, Liquibase, Elasticsearch, and Git. Ansible Apache Niagarafiles (NiFi) Apache Tika Databricks and Lakehouse architecture Elasticsearch, Splunk AWS SQS Ability to deliver an advanced visual analytic application to include developing data analytics for desktop and web-developed visual analytic software; facilitating the bulk analysis of …
SonarQube, Cypress, PowerShell, C#, and Databricks Experience with Docker, SQL, Angular, Spring Boot, NiFi, AWS, Python, Scala, shell scripting, and XML processing Experience in AWS solution architecture Maintaining the Apache Hadoop ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS … modeling, and advanced analytics Databricks and Lakehouse architectures AWS OpenSearch Experience with AWS, Splunk, Databricks, and other Oracle/SQL-based platforms. Experience with Python, Microsoft VBA, and Databricks Apache Niagarafiles (NiFi) Apache Tika Databricks and Lakehouse architecture Elasticsearch AWS SQS Informatica and custom software components Elasticsearch and OpenSearch .NET, C#, JavaScript, Java, and Python Terraform Experience …
AWS OpenSearch Experience with AWS, Splunk, Databricks, and other Oracle/SQL-based platforms. Experience with Python, Microsoft VBA, and Databricks Databricks Alteryx Designer Cloud (previously Trifacta) Microsoft Power BI Apache Niagarafiles (NiFi) Apache Tika Databricks and Lakehouse architecture Elasticsearch AWS SQS Elasticsearch and OpenSearch .NET, C#, JavaScript, and Java Terraform Experience developing and maintaining components in AWS … and other source control management systems Software development lifecycle (SDLC) methodologies Unit testing and test-driven development Frontend frameworks (Angular, React, Svelte) Data streaming and integration technologies such as Apache NiFi Infrastructure as Code (Terraform) GraphQL Microservices architecture Experience as a scrum participant and software release processes Available to work after hours when mission requires Communicate work using SMART …