Gerrards Cross, Buckinghamshire, England, United Kingdom
Ikhoi Recruitment
hands-on and enthusiastic person who is quick to learn. You'll have at least 2 years of 2nd line support experience; experience working with service desk ticketing tools (Jira); Apache or similar server experience; the ability to work effectively with a high degree of autonomy; and excellent interpersonal and communication skills, and will enjoy working in a fast-paced environment. You will be working …
the AH-64E IETM, AH-64E Operator's Manual, AH-64E Maintenance Test Flight Manual, and AH-64E Check-List Manual. Required Skills: • Successfully completed United States Army AH-64E Apache Longbow helicopter Aircraft Qualification Course (AQC) or a Contractor equivalent training and qualification approved by the Government Flight Representative (GFR) • Possess a minimum of 1,000 flight hours as …
Test Flight Manual, and AH-64E Check-List Manual. Serve as Pilot in Command (PIC). Basic Qualifications (Required Skills/Experience): Successfully completed United States Army AH-64E Apache Longbow helicopter Aircraft Qualification Course (AQC) or a Contractor equivalent training and qualification approved by the Government Flight Representative (GFR) Successfully completed the AH-64 Aviation Maintenance Officers Course …
Dell machines/Windows 10 workstations - Ideally has an understanding of ACAS scans and has applied STIGs. Nice to Haves: - Knowledge of corporate services including DNS, SMTP, RHEV, Splunk, Apache - Demonstrated experience managing the installation and maintenance of IT infrastructure - Hardware experience with Dell systems is a plus - Experience working in an environment with rapidly changing job priorities - ServiceNow …
Washington, Washington DC, United States Hybrid / WFH Options
Gridiron IT Solutions
properties, traversals, and relationship patterns essential for effective graph database design Graph Query Language Proficiency: Demonstrated expertise in one or more graph query languages such as Cypher (Neo4j), Gremlin (Apache TinkerPop), SPARQL, or GraphQL for complex data retrieval and manipulation Advanced Graph Data Modeling: Experience translating business requirements into optimized graph schemas, including property graphs, RDF triples, or knowledge …
to see: Open Source Contributions Knowledge of DevOps, Continuous Integration, Automated Testing, and Deployment tools SRE Mindset (debugging, performance, low-latency) Experience working in a Linux environment Experience with Apache Solr or Apache Spark Experience with system design, especially systems that process data at scale Salary Range = $160,000 - $240,000 USD annually + Benefits + Bonus The referenced salary …
of high-velocity bandwidth, flash-speed disk, high-density multi-CPU architectures, and extreme memory footprints. You will be working with the latest technologies including Java, Kafka, Kubernetes, Docker, Apache Accumulo, Apache Spark, Spring, Apache NiFi and more! We have multiple opportunities to build systems with capabilities that include machine learning, processing-intensive analytics, and novel algorithm development …
Title: GCP Data Engineer Location: Philadelphia, PA (candidates willing to relocate may be submitted) GCP Data Engineer - GCP Dataflow and Apache Beam (key skills) Primary Skills: PySpark, Spark, Python, Big Data, GCP, Apache Beam, Dataflow, Airflow, Kafka and BigQuery; GFO, Google Analytics; JavaScript is a must Strong experience with Dataflow and BigQuery The candidate should have experience leading a team … Platforms (preferably GCP) provided Big Data technologies Hands-on experience with real-time streaming processing as well as high-volume batch processing, and skilled in advanced SQL, GCP BigQuery, Apache Kafka, data lakes, etc. Hands-on experience in Big Data technologies - Hadoop, Hive, and Spark, and an enterprise-scale Customer Data Platform (CDP) Experience in at least one programming …
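The Dataflow/Beam stack named in this listing centers on pipelines of parallel transforms. As a rough, framework-free sketch of the pattern a Beam job expresses (FlatMap, then Map, then group-and-combine per key) — plain Python for illustration, not the actual Apache Beam API:

```python
from collections import defaultdict

def run_pipeline(lines):
    """Mimic a Beam-style word-count pipeline: FlatMap -> Map -> GroupByKey -> CombinePerKey(sum).

    Plain-Python illustration of the dataflow model; a real Beam job would
    declare these stages as PTransforms and run them on Dataflow workers.
    """
    # FlatMap: split each input line into individual words
    words = (w for line in lines for w in line.split())
    # Map: pair each word with a count of 1
    pairs = ((w, 1) for w in words)
    # GroupByKey + CombinePerKey(sum): aggregate counts per word
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

print(run_pipeline(["to be or not to be"]))
```

The value of the Beam model is that each stage above is independent and parallelizable, which is what lets Dataflow scale the same logic across batch and streaming inputs.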
systems, with a focus on data quality and reliability. Design and manage data storage solutions, including databases, warehouses, and lakes. Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and … ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical and non-technical teams. … Additional Strengths Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent company culture founded on Integrity, Innovation and …
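The orchestration requirement above (tools like Apache Airflow) boils down to running tasks in dependency order over a directed acyclic graph. A minimal stdlib sketch of that idea, using hypothetical extract/transform/load task names — real Airflow declares this as a DAG of operators rather than plain callables:

```python
from graphlib import TopologicalSorter

def run_dag(tasks, deps):
    """Run tasks in dependency order, as an orchestrator would.

    tasks: mapping of task name -> zero-arg callable
    deps:  mapping of task name -> set of upstream task names
    Returns the execution order actually used.
    """
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()  # upstream tasks are guaranteed to have run already
    return order

log = []
order = run_dag(
    tasks={"extract": lambda: log.append("E"),
           "transform": lambda: log.append("T"),
           "load": lambda: log.append("L")},
    deps={"transform": {"extract"}, "load": {"transform"}, "extract": set()},
)
print(order)  # dependency-respecting order: extract, transform, load
```

An orchestrator adds scheduling, retries, and backfills on top of this core ordering guarantee; the DAG structure itself is the same.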
The role also involves optimizing database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines using Python, Bash, and Terraform. Preferred candidates will have experience with Apache Spark, Apache NiFi, data governance, and ETL standardization. Familiarity with Glue, Hive, and Iceberg or similar technologies is a plus. Tasks Performed: • Bridge communication between technical staff and … data between systems, and optimize queries. • Plan and execute large-scale data migrations. • Improve database performance through architecture and tuning. • Create and maintain data flows using ETL tools like Apache NiFi. • Manage infrastructure as code using Python, Bash, and Terraform. • Integrate security into development and deployment workflows. • Build and support automated CI/CD pipelines. Education, Experience and Qualifications … experience with Data Quality and Data Governance concepts and experience. (Preferred) • Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. (Preferred) • Demonstrated experience with Apache Spark. (Preferred) Other Job Requirements: • Active Top Secret/SCI w/Full Scope Polygraph. • U.S. Citizenship and …
implementation/installation and other applicable technologies. 18. c. Cloud and Distributed Computing Information Retrieval: at least one or a combination of several of the following areas - HDFS, HBase, Apache Lucene, Apache Solr, MongoDB 19. d. Ingesting, Parsing and Analysis of Disparate Data-sources and formats: XML, JSON, CSV, Binary Formats, Sequence or Map Files, Avro and related …
from an accredited college in a related discipline YEARS OF EXPERIENCE: 8 Years Minimum SKILLS/CERTIFICATIONS: Experience with Atlassian Software products (JIRA, Confluence, Service Desk, etc.) Maintaining the Apache Hadoop Ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS … SQS, SNS, and Systems Manager. Vue.js, ASP.NET (C#), Node.js, React, JavaScript, HTML, CSS, PostgreSQL, Liquibase, Elasticsearch, and Git. Ansible Apache Niagara Files (NiFi) Apache TIKA Databricks and Lakehouse architecture ElasticSearch, Splunk AWS SQS Ability to deliver an advanced visual analytic application to include developing data analytics for desktop and web-developed visual analytic software; facilitating the bulk analysis of …
SonarQube, Cypress, PowerShell, C#, and Databricks Experience with Docker, SQL, Angular, Spring Boot, NiFi, AWS, Python, Scala, shell scripting, and XML processing Experience in AWS solution architecture Maintaining the Apache Hadoop Ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS … modeling, and advanced analytics Databricks and Lakehouse architectures AWS OpenSearch Experience with AWS, Splunk, Databricks, and other Oracle/SQL based platforms. Experience with Python, Microsoft VBA, and Databricks Apache Niagara Files (NiFi) Apache TIKA Databricks and Lakehouse architecture ElasticSearch AWS SQS Informatica and custom software components ElasticSearch and OpenSearch .NET, C#, JavaScript, Java, and Python Terraform Experience …
AWS OpenSearch Experience with AWS, Splunk, Databricks, and other Oracle/SQL based platforms. Experience with Python, Microsoft VBA, and Databricks Databricks Alteryx Designer Cloud (previously Trifacta) Microsoft PowerBI Apache Niagara Files (NiFi) Apache TIKA Databricks and Lakehouse architecture ElasticSearch AWS SQS ElasticSearch and OpenSearch .NET, C#, JavaScript, and Java Terraform Experience developing and maintaining components in AWS … and other source control management systems Software development lifecycle (SDLC) methodologies Unit testing and test-driven development Frontend frameworks (Angular, React, Svelte) Data streaming and integration technologies such as Apache NiFi Infrastructure as Code (Terraform) GraphQL Microservices architecture Experience as a scrum participant and software release processes Available to work after hours when mission requires Communicate work using SMART …
ships, aircraft). Required Experience: Active Secret clearance. 4-7 years in data engineering, preferably within secure or classified environments. Proficiency in Python, Spark, SQL, and orchestration tools like Apache Airflow. Hands-on experience with data serialization formats such as protobuf, Arrow, FlatBuffers, or Cap'n Proto. Familiarity with data storage formats like Parquet or Avro. Experience with modern … analytic storage technologies such as Apache Iceberg or DuckDB. Binary message parsing experience. Strong understanding of classified data handling, secure networking, and compliance in high-side or air-gapped environments. Preferred Experience: Familiarity with IC standards (UDS, IC ITE) and secure cloud environments (e.g., AWS GovCloud, C2S). Experience deploying LLMs or machine learning models within classified network environments …
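The binary message parsing skill this listing asks for typically means unpacking fixed-layout wire formats into typed fields. A small sketch with the stdlib `struct` module — the message layout here (2-byte id, 4-byte timestamp, 4-byte float payload) is entirely hypothetical, not any real platform's format:

```python
import struct

# Hypothetical fixed-width message layout, big-endian:
#   H = 2-byte unsigned message id
#   I = 4-byte unsigned timestamp
#   f = 4-byte IEEE-754 float payload
MSG_FORMAT = ">HIf"  # struct.calcsize gives 10 bytes, no padding with ">"

def parse_message(buf: bytes) -> dict:
    """Unpack one binary message into named fields (layout is illustrative)."""
    msg_id, timestamp, value = struct.unpack(MSG_FORMAT, buf)
    return {"id": msg_id, "ts": timestamp, "value": value}

# Round-trip: pack a message, then parse it back
raw = struct.pack(MSG_FORMAT, 7, 1700000000, 3.5)
print(parse_message(raw))
```

For variable-length or nested messages, schema-driven formats like protobuf or FlatBuffers (also named in the listing) replace hand-written unpack code with generated parsers.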
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
NSD
hybrid working when possible Must hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … maintain optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full time on site in Gloucester when required. Required technical experience in the following: Apache Kafka Apache NiFi SQL and NoSQL databases (e.g. MongoDB) ETL processing languages such as Groovy, Python or Java To be Considered: Please either apply by clicking online or … hearing from you. KEY SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE
location – full time on site when required Must hold active Enhanced DV Clearance (West) Circa £600 p/d inside IR35 Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … maintain optimal operation. The Data Engineer Should Have: Active eDV clearance (West) Willingness to work full-time on-site in Gloucester when required. Required technical experience in the following: Apache Kafka Apache NiFi SQL and NoSQL databases (e.g. MongoDB) ETL processing languages such as Groovy, Python or Java To be Considered: Please either apply by clicking online or … hearing from you. KEY SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE
and Agile Release Train (ART) activities Required Technical Skills Expert-level proficiency with: • Python for data processing and ETL workflows • SQL (MySQL, PostgreSQL, Microsoft SQL) • Elasticsearch • Data pipeline technologies (Apache Kafka, Apache NiFi, Cribl) • Splunk for data analysis and visualization • Containerization technologies (Docker, Kubernetes) • Infrastructure as Code (Terraform) • Cloud platforms (AWS GovCloud, SC2S, C2S) • Windows and Linux operating …
Tools: Proficiency with Maven and GitLab. • Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. • Configuration Files: Experience using YAML files for data model and schema configuration. • Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows. • AWS S3: bucket administration. • IDE: VSCode, IntelliJ/PyCharm, or other suitable IDE. Technical Expertise: • ETL creation and … experience in cyber/network security operations. • Familiarity with Agile environments. • Good communication skills. • Developed documentation and training in areas of expertise. • Amazon S3, SQS/SNS admin experience • Apache Airflow workloads via UI or CLI a plus • Experience with Mage AI a plus • Kubernetes, Docker Benefits $120,000-$220,000 salary per year, depending on experience. 11 Federal Holidays …