Washington, Washington DC, United States Hybrid / WFH Options
Gridiron IT Solutions
with collaboration tools such as Jira and Confluence. Preferred Qualifications: working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, and NiFi; working knowledge of public keys and digital certificates; experience with automated testing patterns and tools such as Mocha/Chai, JUnit, NUnit, and TestNG; experience with DevOps environments; expertise in …
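The automated testing patterns named above (JUnit, NUnit, TestNG) all follow the same arrange-act-assert shape; a minimal sketch in Python's `unittest` idiom, using a hypothetical certificate-subject parser as the unit under test (the parser and its field names are illustrative assumptions, not part of the listing):

```python
import unittest

def parse_certificate_subject(subject: str) -> dict:
    """Parse an X.509-style subject string like 'CN=host,O=Org' into a dict.
    Illustrative helper only, not a real certificate library."""
    return dict(part.split("=", 1) for part in subject.split(",") if "=" in part)

class ParseCertificateSubjectTest(unittest.TestCase):
    def test_common_name_extracted(self):
        # Arrange / act / assert, exactly as in JUnit or TestNG
        subject = "CN=example.org,O=Example,C=US"
        parsed = parse_certificate_subject(subject)
        self.assertEqual(parsed["CN"], "example.org")

    def test_empty_subject_yields_empty_dict(self):
        self.assertEqual(parse_certificate_subject(""), {})

if __name__ == "__main__":
    # exit=False so the script can continue after the test run
    unittest.main(argv=["prog"], exit=False)
```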
our data systems, all while mentoring your team and shaping best practices.

Role: Lead Technical Execution & Delivery
- Design, build, and optimise data pipelines and data infrastructure using Snowflake, Hadoop, Apache NiFi, Spark, Python, and other technologies.
- Break down business requirements into technical solutions and delivery plans.
- Lead technical decisions, ensuring alignment with data architecture and performance best practices.
… data pipeline efficiency.

All About You: Technical & Engineering Skills
- Extensive demonstrable experience in data engineering, with expertise in building scalable data pipelines and infrastructure.
- Deep understanding of Snowflake, Hadoop, Apache NiFi, Spark, Python, and other data technologies.
- Strong experience with ETL/ELT processes and data transformation.
- Proficiency in SQL, NoSQL, and data modeling.
- Familiarity with cloud data …
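The ETL/ELT split the listing above asks for has a simple canonical shape: pull raw records, normalise them, write them to a store. A stdlib-only Python sketch, using SQLite as a stand-in warehouse (the `payments` schema and field names are invented for illustration):

```python
import sqlite3

def extract(rows):
    """Extract: yield raw records (an in-memory stand-in for a source system)."""
    yield from rows

def transform(record):
    """Transform: normalise field names and types before loading."""
    return (record["id"], record["name"].strip().title(), float(record["amount"]))

def load(conn, records):
    """Load: write transformed rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", records)
    conn.commit()

raw = [{"id": 1, "name": "  alice ", "amount": "9.50"},
       {"id": 2, "name": "BOB", "amount": "12"}]
conn = sqlite3.connect(":memory:")
load(conn, (transform(r) for r in extract(raw)))
```

Passing a generator to `load` keeps the pipeline streaming: no transformed row is materialised before the database needs it, which is the same principle Spark and NiFi apply at scale.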
Gloucester, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
hybrid working when possible. Must hold active Enhanced DV Clearance (West). Competitive salary DOE - 6% bonus, 25 days holiday, clearance bonus. Experience in data pipelines, ETL processing, data integration, Apache, SQL/NoSQL. Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … complex challenges, utilising distributed computing techniques to handle large-scale, real-time, and unstructured data. Responsibilities include: designing and developing data pipelines, including ingestion, orchestration, and ETL processing (e.g., NiFi); ensuring data consistency, quality, and security across all processes; creating and maintaining database schemas and data models; integrating and enriching data from diverse sources, maintaining data integrity; maintaining … hearing from you. KEY SKILLS: DATA ENGINEER / DATA ENGINEERING / DEFENCE / NATIONAL SECURITY / DATA STRATEGY / DATA PIPELINES / DATA GOVERNANCE / SQL / NOSQL / APACHE / NIFI / KAFKA / ETL / GLOUCESTER / DV / SECURITY CLEARED / DV CLEARANCE. Seniority level: Not Applicable. Employment type: Full-time …
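The ingest, validate, enrich, and route responsibilities described in this listing mirror a typical NiFi flow. A stdlib-only Python sketch of that pattern; the record fields and the country reference lookup are illustrative assumptions, not anything from the listing:

```python
# NiFi-style flow: ingest -> validate -> enrich -> route to success/failure.
REFERENCE = {"GB": "United Kingdom", "US": "United States"}

def validate(record):
    """Quality gate: reject records with a bad id or an unknown country code."""
    return isinstance(record.get("id"), int) and record.get("country") in REFERENCE

def enrich(record):
    """Enrichment: join each record against reference data."""
    return {**record, "country_name": REFERENCE[record["country"]]}

def run_flow(records):
    """Route each record: enriched output on success, quarantine on failure."""
    valid, failed = [], []
    for record in records:
        if validate(record):
            valid.append(enrich(record))
        else:
            failed.append(record)
    return valid, failed
```

Routing failures to a separate queue, rather than dropping them, is what preserves the "data consistency, quality, and security" the listing asks for: nothing silently disappears.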
standards. Develop and deliver documentation for each project, including ETL mappings, code use guide, code location, and access instructions. Design and optimize data pipelines using tools such as Spark, Apache Iceberg, Trino, OpenSearch, EMR cloud services, NiFi, and Kubernetes containers. Ensure the pedigree and provenance of the data is maintained such that access to data is protected … tools and services. Experience with cloud. Experience with Python, SQL, Spark, and other data engineering programming. Experience with COTS and open-source data engineering tools such as Elasticsearch and NiFi. Experience with processing data within the Agile lifecycle. Security clearance: active TS/SCI with polygraph clearance. As required by local law, Accenture Federal Services provides reasonable ranges of …
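One common way to keep the pedigree and provenance of data intact across pipeline hops, as the listing above requires, is to carry lineage metadata and a content hash with each record. A hedged stdlib-only sketch (the wrapper shape and field names are assumptions for illustration):

```python
import hashlib
import json
import time

def with_provenance(record: dict, source: str) -> dict:
    """Wrap a record with lineage metadata so pedigree survives each hop."""
    payload = json.dumps(record, sort_keys=True)  # canonical form for hashing
    return {
        "data": record,
        "provenance": {
            "source": source,
            "sha256": hashlib.sha256(payload.encode()).hexdigest(),
            "ingested_at": time.time(),
        },
    }

def verify_provenance(wrapped: dict) -> bool:
    """Recompute the hash; any mutation of the data since ingest fails the check."""
    payload = json.dumps(wrapped["data"], sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest() == wrapped["provenance"]["sha256"]
```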
London, England, United Kingdom Hybrid / WFH Options
Qh4 Consulting
Familiarity with legacy systems (e.g. C#) and willingness to interact with them where necessary. Exposure to Cloudera Data Platform or similar big data environments. Experience with tools such as Apache Hive, NiFi, Airflow, Azure Blob Storage, and RabbitMQ. Background in investment management or broader financial services, or a strong willingness to learn the domain. The Role Offers the …
NoSQL databases such as MongoDB, Elasticsearch, MapReduce, and HBase. Demonstrated experience maintaining, upgrading, troubleshooting, and managing software, hardware, and networks (specifically the hardware and networks piece). Demonstrated experience with Apache NiFi. Demonstrated experience with Extract, Transform, and Load (ETL) processes.
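MapReduce, listed above, is a two-phase model: a map phase emits key-value pairs, and a reduce phase groups by key and aggregates. The classic word-count example, as a stdlib-only Python sketch:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in one document."""
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    """Shuffle/reduce: group pairs by key and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def word_count(documents):
    """Run the full job over a collection of documents."""
    return reduce_phase(chain.from_iterable(map_phase(d) for d in documents))
```

In Hadoop the map calls run in parallel across the cluster and the framework performs the shuffle; the per-function logic is the same as here.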
Interface with customers to gather requirements, define data specifications, and provide solutions, including converting raw data into usable formats. Work with technical writers and testers.
Minimum Qualifications:
• Familiar with Apache NiFi
• Educational & Certification Requirements: Bachelor's degree with 4 years of related experience (or 4 years of experience in lieu of degree). Requires IAT Level II certification.
• Technical Expertise: Demonstrated knowledge of JavaScript frameworks, API development, Agile/DevSecOps practices, containerization (Docker, Podman, Kubernetes), Linux system administration, and scripting (Python, Groovy, Java). Experience with NiFi data flow and troubleshooting.
• Data & Intelligence Experience: Experience with the DataOps lifecycle (ingestion, ETL) and SIGINT/All Source Intelligence reporting standards. Ability to work with various data formats (XML, JSON …
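Converting raw data in mixed formats (XML, JSON) into one usable shape, as this listing describes, usually means normalising each payload into a common record. A stdlib-only sketch; the flat-XML assumption and the sample field names are illustrative:

```python
import json
import xml.etree.ElementTree as ET

def normalise(payload: str) -> dict:
    """Accept a JSON object or a flat XML document and return one dict.
    Assumes XML with one level of child elements; nested documents
    would need a recursive walk."""
    payload = payload.strip()
    if payload.startswith("{"):
        return json.loads(payload)
    root = ET.fromstring(payload)
    return {child.tag: child.text for child in root}
```

This is the kind of per-record conversion a NiFi processor would perform on each FlowFile before routing it downstream.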
teams
• Mentor junior developers
Requirements:
• British-born sole UK National with active SC or DV Clearance
• Strong Java skills, familiarity with Python
• Experience in Linux, Git, CI/CD, Apache NiFi
• Knowledge of Oracle, MongoDB, React, Elasticsearch
• Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda)
• Active DV Clearance
If you do not meet all requirements, still feel free to apply. …
into epics and stories. Writing clean, secure code following a test-driven approach. Monitor and maintain - monitor data systems for performance issues and make any necessary updates.
Required Skills:
- Apache Kafka
- Apache NiFi
- SQL and NoSQL databases (e.g. MongoDB)
- ETL processing languages such as Groovy, Python, or Java
Built on over 60 years of heritage, Roke offers …
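Kafka's core abstraction, behind the skills list above, is an append-only log per topic with an independent read offset per consumer group. A toy stdlib-only model of those semantics (this is an illustrative in-memory sketch, not the Kafka client API):

```python
from collections import defaultdict

class MiniLog:
    """Toy append-only topic log with per-group consumer offsets.
    Kafka-like semantics only; real Kafka adds partitions, replication,
    and durable storage."""

    def __init__(self):
        self.topics = defaultdict(list)   # topic -> list of messages
        self.offsets = defaultdict(int)   # (topic, group) -> next index to read

    def produce(self, topic, message):
        """Append a message to the end of the topic log."""
        self.topics[topic].append(message)

    def consume(self, topic, group):
        """Return all messages this group has not yet seen, advancing its offset."""
        key = (topic, group)
        log = self.topics[topic]
        start = self.offsets[key]
        self.offsets[key] = len(log)
        return log[start:]
```

Because each group tracks its own offset, an ETL consumer and an audit consumer can read the same topic independently without interfering, which is the property that makes the log a good integration point between pipeline stages.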
in Agile (SCRUM) teams. Requirements: British-born sole UK National with active SC or DV Clearance; strong Java skills, familiarity with Python; experience in Linux, Git, CI/CD, Apache NiFi; knowledge of Oracle, MongoDB, React, Elasticsearch; familiarity with AWS (EC2, EKS, Fargate, S3, Lambda). If you do not meet all requirements, still feel free to apply. Benefits …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
tools like JUnit, Git, Jira, MongoDB, and React. Familiarity with cloud platforms (especially AWS), microservices, and containerisation. DV clearance (or eligibility to obtain it).
Nice to Have: Experience with Apache NiFi, JSF, Hibernate, Elasticsearch, Kibana, or AWS services like EC2, Lambda, EKS; CI/CD pipeline expertise using GitLab; knowledge of secure, scalable architectures for cloud deployments. O.K. …
/Python. Experience performing system integration tasks, including installation, configuration, and sustainment of various COTS/GOTS/FOSS software, packages, and libraries in a Unix environment. Experience using Apache NiFi to process and distribute data. Experience with Corporate data flow processes and tools. Experience with NoSQL databases including Elasticsearch and MongoDB. Experience with containerization technologies such as …
London, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
set high standards.
What You'll Bring: Strong experience in AWS cloud platform architecture and solving complex business issues. Proficiency in Java programming and Linux, with desirable knowledge of Apache NiFi, Node.js, JSON/XML, Jenkins, Maven, Bitbucket, or JIRA. Hands-on experience with scripting (Shell, Bash, Python) and a solid understanding of Linux, networking, routing, and firewalls. …
AWS/C2S. Experience with one or more of the following is desired, but not required: Kubernetes, Docker Swarm, ELK stack, Python, shell scripting, Java, Spring, Kafka, Pega, Nexus, NiFi, Ansible, Terraform.
• Groovy, Python, and/or shell scripting
• JavaScript development experience with Angular, React, ExtJS, and/or Node.js
• Experience with distributed computing technologies including Hadoop, HBase, Cassandra, Elasticsearch, and Apache Spark a plus
• Hands-on experience working with Elasticsearch, MongoDB, Node, Hadoop, MapReduce, Spark, RabbitMQ, and NiFi
• DevOps experience building and deploying cloud infrastructure with …
enterprise Continuous Integration/Continuous Deployment processes and best practices. Codify software development best practices across the enterprise. Develop automated testing frameworks within DevOps processes. Basic Qualifications: Experience with NiFi, Spark/MapReduce, Python, Java, and Elasticsearch. Experience coding within the last 3 years. Candidate must have a BS with 4-8 years of prior relevant experience or an …
Data Engineering. Extensive familiarity with AWS services including CloudFormation, EC2, S3, and RDS. Tools: CloudFormation, Ansible, Git, Jenkins, Bash. Programming languages: Python, Java, or Scala. Processing tools: Elasticsearch, Spark, NiFi, and/or Docker. Datastore types: graph, NoSQL, and/or relational. US Citizenship and an active TS/SCI with Polygraph security clearance required. Desired Qualifications: Familiarity with …