Fairfax, Virginia, United States Hybrid / WFH Options
CGI
Collect and transform structured, unstructured, relational, and NoSQL data using ETL and ELT tools, as well as develop custom code using programming languages. Understand and use distributed methods (e.g., MapReduce) that scale to multi-terabyte data collections. Interpret and evaluate the accuracy of results through iterative, agile methods. Apply data discovery and data visualization tools (e.g., Tableau, Trifacta) to …
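The distributed methods this listing names (e.g., MapReduce) follow a map → shuffle → reduce pattern. Below is a minimal single-process sketch of that pattern as a word count, purely for illustration; the function names are hypothetical and not part of any framework mentioned in the listing.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (key, value) pair for every word.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all values by key (done by the framework in real MapReduce).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values independently.
    return {key: sum(values) for key, values in groups.items()}

def word_count(lines):
    return reduce_phase(shuffle(map_phase(lines)))

counts = word_count(["the quick brown fox", "the lazy dog"])
# counts["the"] == 2
```

At multi-terabyte scale the same three phases run in parallel across many nodes, which is what makes the pattern attractive for the data volumes described above.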
Data Engineer - Global Accounts, Professional Services, AWSI-SDT-APJ-Japan Job ID: Amazon Web Services Japan GK Our AWS Professional Services consultants deliver IT infrastructure and application architecture guidance, lead proof-of-concept projects, perform enterprise portfolio assessments, review operational …
Tableau • Experience supporting the development of AI/ML algorithms, such as natural language processing in a production environment • Experience configuring and utilizing data management tools, such as Hadoop, MapReduce, or similar. • Ability to translate complex, technical findings into an easily understood summary in graphical, verbal, or written forms • Must have an active TS/SCI with Favorable Polygraph to …
experience in data engineering or related work. -Proficiency in Java, AWS, Python, Apache Spark, Linux, Git, Maven, and Docker. -Experience maintaining an Apache Hadoop Ecosystem using tools like HBase, MapReduce, and Spark. -Knowledge of ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. -Experience with AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS, SQS, SNS, and Systems …
analysts, and non-technical managers and personnel. (Mandatory) Demonstrated experience with AWS cloud services, including long-term storage options, and cloud-based database services such as Databricks or Elastic MapReduce (EMR). (Mandatory) Demonstrated experience with SQL database structures and mapping between SQL databases. (Mandatory) Demonstrated experience in large-scale data migration efforts. (Mandatory) Demonstrated experience with database architecture, performance …
and non-technical managers and personnel. 2. (Mandatory) Demonstrated experience with AWS cloud services, including long-term storage options, and cloud-based database services such as Databricks or Elastic MapReduce (EMR). 3. (Mandatory) Demonstrated experience with SQL database structures and mapping between SQL databases. 4. (Mandatory) Demonstrated experience in large-scale data migration efforts. 5. (Mandatory) Demonstrated experience with …
updating workflow orchestration for the data ingest pipeline to perform API service development and updates. Shall use the following technologies: Relational Data Stores (e.g., Oracle 21c), NiFi, Kafka, Elastic MapReduce (EMR), HBase, Elastic, Splunk, Java, Python, and Spring to instrument and update the Data Catalog for data metrics, using Splunk and MySQL. REQUIRED QUALIFICATIONS Requires an active Top Secret/…
Swagger, Git, Subversion, Maven, Jenkins, Gradle, Nexus, Eclipse, IntelliJ, Ext JS, jQuery, and D3. Cloud technologies: Pig, Hive, Apache Spark, Azure Databricks, Storm, HBase, Hadoop Distributed File System, and MapReduce. Open-source virtual machines and Cloud-based. This position is contingent on funding and may not be filled immediately. However, this position is representative of positions within CACI that are …
Systems Engineer - STAR 1912.01.01 Country Intelligence Group is seeking a Full-Time Systems Engineer to support our client in modernizing and integrating large-scale, cloud-based data systems. The selected candidate will serve as a technical liaison among system engineers More ❯
Job ID: Amazon Web Services Australia Pty Ltd Are you a Senior Data Analytics and GenAI consulting specialist? Do you have real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake, Data Engineering and GenAI experience? Do …
application software, and system management tools. Demonstrated experience monitoring system performance and troubleshooting. Desired Skills: Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience with Apache NiFi. Demonstrated experience with the Extract, Transform …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
Experienced with Matillion and modern data visualisation tools (QuickSight, Tableau, Looker, etc.). Strong scripting and Linux/cloud environment familiarity. Desirable: Exposure to big data tools (Spark, Hadoop, MapReduce). Experience with microservice-based data APIs. AWS certifications (Solutions Architect or Big Data Specialty). Knowledge of machine learning or advanced analytics. Interested? This is a great opportunity to …
Jenkins Continuous Integration/Continuous Delivery (CI/CD) pipelines with automated testing and deployment. Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience maintaining, upgrading, troubleshooting, and managing software, hardware and networks (specifically the hardware networks piece). Demonstrated experience with Apache NiFi. Demonstrated experience with the Extract, Transform …
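Several of these listings cite the Extract, Transform, Load (ETL) process. A minimal sketch of the three stages, using only the Python standard library; the data, table name, and helper functions are hypothetical, chosen only to illustrate the pattern:

```python
import csv
import io
import sqlite3

raw = "name,score\nAlice, 90 \nbob,85\n"  # illustrative raw input

def extract(text):
    # Extract: parse the raw CSV text into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: normalize names and cast scores to integers.
    return [(r["name"].strip().title(), int(r["score"])) for r in rows]

def load(rows, conn):
    # Load: insert the cleaned rows into the target store.
    conn.execute("CREATE TABLE IF NOT EXISTS scores (name TEXT, score INTEGER)")
    conn.executemany("INSERT INTO scores VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM scores").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
# loaded == 2
```

Production pipelines replace each stage with the tools named in these listings (e.g., NiFi or Airflow for orchestration, Spark for transformation), but the extract/transform/load separation is the same.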
Cyber Security Domain Familiarity with the RMF process Experience with Relational Database Management System (RDMS) Experience with Apache Hadoop and the Hadoop Distributed File System Experience with Amazon Elastic MapReduce (EMR) and SageMaker Experience with Machine Learning or Artificial Intelligence Travel Security Clearance Top Secret/SCI/CI Poly
/SCI with Polygraph security clearance required Desired Qualifications: Familiarity with AWS CDK, Terraform, Packer Design Concepts: REST APIs Programming Languages: JavaScript/NodeJS Processing Tools: Presto/Trino, MapReduce, Hive The Swift Group and Subsidiaries are an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual …
or search, GPU workloads, and distributed storage, including Cloudera • Experience in the development of algorithms leveraging R, Python, SQL, or NoSQL • Experience with Distributed data or computing tools, including MapReduce, Hadoop, Hive, EMR, Spark, Gurobi, or MySQL • Experience with visualization packages, including Plotly, Seaborn, or ggplot2 About Blue Sky Blue Sky Innovative Solutions (Blue Sky) assists its federal, state and …
university An active TS/SCI with polygraph You could also have this Experience using the Atlassian Tool Suite. Experience with development of any of the following: Hadoop, Pig, MapReduce, or HDFS Working knowledge with other object-oriented programming languages such as Java or C++ Working knowledge with Front-end data visualization libraries (e.g., D3.js, Raphael.js, etc.) Salary Range …
Experience using Python (or equivalent) Experience using ML libraries, such as scikit-learn Experience using data visualization tools Preferred Skills: Experience working with GPUs to develop models Experience with MapReduce programming (Hadoop) Skills with programming languages, such as Java or C/C++ Demonstrated ability to develop experimental and analytic plans for data modeling processes, use of strong baselines, ability …
Company Overview We are a world-class team of professionals who deliver next generation technology and products in robotic and autonomous platforms, ground, soldier, and maritime systems in 50+ locations world-wide. Much of our work contributes to innovative research More ❯
R- Description Leidos has a new and exciting opportunity for a Jr. Java Developer in our National Security Sector's (NSS) Cyber & Analytics Business Area (CABA). Our talented team is at the forefront in Security Engineering, Computer Network Operations More ❯