Fort Lauderdale, Florida, United States Hybrid / WFH Options
Vegatron Systems
PARTY WORK AUTHORIZATION: US Citizen or GC, GC-EAD, H4 & L2 EAD; NO H1B or OPT/CPT ACCEPTED BY CLIENT RATE: $OPEN - DOE JOB TITLE: Data Engineer Big Data Hadoop JOB DESCRIPTION: Candidates will start out working remotely and will eventually sit in Fort Lauderdale, FL. Candidates should be senior Data Engineers with big data tools (Hadoop … with a heavy emphasis on cloud and big data technologies • Healthcare knowledge and experience heavily preferred Knowledge and Skills: • Knowledge of new and emerging data and analytics technologies (e.g. Hadoop, Spark, Elastic, AWS) • Experience working and developing in cloud platforms such as AWS, Azure, or Google Cloud • Experience with big data tools: Hadoop, Spark, Kafka, etc. • Experience … • Past experience executing and delivering solutions in an Agile Scrum development environment is preferred
Analytic exposure is a big plus. Java is a must, but these will strengthen your case: Data analytic development experience Agile development experience Familiarity with/interest in Apache Hadoop MapReduce Python experience AWS Lambda experience Jira experience Confluence experience GitLab experience Exposure or experience with NiFi Willingness/desire to work on high visibility tasking Willingness/ability …
Experience in one or more of the following: X-Midas, C, C++, FORTRAN, Java, MongoDB, Oracle, Red Hat Linux, Apache, Python, HTML, Dynamic HTML, JavaScript, MySQL, Perl, Extensible Markup Language (XML), Hadoop, Java Message Service, Rails, Esper Abilities and Competencies Must hold TS/SCI clearance w/polygraph (U.S. citizenship required for clearance) Self-motivated and eager to work intently …
such as JDK, J2EE, EJB, JDBC, and/or Spring, and experience with RESTful APIs Experience developing and performing ETL tasks in a Linux environment Preferred Qualifications: Experience with Hadoop, HBase, MapReduce Experience with Elasticsearch Experience with NiFi, Kafka, and ZooKeeper Clearance Requirements: An active TS/SCI with Polygraph Physical Requirements: Use hands to operate a computer and …
professional experience -Experience with Java, JavaScript, and Python -Experience with Elasticsearch, Kubernetes, NiFi, and RabbitMQ -Experience with building and deploying cloud infrastructure (CloudFormation, Ansible, Puppet) -Experience building APIs (REST, Hadoop, Spark) or automation (Jenkins, Maven) or monitoring technologies (ELK, Grafana, Prometheus) This role is located in Chantilly, VA and requires a TS/SCI + FS Polygraph.
rapidly respond to alerts that occur. Our core company platform provides the foundation for our projects, with custom applications built on top of it. Oracle, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing. Java and Groovy for our back-end applications and data integration tools. TypeScript and React/Redux for our web technologies.
Seattle, Washington, United States Hybrid / WFH Options
Amazon.com Services LLC
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Our inclusive culture empowers …
/Portals, Workflow, Enterprise Service Bus (ESB), Web Services, Message Brokers, Relational Databases. Exposure to common industry platforms and programming languages - Appian BPM, IBM WebSphere, Mule, LAMP/JBoss, Hadoop, Java, Microsoft/.NET is preferred. Master's degree in Computer Science or other related field Experience working with Appian Sites. Java developer with Maven builds experience is a strong …
Columbia, Maryland, United States Hybrid / WFH Options
Wyetech, LLC
users/customers at multiple military locations. Desired Technical Skills Configuration management tools experience a plus (Puppet, Ansible, Chef, etc.) Experience with Big Data technologies is a huge plus (Hadoop, Kafka, Accumulo, Storm, Hortonworks, Cloudera). Experience with cloud technologies a plus (AWS, Azure) Higher-level clearance is desired, but not required. Can consider up to a TS/…
Reston, Virginia, United States Hybrid / WFH Options
ICF
setting. Experience defining and documenting architecture for technology implementations leveraging different architectural views. Exposure to common industry platforms and programming languages - Appian BPM, IBM WebSphere, Mule, LAMP/JBoss, Hadoop, Java, Microsoft/.NET is preferred. Experience in Enterprise Application Integration (SOA, ESB) and n-tier client-server architectures preferred Experience working with Appian Sites. Java developer with Maven …
machine learning models and statistical algorithms that support predictive analytics and operational decision-making. Build and optimize ETL/ELT data pipelines using big data tools like Spark and Hadoop to process data at scale. Manage large datasets using modern data lake platforms such as Amazon S3, Databricks, and Snowflake. Extract, clean, and transform data from relational and non … delivering business impact through data. Strong proficiency in programming languages such as Python, R, SQL, Scala, or Java. Proven experience with big data frameworks and tools including Apache Spark, Hadoop, PostgreSQL, and SQL Server. Hands-on experience with cloud-based data lake platforms such as Amazon S3, Databricks, and Snowflake. Familiarity with MLflow, Jupyter Notebooks, and deep learning …
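The extract-clean-transform step described above can be sketched at the record level in plain Python; the same per-record logic is what an Apache Spark job would apply across partitions at scale. All field names and sample records here are hypothetical, for illustration only.

```python
# Minimal sketch of the "clean" stage of an ETL pipeline, assuming records
# arrive as dicts from a relational or non-relational source. Field names
# ("id", "amount", "region") are hypothetical.

def clean_records(rows):
    """Drop rows missing a primary key and normalize field types/values."""
    cleaned = []
    for row in rows:
        if not row.get("id"):
            continue  # skip records without a usable primary key
        cleaned.append({
            "id": int(row["id"]),                               # enforce integer keys
            "amount": float(row.get("amount", 0.0)),            # default missing amounts
            "region": (row.get("region") or "unknown").strip().lower(),
        })
    return cleaned

raw = [
    {"id": "1", "amount": "19.99", "region": " East "},
    {"id": None, "amount": "5.00", "region": "West"},  # dropped: no id
    {"id": "2", "region": "WEST"},                     # missing amount -> 0.0
]
print(clean_records(raw))
```

In a Spark job the same normalization would typically be expressed with DataFrame operations (`filter`, `withColumn`) rather than a Python loop, so the engine can distribute the work.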
You will play a key role within a small data science team. The client is looking for hands-on experience developing solutions for complex data science problems using Python, R, Hadoop, and Greenplum (or other Massively Parallel Processing solutions). REQUIRED SKILLS: Bachelor's Degree in a quantitative or technical field of study, such as Statistics, Mathematics, Computer Science, or …
JavaScript framework (e.g. React, Backbone, AngularJS) • Knowledge of core CS concepts such as common data structures and algorithms • Full-stack experience developing in Scala/Python and working with Hadoop and related tools is a plus • Code samples from private GitHub repos, side projects, and open source project contributions are a plus Work Authorization: US Citizen, Green Card …
substituted for a bachelor's degree. A Master's in Computer Science or related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of cloud experience. -Experience with cloud platforms like Amazon Web Services (AWS) and Microsoft Azure -Knowledge in containerization using Docker …
of prior relevant experience or a MS degree and 10+ years of prior relevant experience. Preferred Qualifications: • AWS certifications • Experience working with large scale data technologies (e.g. AWS Aurora, Hadoop, Elasticsearch, etc.) • Experience working within a low to high development/deployment environment Benefits and Perks: • Competitive salary and comprehensive benefits package • Commitment to diversity and inclusion in the …
is clearly related to the position Computer Science (CS) degree or related field Experience with Java, Python, C, and Query Time Analytics (QTA) Customer GHOSTMACHINE analytic development Experience with Hadoop (MapReduce and Accumulo) Experience with Linux Experience with GEOINT Desired Skills: Familiarity with JIRA and Confluence. Understanding of customer analytical tools Compensation Range: $198,094.13 - $223,094.13 …
Hands-on experience working with Elasticsearch, MongoDB, Node, and Docker Hands-on experience with AWS cloud Active TS/SCI clearance with polygraph required Preferred Qualifications Experience with Hadoop, MapReduce, Spark, RabbitMQ, and NiFi a plus Experience with Sequelize and Umzug Familiarity with or ability to learn Git, Jenkins, FontAwesome, and LeftHook Bachelor's and/…
CVS) Ability to build applications from source and troubleshoot compiling issues Experience with compilers such as GNU, Intel, and AOCC Storage installation and tuning experience (ZFS, XFS, GPFS, Lustre, Hadoop, Ceph, object storage) Shell scripting experience (Bash, Perl, Python) Virtualization experience (VMware, Xen, Hyper-V, KVM, etc.) Experience with x86 bootstrap process (BIOS, RAID, Fibre Channel, etc.) Experience with …
of the following: C++, C#, C, Java, Ruby, JEE, HTML5, XML, SQL, Qt, Windows, .NET, Unix, Linux, SOA, RTOS, Real-Time Controls, Wireless, Software Security, Robotics, OOA/OOD, Hadoop, Android, Embedded Systems In compliance with pay transparency requirements, the salary range for this role in Colorado state, Hawaii, Illinois, Maryland, Minnesota, New York state, and Vermont is $24.00 …
of the following: C++, C#, C, Java, Ruby, JEE, HTML5, XML, SQL, Qt, Windows, .NET, Unix, Linux, SOA, RTOS, Real-Time Controls, Wireless, Software Security, Robotics, OOA/OOD, Hadoop, Android, Embedded Systems In compliance with pay transparency requirements, the salary range for this role in California, Massachusetts, New Jersey, Washington, and the Greater D.C., Denver, or NYC areas …