different SQL databases. Demonstrated professional experience working with Apache NiFi. Demonstrated professional experience working with large data and high performance compute clusters such as Hadoop or similar. Demonstrated experience with API development techniques. Demonstrated experience developing and deploying ETL processes for large data sets. Demonstrated experience creating operating system …
Annapolis Junction, Maryland, United States Hybrid / WFH Options
Lockheed Martin
languages such as Java. TS/SCI with Poly Desired Skills: Data Analytic development experience Agile development experience Familiarity with/interest in Apache Hadoop MapReduce Python experience AWS Lambdas experience Jira experience Confluence experience Gitlab experience Exposure or experience with NiFi Willingness/desire to work on high …
and Spark to support custom data transformation processes. Integrate and process data from diverse sources into Azure Data Lake and SQL Server. Knowledge of Hadoop is a plus for handling large-scale data processing and storage needs. Utilize prior Property and Casualty (P&C) insurance domain experience to align …
Analyst - Cyber Planner Software Development & System Engineering - Using programming languages Python/Java and open-source and cloud-based software to include ELK Stack, Hadoop, AWS and Azure. - Enterprise Architect - Software Developer - Video Game Design and Development - Open Source Developer - Mobile Application Developer - Programming & Coding - Cloud Architect - Security and …
for data analysis Qualifications: You hold a master's degree in IT. Minimum of 3 years of experience working with data technologies such as Hadoop, Spark, Hive Strong analytical skills and experience using statistical techniques to interpret data Experience with data visualization and reporting tools such as Tableau and …
Design, develop, test, deploy, and document big data cloud computing workflows TS/SCI clearance with polygraph Experience with Java and Pig Experience with Hadoop (MapReduce, Accumulo, and the Hadoop Distributed File System (HDFS)) Experience with Linux What you need to have Bachelor's Degree and 5 to 8 years of …
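Several of these listings pair Java or Pig with Hadoop MapReduce. As a reference point for what the MapReduce model actually is, the classic word-count job can be sketched in plain Python; the function names below are illustrative, not a Hadoop API:

```python
from collections import defaultdict

def map_phase(lines):
    """Mapper: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle_and_reduce(pairs):
    """Shuffle: group values by key; Reducer: sum the counts per word."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: sum(values) for key, values in grouped.items()}

lines = ["the quick brown fox", "the lazy dog"]
counts = shuffle_and_reduce(map_phase(lines))
print(counts["the"])  # 2
```

In a real Hadoop job the mapper and reducer run as separate distributed tasks over HDFS blocks, and the framework handles the shuffle; this sketch only shows the data flow.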
large scale project management experience 5+ years of continuous integration and continuous delivery (CI/CD) experience 5+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience 5+ years of software development with object-oriented language experience 3+ years of cloud-based solution (AWS or equivalent), system …
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting …
and understanding of, algorithms for classification, regression, clustering, and anomaly detection; knowledge of relational databases, including SQL and large-scale distributed systems (e.g., Hadoop); expertise with statistical data analysis (e.g., linear models, multivariate analysis, stochastic models, sampling methods); demonstrated effectiveness in collecting information and accurately representing …
tools, and validation strategies to support new features Help design/build distributed real-time systems and features Use big data technologies (e.g. Spark, Hadoop, HBase, Cassandra) to build large scale machine learning pipelines Develop new systems on top of real-time streaming technologies (e.g. Kafka, Flink) Minimum Requirements …
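The listing above asks for systems built on streaming technologies such as Kafka and Flink. A core pattern in that work is windowed aggregation over an event stream, sketched here in plain Python rather than a real streaming framework; the function name and event shape are illustrative assumptions:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Assign (timestamp, key) events to fixed-size, non-overlapping
    (tumbling) windows and count occurrences per key, the way a
    Flink or Kafka Streams job would on an unbounded stream."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        windows[ts // window_size][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

events = [(1, "click"), (2, "click"), (3, "view"), (11, "click")]
result = tumbling_window_counts(events, window_size=10)
print(result[0]["click"])  # 2
```

A real streaming engine adds what this sketch omits: out-of-order events, watermarks for closing windows, and fault-tolerant state.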
learning, data mining and statistics. Traffic quality systems process billions of ad impressions and clicks per day by leveraging leading open-source technologies like Hadoop, Spark, Redis and Amazon's cloud services like EC2, S3, EMR, DynamoDB and Redshift. The candidate should have reasonable programming and design skills to …
invent, a track record of thought leadership and contributions that have advanced the field. - 2+ years of experience with large-scale distributed systems such as Hadoop, Spark, etc. - Excellent written and spoken communication skills Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran …
global level Prior experience working in data and/or technology: familiarity with tools such as JavaScript, Python and Python libraries, Structured Query Language (SQL), Hadoop, Snowflake, Qlik, Tableau, Microsoft Excel, artificial intelligence, Collibra, Manta Consultative experience Track record of solving data integration challenges and building data solutions across complex …
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - Master's degree in math/statistics/engineering or other equivalent quantitative discipline, or PhD …
in Linux and/or Cloud environments. Preferred Qualifications Demonstrated experience delivering solutions using Cloud technologies such as AWS, Microsoft Azure, etc. Experience with Hadoop, HBase, MapReduce. Experience with Elasticsearch. Experience working in a mission environment and/or with many different types of data. Company EEO Statement Accessibility …
Java, Python, and/or Ruby Knowledge of virtual networks and general network management functions Cloud database management skills and knowledge of MySQL and Hadoop Technical Responsibilities: Support working and integrating open-source frameworks/products, experience deploying various open-source packages and/or application stacks into the …
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. PhD Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make …
Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace …
validate, deploy) These Qualifications Would Be Nice to Have: SIGINT, Cyber, and/or Computer Network Operations (CNO) background. Additional experience in: JavaScript, Vue.js, Hadoop, GM analytic development, .NET environment, debuggers, development of packet-level programs, Jupyter Notebooks, Jira, Confluence, Gitlab. $85,000 - $250,000 a year The pay …
Fort Belvoir, Virginia, United States Hybrid / WFH Options
HII Mission Technologies
changes as needed. Preferred Requirements Experience with configuration management tools (Puppet, Ansible, Chef, etc.) is a plus Experience with Big Data technologies (Hadoop, Kafka, Accumulo, Storm, Hortonworks, Cloudera) is a strong plus Experience with Cloud technologies (AWS, Azure) is a plus Higher-level clearance is desired but not required. Can consider …
Experience with clustered or cloud storage deployments. Strong cross-team collaboration and user interaction skills. Expertise in secure, compliant solution planning. Knowledge of HDFS, Hadoop, HBase/Accumulo, and Bigtable internals. Join Peraton and play a key role in securing mission-critical systems with cutting-edge solutions. Are you …
experience - 7+ years of software development with object-oriented language experience technical specialist, design and architecture experience - 5+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience with consulting, design and implementation of serverless distributed solutions experience - Current, active US Government Security Clearance of TS/SCI …
tools. • Messaging & Streaming: Exposure to RabbitMQ or similar message/streaming broker technologies. • Advanced Technologies: Interest or experience in big data technologies (e.g., HBase, Hadoop), machine learning frameworks (e.g., Spark), and orbit dynamics. Why Join Us? • Innovative Environment: Be part of projects at the cutting edge of space systems …