display, video, mobile, programmatic, social, native), considering viewability, interaction, and engagement metrics. Create dashboards and deliver usable insights to help steer product roadmaps. Utilize tools such as SQL, R, Hadoop, and Excel to hypothesize and perform statistical analysis, A/B tests, and experiments to measure the impact of product initiatives on revenue, technical performance, and advertiser and reader engagement. Candidates should have analysis …
Experience with cyber work Understanding of internet applications and protocols Experience with both Windows and Linux Ability to use scripting languages Desired Skills CISSP, GIAC, GPEN, CEH, OSCP SQL, Hadoop, or other DBMS experience Metasploit, Kali/BackTrack, Nmap, Snort, Nessus, NetWitness MS in engineering, computer science, computer forensics, or similar Experience with network analysis tools Working knowledge of …
tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - research and provide solutions to meet future growth or to eliminate occurring or … tools, cloud computing, machine learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using ETL and ELT technologies such as …
enhances complex and diverse Big-Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, Hadoop Eco-System including implementing Java applications, Distributed Computing, Information Retrieval (IR), and Object Oriented Design. Works individually or as part of a team. Reviews and tests software components for … be substituted for a bachelor's degree. A master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one year of Cloud experience. 2. The following Cloud related experiences are required: 3. Two years of Cloud and/or Distributed Computing … on any particular project (U) Make recommendations for improving documentation and software development process standards (U) Serve as a subject matter expert for Cloud Computing and corresponding technologies including Hadoop, assisting the software development team in designing, developing and testing Cloud Computing Systems (U) Debug problems with Cloud based Distributed Computing Frameworks (U) Manage multi-node Cloud based installation …
Annapolis Junction, Maryland, United States Hybrid / WFH Options
Base-2 Solutions, LLC
enhances complex and diverse Big-Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, Hadoop Eco-System including implementing Java applications, Distributed Computing, Information Retrieval (IR), and Object Oriented Design. Works individually or as part of a team. Reviews and tests software components for … substituted for a bachelor's degree. A master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience. Desired Skills Familiarity with agile development methodologies like Scrum is beneficial. Experience with the Elastic stack solutioning … being used on any particular project. Make recommendations for improving documentation and software development process standards. Serve as a subject matter expert for Cloud Computing and corresponding technologies including Hadoop, assisting the software development team in designing, developing and testing Cloud Computing Systems. Debug problems with Cloud based Distributed Computing Frameworks. Manage multi-node Cloud based installation. Delegate programming …
None Preferred education Bachelor's Degree Required technical and professional expertise Design, develop, and maintain Java-based applications for processing and analyzing large datasets, utilizing frameworks such as Apache Hadoop, Spark, and Kafka. Collaborate with cross-functional teams to define, design, and ship data-intensive features and services. Optimize existing data processing pipelines for efficiency, scalability, and reliability. Develop … s degree in Computer Science, Information Technology, or a related field, or equivalent experience. Experience in Big Data Java development. In-depth knowledge of Big Data frameworks, such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing languages like Scala, Python, or …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
This is a fantastic opportunity for a highly skilled and motivated Lead Data Engineer with strong expertise in data architecture, ETL pipelines, cloud technologies and big data solutions - knowledge of Hadoop and HDFS is 100% required! You MUST also have run a team to be considered for this role - it could be a small or medium sized team but you … Delta Lake, Data Lake or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. To be considered for this role you MUST have in-depth experience of Hadoop and HDFS. The role comes with an extensive benefits package including a good pension, good holiday allowance, healthcare, bonus and more! Please note, to be considered for this opportunity … offering sponsorship for this role. KEYWORDS Lead Data Engineer, Senior Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, Cloud, On-Prem, ETL, Azure Data Fabric, ADF, Hadoop, HDFS, Azure Data, Delta Lake, Data Lake Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable …
and Python. • Support the deployment and management of AWS services including EC2, S3, and IAM. • Work with the team to implement and optimize big data processing frameworks such as Hadoop and Spark. • Help with the integration and use of various compute instances for specific data processing needs. • Contribute to the development of tools and algorithms for data analysis and … experience in data engineering • Bachelor's degree in Data Engineering, Data Science, Computer Science, Information Technology, or a related field OR equivalent practical experience. • Basic knowledge of Spark and Hadoop distributed processing frameworks. • Familiarity with AWS services, particularly EC2, S3, and IAM. • Some experience with programming languages such as Scala, PySpark, Python, and SQL. • Understanding of data pipeline development …
Annapolis Junction, Maryland, United States Hybrid / WFH Options
Lockheed Martin
Job Number 699074BR Description: THE WORK This senior role fosters collaboration with other senior engineers for the development of advanced data analytics solutions and agile development projects in support of a high-visibility mission. This position involves providing technical leadership …
are looking for hardworking engineers who have a real passion for building and shipping great software, especially with open-source products, and for leading and contributing to transformational projects around Hadoop and related products. You will be responsible for developing features and frameworks with a focus on scenarios over Hadoop, Spark, and Iceberg. eBay has one of the largest … Hadoop deployments in the world, with over 12,000 nodes and 500+ petabytes of configured storage. Desirable backgrounds would include strong Java development with technical experience on the Big Data stack, with a passion for innovating in and around Big Data ecosystems. The Cool Part: Be part of the mission to contribute to and pave the path for the next generation of Hadoop … problem-solving and analytical skills on both defect and performance issue resolution. Proven results-oriented individual with a delivery focus in a high-velocity, high-quality environment. Experience in Hadoop ecosystems such as Hadoop, Spark, Iceberg, YuniKorn, etc. Basic Qualifications: A master's degree in Computer Science or a bachelor's degree with significant proven experience is required Strong Java …
Enterprise class arrays. • Solid State Disk (SSD). • NFS/CIFS based server/storage appliance. • HPSE. • Data Domain and similar deduplication products. • Cloud based storage solutions such as Hadoop and IBM BigInsights. • Trouble ticket management utilizing Remedy. Requirements IAT Level II Certification Required EQUAL OPPORTUNITY EMPLOYER VETERANS DISABLED …
network Enterprise class arrays Solid State Disk (SSD) NFS/CIFS based server/storage appliance HPSE Data Domain and similar deduplication products Cloud based storage solutions such as Hadoop and IBM BigInsights Trouble ticket management utilizing Remedy The Benefits Package Wyetech believes in generously supporting employees as they prepare for retirement. The company automatically contributes 20% of each …
with C/C++, Scala, Groovy, Python, and/or shell scripting JavaScript development experience with Angular, React, ExtJS and/or Node.js Experience with distributed computing technologies including Hadoop, HBase, Cassandra, Elasticsearch and Apache Spark a plus Hands-on experience working with Elasticsearch, MongoDB, Node, Hadoop, MapReduce, Spark, RabbitMQ, and NiFi. DevOps experience …
Jenkins, Ansible, Docker, Kubernetes, etc. Desired Experience Knowledge with the following Big Data technologies: Data Ingest (JSON, Kafka, Microservices, Elasticsearch), Analytics (HIVE, SPARK, R, PIG, OOZIE workflows), Elasticsearch, Hadoop (HIVE data, OOZIE, Spark, PIG, IMPALA, HUE), COTS Integration (Knowi, MongoDB, Oracle, MySQL RDS, Elastic, Logstash, Kibana, Zookeeper, Consul, Hadoop/HDFS), Docker and Chef …
Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
the integration of technical systems and ensuring alignment with NCIJTF's technical and analytical requirements. Candidates with a background in data science, cloud technologies (preferably AWS), big data technologies (Hadoop and Spark), and system integration are highly sought after for this position. Responsibilities include: Technical liaison: Lead technical liaison activities between DC3 and NCIJTF to facilitate communication … Big Data-Specialty, Cloudera Certified Professional, or CCSP. Practical experience managing big data solutions, particularly in high-volume analytics and data processing environments. Familiarity with specific tools such as Hadoop, Spark, and/or Apache Kafka. Expertise in one or more of the following metrics tools (e.g., Google Analytics, Tableau), analytics platforms (e.g., SAS, Power BI), or AI tools …
London, South East, England, United Kingdom Hybrid / WFH Options
Mexa Solutions LTD
data sources, including SQL and NoSQL databases. Implementing and optimizing data warehouse solutions and ETL/ELT pipelines for analytics and reporting. Working with big data ecosystems such as Hadoop, Spark, and Kafka to build scalable solutions. What you’ll bring... Strong expertise in SQL and NoSQL technologies, such as Oracle, PostgreSQL, MongoDB, or similar. Proven experience with data … warehousing concepts and ETL/ELT tools. Knowledge of big data platforms and streaming tools like Hadoop, Spark, and Kafka. A deep understanding of scalable data architectures, including high availability and fault tolerance. Experience working across hybrid or cloud environments. Excellent communication skills to engage both technical teams and senior stakeholders. What’s in it for you... This is …
About Agoda Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more.
with ServiceNow and Splunk Experience supporting IC or DoD in the Cyber Security Domain Familiarity with the RMF process Experience with Relational Database Management Systems (RDBMS) Experience with Apache Hadoop and the Hadoop Distributed File System Experience with Amazon Elastic MapReduce (EMR) and SageMaker Experience with Machine Learning or Artificial Intelligence Travel Security Clearance Top Secret/SCI …
Annapolis Junction, Maryland, United States Hybrid / WFH Options
Lockheed Martin
Job ID: 703355BR Date posted: Aug. 27, 2025 Description: THE WORK This senior role fosters collaboration with other senior engineers for the development of advanced data analytics solutions and agile development projects in support of a high-visibility mission. This …