development of AI/ML algorithms, such as natural language processing in a production environment • Experience configuring and utilizing data management tools, such as Hadoop, MapReduce, or similar. • Ability to translate complex, technical findings into an easily understood summary in graphical, verbal, or written forms • Must have an active
metadata management skills. Ability to communicate effectively with diverse stakeholders, both technical and non-technical, across different seniority levels. Desired Skills & Certifications: Experience with Hadoop technologies and integrating them into ETL (Extract, Transform, Load) data pipelines. Familiarity with Apache Tika for metadata and text extraction from various document types.
Marlborough, Massachusetts, United States Hybrid / WFH Options
SMART DESIGN
and statistical modeling Experience with programming languages (Python, R, SQL) Strong understanding of data visualization tools (Tableau, Power BI) Experience with big data technologies (Hadoop, Spark) Strong understanding of data mining techniques and data modeling Excellent problem-solving and communication skills
AWS SQL APIs Linux Geospatial tools/data Desired Skills: Agile experience delivering on agile teams (participates in scrum and PI Planning) Docker, Jenkins, Hadoop/Spark, Kibana, Kafka, NiFi, Elasticsearch
other engineers on the team to elevate technology and consistently apply best practices. Qualifications for Software Engineer Hands-on experience working with technologies like Hadoop, Hive, Pig, Oozie, MapReduce, Spark, Sqoop, Kafka, Flume, etc. Strong DevOps focus and experience building and deploying infrastructure with cloud deployment technologies like
East London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
algorithms and training techniques . Experience deploying models in production environments. Nice to Have: Experience in GenAI/LLMs Familiarity with distributed computing tools (Hadoop, Hive, Spark). Background in banking, risk management, or capital markets . Why Join? This is a unique opportunity to work at the forefront
Tableau, Looker, or QlikSense . Ability to create well-documented, scalable, and reusable data solutions. Desirable Skills Experience with big data technologies such as Hadoop, MapReduce, or Spark. Exposure to microservice-based data APIs . Familiarity with data solutions in other public cloud platforms . AWS certifications (e.g., Solutions
experience in cloud architecture and implementation Bachelor's degree in Computer Science, Engineering, related field, or equivalent experience Experience in database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) Experience in consulting, design, and implementation of serverless distributed solutions Experience in software development with object-oriented language PREFERRED QUALIFICATIONS AWS
Java or Python); Software collaboration and revision control (e.g., Git or SVN). Desired skills and experiences: Elasticsearch/Kibana Cloud computing (e.g., AWS) Hadoop/Spark etc. Graph Databases Educational level: Master's Degree
years or more experience; PhD and 15 years or more experience 5 years of hands-on experience with Big Data applications, such as Hadoop and associated applications (e.g., administration, CM, monitoring, performance tuning, etc.) Experience working in a Government mission Data Exploitation environment (e.g., acquiring data, storing data, processing
Skills: Proficiency with core technical concepts, including data structures, storage systems, cloud infrastructure, and front-end frameworks. Also, familiarity with technologies like Oracle, PostgreSQL, Hadoop, Spark, AWS, or Azure. • Programming Proficiency: Expertise in programming languages such as Java, C++, Python, JavaScript, or similar. • User-Centered Approach: Understanding of how
ML libraries, such as scikit-learn, Experience using data visualization tools Preferred Skills : Experience working with GPUs to develop models Experience with MapReduce programming (Hadoop) Skills with programming languages, such as Java or C/C++ Demonstrated ability to develop experimental and analytic plans for data modeling processes, use
level security clearance preferred. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with Kubernetes (or vendor flavor of Kubernetes) Experience with CI/CD pipelines (e.g. Gitlab
within a highly secure environment Expertise with databases and data warehouse technologies such as SQL DBs, NoSQL DBs, and other storage options such as Hadoop Demonstrated experience developing and implementing logical workflows for complex decision matrices Demonstrated experience with time management and working on multiple initiatives Demonstrated experience
Master's degree in statistics, data science, or an equivalent quantitative field. - Experience using Cloud Storage and Computing technologies such as AWS Redshift, S3, Hadoop, etc. - Experience with theory and practice of information retrieval, data science, machine learning and data mining. - Experience with theory and practice of design of
services. developing and deploying web services. working with open-source resources in a government computing environment Big data technologies such as Accumulo, Spark, Hive, Hadoop, Elasticsearch Strong Linux skills and familiarity with hybrid cloud/on-prem architecture, AWS, C2S, OpenShift, etc. Can work independently in a fast-paced
clearance and SCI eligibility are strongly preferred. Comfortable working with Linux systems on a daily basis. Experience maintaining data pipelines. Cloud technologies such as: Hadoop, Kafka, HBase, Accumulo. Interest in data mining, analytics, and/or machine learning. Familiarity with Intelligence Community and DoD mission sets. CompTIA Security+ certification
experience in cloud architecture and implementation Bachelor's degree in Computer Science, Engineering, related field, or equivalent experience Experience in database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) Experience in consulting, design and implementation of serverless distributed solutions Experience in software development with object-oriented language AWS experience preferred
london, south east england, united kingdom Hybrid / WFH Options
Kantar Media
broad IT skill set, including hands-on experience with Linux, AWS, Azure, Oracle 19 (admin), Tomcat, UNIX tools, Bash/sh, SQL, Python, Hive, Hadoop/HDFS, and Spark. Work within a modern cloud DevOps environment using Azure, Git, Airflow, Kubernetes, Helm, and Terraform. Demonstrate solid knowledge of computer
understanding of Java and its ecosystems, including experience with popular Java frameworks (e.g. Spring, Hibernate). Familiarity with big data technologies and tools (e.g. Hadoop, Spark, NoSQL databases). Strong experience with Java development, including design, implementation, and testing of large-scale systems. Experience working on public sector projects