Telford, Shropshire, United Kingdom Hybrid / WFH Options
Randstad Technologies Recruitment
… its data science/ML libraries; Agile delivery experience with task estimation and dependency management; excellent communication and collaboration skills. Bonus if you have: Hadoop/Jenkins experience; Azure or AWS certifications; familiarity with Java. Why Join? This employer offers more than just a job - it's a place …
… and rapidly growing finance user base, come join us!
BASIC QUALIFICATIONS
- 4+ years of data engineering experience
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience with SQL
- Experience with data modeling, warehousing, and building ETL pipelines
PREFERRED QUALIFICATIONS
- Experience with AWS technologies like Redshift, S3 …
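Several of these listings pair Spark with Hive, EMR, and ETL-pipeline work. As a rough illustration only (the bucket paths, column names, and aggregation are invented, not taken from any listing), a minimal PySpark batch ETL job of the kind these roles describe might look like:

    # Hypothetical PySpark ETL sketch: read raw JSON from S3,
    # clean and aggregate it, and write a warehouse-style Parquet table.
    # All paths and column names are invented for illustration.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: raw events landed in S3 (hypothetical bucket)
    raw = spark.read.json("s3://example-bucket/raw/orders/")

    # Transform: filter to completed orders and roll up revenue by day
    daily = (
        raw.filter(F.col("status") == "complete")
           .withColumn("order_date", F.to_date("created_at"))
           .groupBy("order_date")
           .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
    )

    # Load: partitioned Parquet that a Hive metastore could register
    daily.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-bucket/warehouse/daily_orders/"
    )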
… senior management. Proficiency in Python or R for advanced analytics and data processing. Familiarity with cloud platforms (AWS, GCP) or big data tools (Spark, Hadoop). Nice-to-have: previous experience in the travel-tech industry is a plus. Why Join PassNfly? Impact: Shape data systems that will improve …
… /SQL, DDL, MDX, HiveQL, SparkSQL, Scala); experience with one or more scripting languages (e.g., Python, KornShell); experience with big data technologies such as Hadoop, Hive, Spark, EMR; experience as a data engineer or in a related specialty (e.g., software engineer, business intelligence engineer, data scientist) with a track record of …
… Knowledge of cloud platforms (e.g., Azure). Familiarity with containerization (e.g., Docker, Kubernetes) is a plus. Knowledge of big data technologies (e.g., Hadoop, Spark). Knowledge of data lifecycle management. Strong problem-solving skills and attention to detail. Ability to work in an agile development environment. Excellent …
… Java, Scala, Python, and Golang. Supporting experience with database technologies such as PostgreSQL. Supporting experience with cloud technologies such as Hadoop, Kafka, HBase, and Accumulo. Experienced with the full software development lifecycle. PREFERRED SKILLS AND QUALIFICATIONS: CI/CD pipelines and tooling (GitLab CI/CD, ArgoCD …
Excellent communication skills. Familiarity with Python and its data, numerical, and machine learning libraries. It would be great if you also had: Experience with Hadoop and Jenkins. Azure and AWS certifications. Familiarity with Java. What we do for you: At Leidos, we are passionate about customer success, united as …
… technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.). Preferred qualifications, capabilities, and skills: knowledge of AWS; knowledge of Databricks; understanding of Cloudera Hadoop, Spark, HDFS, HBase, Hive; understanding of Maven or Gradle. About the Team: J.P. Morgan is a global leader in financial services, providing strategic advice …
… data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience mentoring team members on best practices
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience operating large data warehouses
Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central …
Columbia, Maryland, United States Hybrid / WFH Options
SRC
… and ability to travel up to 25% of the time to customer sites located in Hawaii. Preferred Requirements: experience with big data technologies like Hadoop, Spark, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc.; experience with containers (EKS, Diode), CI/CD, and Terraform; willingness to perform some on …
… or a related field plus 5 years of related experience. Prior experience must include: Build a big data engineering framework using open-source platforms like Spark & Hadoop; Work on tools including SQL and AWS public cloud services; Utilize Agile software development methodologies; Utilize data analytics on complex data structures and using …
… per week in Columbia, MD. Flexibility is key to accommodate any schedule changes per the customer. Preferred Requirements: experience with big data technologies like Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers and Kubernetes is a plus. Work could possibly require some on-call …
Alexandria, Virginia, United States Hybrid / WFH Options
Metronome LLC
… willing/able to help open/close the workspace during regular business hours as needed. Desired Skills: experience with big data technologies like Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers and Kubernetes is a plus. All candidates will be required to be …
Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
… offers experience in the following required task areas: experience with machine learning, statistical modeling, time-series forecasting, and/or geospatial analytics; experience with Hadoop, Spark, or other parallel storage/computing processes is a plus; experience supporting the cyber mission. Overview: Noblis and our wholly owned subsidiaries, Noblis ESI …
… Work with relational databases such as MySQL and MariaDB; use tools like pgAdmin for data management
• Engage with big data tools such as Apache Hadoop, Kafka, HDFS, HBase, and ZooKeeper
Required Skills:
• Strong experience with Java and Apache NiFi is required
• Proven ability to develop and support complex backend …
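To give a concrete, hedged sense of the Kafka piece of this stack (nothing below comes from the listing itself; the broker address and topic name are assumptions), producing and then consuming a message with the kafka-python client looks roughly like:

    # Hypothetical kafka-python sketch: publish one event, then read it back.
    from kafka import KafkaConsumer, KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")  # assumed broker
    producer.send("example-events", b'{"id": 1, "action": "created"}')
    producer.flush()  # block until the message is actually delivered

    consumer = KafkaConsumer(
        "example-events",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",   # start from the beginning of the topic
        consumer_timeout_ms=5000,       # stop iterating when no new messages arrive
    )
    for message in consumer:
        print(message.topic, message.offset, message.value)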
… web framework for the backends and React for developing the client-facing portion of the application. Create extract, transform, and load (ETL) pipelines using Hadoop and Apache Airflow for various production big data sources to fulfill intelligence data availability requirements. Automate retrieval of data from various sources via API …
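Since this listing's ETL duties center on Apache Airflow, here is a minimal DAG sketch of the extract-then-transform shape it describes (assuming Airflow 2.4+; the endpoint, task logic, and names are all hypothetical):

    # Hypothetical Airflow DAG: fetch records from an API, then transform them.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def fetch_from_api(**context):
        # Placeholder for the API retrieval this role would automate
        import requests
        resp = requests.get("https://api.example.com/records")  # invented endpoint
        resp.raise_for_status()
        context["ti"].xcom_push(key="records", value=resp.json())

    def transform(**context):
        records = context["ti"].xcom_pull(task_ids="extract", key="records")
        # Placeholder transform; a real pipeline might stage data into HDFS here
        print(f"transforming {len(records)} records")

    with DAG(
        dag_id="example_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=fetch_from_api)
        load = PythonOperator(task_id="transform", python_callable=transform)
        extract >> load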
… of experience in data engineering or related work.
- Proficiency in Java, AWS, Python, Apache Spark, Linux, Git, Maven, and Docker.
- Experience maintaining an Apache Hadoop ecosystem using tools like HBase, MapReduce, and Spark.
- Knowledge of ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow.
- Experience with AWS …
… Service Desk, etc.). Provide end-user case support as needed. Provide dataset analysis, development, error handling, version upgrades, and API enhancement. Maintain the Apache Hadoop ecosystem, especially utilizing HBase, MapReduce, and Spark. ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow. AWS services such …
… Domain Awareness, Satellite Conjunction Analysis, SIGINT, IMINT, or GEOINT. Experience with infrastructure-as-code tools (Ansible, Terraform). Experience with big data technologies (Spark, Cassandra, Hadoop). Experience with Java, Python, or Scala. Experience with Docker, Kubernetes, Bitbucket, Git …
… delivering projects with a commercial mindset. Prior experience with event sourcing (Kafka, Akka, Spark) and data-distribution-based architecture. Experience with NoSQL (Mongo, Elastic, Hadoop), in-memory (MemSQL, Ignite), and relational (Sybase, DB2, Sybase IQ) data store solutions. Strong knowledge of data structures, algorithms, and design patterns. Experience in data …
Columbia, Maryland, United States Hybrid / WFH Options
Codescratch LLC
… of Agile software development methodologies and use of standard software development tool suites. Preferred Skills and Experience: experience with Docker and Kubernetes; experience with Hadoop; experience with Spark; experience with Accumulo; experience monitoring application performance with metrics (Prometheus, InfluxDB, Grafana) and logs with the ELK stack (Elasticsearch, Logstash, Kibana); experience …
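As a small, hedged sketch of the metrics half of that monitoring requirement (the metric names and fake workload are invented for illustration), instrumenting a Python service for Prometheus scraping with prometheus_client looks like:

    # Hypothetical prometheus_client sketch: expose a request counter and
    # latency histogram on an HTTP endpoint that Prometheus can scrape.
    import random
    import time

    from prometheus_client import Counter, Histogram, start_http_server

    REQUESTS = Counter("app_requests_total", "Total requests handled")
    LATENCY = Histogram("app_request_seconds", "Request latency in seconds")

    def handle_request():
        with LATENCY.time():                   # records how long the block takes
            time.sleep(random.random() / 10)   # stand-in for real work
        REQUESTS.inc()

    if __name__ == "__main__":
        start_http_server(8000)  # metrics served at localhost:8000/metrics
        while True:
            handle_request()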
… background in agile delivery and effort estimation; familiarity with Python and data libraries; excellent communication and problem-solving skills. Nice to have: experience with Hadoop, Jenkins; cloud certifications (Azure or AWS); basic knowledge of Java. This is a 6-month rolling contract with a daily rate of up to £500. Sponsorship …
… Pig is highly desired
• Experience with data science
• Experience with graph algorithms
• Experience with machine learning
• Experience with AWS
• Cloud development experience such as Hadoop, big data (CloudBase/Accumulo and BigTable), as well as JSON/BSON
• Experience with analytic development
• Experience with Python and streaming capabilities …