t achieve in the cloud.

BASIC QUALIFICATIONS
- Strong software development skills in multiple languages (Java, Python, R), with extensive ETL and data management experience
- Deep understanding of the Hadoop ecosystem (Sqoop, Kafka, etc.) and SQL-on-Hadoop technologies (Hive, Spark SQL), with proven experience in MPP models
- Demonstrated expertise in GenAI, including LLMs, RAG architectures, vector databases, and production deployment experience
Columbia, South Carolina, United States Hybrid / WFH Options
Systemtec Inc
and Machine Learning: Amazon Bedrock, AWS SageMaker, Unified Studio, R Studio/Posit Workbench, R Shiny/Posit Connect, Posit Package Manager, AWS Data Firehose, Kafka, Hive, Hue, Oozie, Sqoop, Git/Git Actions, IntelliJ, Scala

Responsibilities of the Data Engineer (AWS): Act as an internal consultant, advocate, mentor, and change agent, providing expertise and technical guidance on complex projects.
/or Data Warehouse.
- Experience in leading discovery and design workshops, including estimating, scoping, and delivering customer proposals aligned with Analytics Solutions
- Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro, Parquet, Iceberg, Hudi)
- Experience developing software and data engineering code in one or more programming languages (Java, Python, PySpark, Node, etc.)
- AWS
- Continuous Integration, Cloud technologies, Virtualisation Tools, Monitoring utilities, Disaster recovery processes/tools
- Experience in troubleshooting and problem resolution
- Experience in System Integration
- Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming
- Experience with ETL tools incorporating Big Data
- Shell Scripting, Python

Beneficial Skills:
- Understanding of LAN, WAN, VPN and SD Networks
- Hardware