and design to implementation and support. Experience with designing and implementing high-volume data processing jobs is a plus. Working knowledge of Spark in Hadoop is a plus. Strong database development skills, including advanced SQL and relational and NoSQL database technologies, are a plus. Experience with AWS technologies is preferred. …
software engineer in a globally distributed team working with the Scala and Java programming languages (preferably both). Experience with the big-data technologies Spark/Databricks and Hadoop/ADLS is a must. Experience in any one of the cloud platforms: Azure (preferred), AWS, or Google. Experience building data lakes and data …
data solutions (AWS, Azure, or GCP); engineering languages including Python, SQL, and Java; and pipeline management tools (e.g., Apache Airflow). Familiarity with big data technologies such as Hadoop or Spark. If this opportunity is of interest, or you know anyone who would be interested in this role, please send your CV and …
limited to: Backend technology, Python. Databases like MSSQL. Front-end technology, Java. Cloud platform, AWS. Programming language, JavaScript (React.js). Big data technologies such as Hadoop, Spark, or Kafka. What We Need from You: Essential Skills: A degree in Computer Science, Engineering, or a related field, or equivalent experience. Proficiency …
solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Highly desirable: hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink, etc.). Proven expertise in designing and constructing data lakes and data …
of Python and SQL. Exposure to developing in a cloud platform such as AWS, GCP, or Azure. Knowledge of big data technologies, e.g., Trino, Hadoop, or PySpark. Ability to build trusted and credible relationships with your peers, stakeholders, and customers. Analytical thinker and natural problem solver. If this sounds …
relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases) • Experience with big data technologies such as Hadoop, Hive, Spark, EMR, Snowflake, and Data Mesh principles • Team player • Proactive and resilient • A passion for social good. Our Mission Statement: We are an …
capabilities such as GitHub, Jenkins, and UrbanCode methodologies, demonstrating engineering excellence and a passion for automation. Additionally, data domain technology experience including Teradata, Oracle, Hadoop, and DB2, and familiarity with NoSQL databases such as Cassandra and HBase. Expertise in Cloud Native technologies, including networking & security, is a plus. Understanding how …
Chicago, Illinois, United States Hybrid / WFH Options
Request Technology - Robyn Honquest
CLI and IAM, etc. (required). Experience with distributed message brokers using Kafka (required). Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required). Experience working with various types of databases, like relational, NoSQL, object …
automation & configuration management: Ansible (plus Puppet, SaltStack), Terraform, CloudFormation; NodeJS, React/Material UI (plus Angular), Python, JavaScript; big data processing and analysis, e.g. Apache Hadoop (CDH), Apache Spark; Red Hat Enterprise Linux, CentOS, Debian, or Ubuntu. Java 8, Spring framework (preferably Spring Boot), AMQP RabbitMQ, open source technologies; experience of …
NumPy, Spark). Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Qualifications: Experience with distributed computing platforms (e.g., Hadoop, Apache Kafka). Familiarity with cloud computing services (e.g., AWS, GCP, Azure). Knowledge of financial markets and trading concepts. Previous exposure to DevOps …
transparency and knowledge sharing across the organization. Nice-to-have skills: Dynamics 365 or Azure Data Services. Experience with big data technologies such as Hadoop, Spark, or Kafka. Knowledge of programming languages such as Python, R, or Scala. Experience with data visualization tools such as Power BI, Tableau, or …