ELT tools, APIs, and integration platforms. Deep knowledge of data modelling, warehousing, and real-time analytics. Familiarity with big data technology principles (e.g., Spark, Hadoop) and BI tools (e.g., Power BI, Tableau). Strong programming skills (e.g., SQL, Python, Java, or similar languages). Ability to exercise a substantial …
Experience with feature stores (e.g., Feast, Tecton). Knowledge of distributed training (e.g., Horovod, distributed PyTorch). Familiarity with big data tools (e.g., Spark, Hadoop, Beam). Understanding of NLP, computer vision, or time-series analysis techniques. Knowledge of experiment tracking tools (e.g., MLflow, Weights & Biases). Experience with …
CI/CD pipelines. Continual learning through internal and external training. What you'll bring Mandatory Proficient in at least two of the following: Hadoop, GCP or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and …
of junior data engineers. Continual learning through internal and external training. What you'll bring Mandatory Proficient in at least two of the following: Hadoop, GCP or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and …
platforms like AWS S3, Snowflake, etc. Preferred Qualifications Interest or knowledge in investment banking or financial instruments. Experience with big data concepts, such as Hadoop for Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). ABOUT GOLDMAN SACHS At Goldman …
as AWS S3, Snowflake, etc. Preferred Qualifications Interest or knowledge in investment banking or financial instruments. Experience with big data concepts and tools like Hadoop for Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). About Goldman Sachs At Goldman …
Birmingham, England, United Kingdom Hybrid / WFH Options
Ripjar
as PyTorch, scikit-learn, numpy and scipy Good communication and interpersonal skills Experience working with large-scale data processing systems such as Spark and Hadoop Experience in software development in agile environments and an understanding of the software development lifecycle Experience using or implementing ML Operations approaches is valuable …
modeling techniques (logit, GLM, time series, decision trees, random forests, clustering), statistical programming languages (SAS, R, Python, Matlab), and big data tools and platforms (Hadoop, Hive, etc.). Solid academic record. Strong computer skills. Knowledge of other languages is desirable. Proactive attitude, maturity, responsibility, and strong work ethic. Quick …
create highly available data processing and REST services to distribute data to different consumers across PWM. Technologies used include: Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy - a data management and data governance platform Programming Languages: Java, Scala, Scripting Database Technologies: MongoDB, ElasticSearch, Cassandra, MemSQL, Sybase IQ/ASE … WE ARE LOOKING FOR Computer Science, Mathematics, Engineering or other related degree at bachelors level Java, Scala, Scripting, REST, Spring Boot, Jersey Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE 3+ years of hands-on experience on relevant technologies ABOUT GOLDMAN SACHS At Goldman Sachs, we commit …