Experience with ETL/ELT tools, APIs, and integration platforms. Deep knowledge of data modelling, warehousing, and real-time analytics. Familiarity with big data technology principles (e.g., Spark, Hadoop) and BI tools (e.g., Power BI, Tableau). Strong programming skills (e.g., SQL, Python, Java, or similar languages). Ability to exercise a substantial degree of independent professional responsibility.
e.g., Kubernetes). Preferred Skills: Experience with feature stores (e.g., Feast, Tecton). Knowledge of distributed training (e.g., Horovod, distributed PyTorch). Familiarity with big data tools (e.g., Spark, Hadoop, Beam). Understanding of NLP, computer vision, or time-series analysis techniques. Knowledge of experiment tracking tools (e.g., MLflow, Weights & Biases). Experience with model explainability techniques (e.g., SHAP).
data points per day and create highly available data processing and REST services to distribute data to different consumers across PWM. Technologies used include: Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy (a data management and data governance platform). Programming Languages: Java, Scala, Scripting. Database Technologies: MongoDB, ElasticSearch, Cassandra, MemSQL, Sybase IQ/ASE. Micro Service Technologies: REST, Spring … tech stacks. SKILLS AND EXPERIENCE WE ARE LOOKING FOR: Computer Science, Mathematics, Engineering or other related degree at bachelor's level. Java, Scala, Scripting, REST, Spring Boot, Jersey. Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE. 3+ years of hands-on experience with relevant technologies. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas
automated testing and deployments using CI/CD pipelines. Continual learning through internal and external training. What you'll bring: Mandatory: Proficient in at least two of the following: Hadoop, GCP or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of
to line manage a small team of junior data engineers. Continual learning through internal and external training. What you'll bring: Mandatory: Proficient in at least two of the following: Hadoop, GCP or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of
building finance systems on cloud platforms like AWS S3, Snowflake, etc. Preferred Qualifications: Interest or knowledge in investment banking or financial instruments. Experience with big data concepts, such as Hadoop for Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). ABOUT GOLDMAN SACHS At Goldman Sachs, we dedicate our people
Experience with cloud technologies such as AWS S3, Snowflake, etc. Preferred Qualifications: Interest or knowledge in investment banking or financial instruments. Experience with big data concepts and tools like Hadoop for Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). About Goldman Sachs At Goldman Sachs, we dedicate our people
Birmingham, England, United Kingdom Hybrid / WFH Options
Ripjar
and data science libraries such as PyTorch, scikit-learn, NumPy and SciPy. Good communication and interpersonal skills. Experience working with large-scale data processing systems such as Spark and Hadoop. Experience in software development in agile environments and an understanding of the software development lifecycle. Experience using or implementing ML Operations approaches is valuable. Working knowledge of statistics and
Finance, or similar. Knowledge of modeling techniques (logit, GLM, time series, decision trees, random forests, clustering), statistical programming languages (SAS, R, Python, Matlab), and big data tools and platforms (Hadoop, Hive, etc.). Solid academic record. Strong computer skills. Knowledge of other languages is desirable. Proactive attitude, maturity, responsibility, and strong work ethic. Quick learner. Ability to integrate into