Job Description: Scala/Spark
• Strong Big Data resource with the following skill set: Java and Big Data technologies.
• Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.).
• Experience with Big Data technologies; real-time data processing platform (Spark Streaming) experience would be an advantage.
• Consistently demonstrates clear and concise written and verbal communication.
• A history of delivering against agreed objectives.
… Cloudera Data Platform (CDP) and CDP Services, and Big Data knowledge. Proficiency in Terraform for infrastructure as code (IaC). Strong hands-on experience with Cloudera CDP and the Hadoop ecosystem (Hive, Impala, HDFS, etc.). Experience with GitHub Actions or similar CI/CD tools (e.g., Jenkins, GitLab CI). Solid scripting skills in Shell and Python. Extensive experience in designing …
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
… and cloud-based data platforms (AWS, Azure, or GCP). Proven track record in credit risk modelling, fraud analytics, or similar financial domains. Familiarity with big data technologies (Spark, Hive) and MLOps practices for production-scale deployments. Excellent communication skills to engage stakeholders and simplify complex concepts. Desirable extras: experience with regulatory frameworks (e.g., Basel, GDPR) and model explainability …
The team's expertise spans a wide range of technologies, including Java- and Python-based microservices, Data Platform services, AWS/GCP cloud backend systems, Big Data technologies like Hive and Spark, and modern web applications. With a globally distributed presence across the US, India, and Europe, the team thrives on collaboration, bringing together diverse perspectives to solve complex … demonstrate great communication skills. We're excited if you have 7+ years of experience delivering multi-tier, highly scalable, distributed web applications; knowledge of distributed computing frameworks such as Hive/Hadoop, Apache Spark, Kafka, and Airflow; experience working with the programming languages Python, Java, and SQL; experience building ETL (Extraction, Transformation, and Loading) solutions using PySpark; and experience in SQL/…
… working within a fast-paced financial services environment. Key Responsibilities: Design, develop, and maintain applications using Scala, Python, Hadoop, and Java. Work with Big Data technologies, including Spark and Hive (nice to have). Collaborate with cross-functional teams to deliver scalable, high-performance solutions. Participate in code reviews, testing, and performance optimization. Ensure best practices in coding, design, and architecture. Skills & Experience Required: 3-8 years of software development experience. Strong hands-on expertise in Scala (mandatory), plus Python and Java. Experience with Big Data frameworks; Apache Spark experience is an advantage. Solid understanding of software engineering principles, data structures, and algorithms. Strong problem-solving skills and ability to work in an Agile environment. Educational Criteria …
… working within a fast-paced financial services environment. Key Responsibilities: Design, develop, and maintain applications using Scala, Python, Hadoop, and Java. Work with Big Data technologies, including Spark and Hive (nice to have). Collaborate with cross-functional teams to deliver scalable, high-performance solutions. Participate in code reviews, testing, and performance optimization. Ensure best practices in coding, design, and architecture. Skills & Experience Required: 2-6 years of software development experience. Strong hands-on expertise in Scala (mandatory), plus Python and Java. Experience with Big Data frameworks; Apache Spark experience is an advantage. Solid understanding of software engineering principles, data structures, and algorithms. Strong problem-solving skills and ability to work in an Agile environment. Educational Criteria …
Role: Staff AI/ML Scientist – Search, RecSys, Personalization & GenAI (Dubai-based). Join a high-performing Data Science team whose mission is to drive competitive value through scalable AI solutions. The team builds models that enhance user experiences, enable better …