build solutions that offer a best-in-class experience to our risk managers and analysts. Our technology stack also includes Sybase IQ, MongoDB, Presto on Hadoop clusters, Elasticsearch, Kafka, Angular, React and Tableau. We do extensive domain modelling using the firm-developed open-source platform called Alloy. more »
would be an advantage. Data visualization – tools like Tableau. Master data management (MDM) – concepts and expertise in tools like Informatica & Talend MDM. Big data – Hadoop eco-system, distributions like Cloudera/Hortonworks, Pig and Hive. Data processing frameworks – Spark & Spark Streaming. Hands-on experience with multiple databases like PostgreSQL more »
learn). Understanding of database technologies (ETL) and SQL proficiency for data manipulation, data mining and querying. Knowledge of Big Data tools (Spark or Hadoop a plus). Power BI dashboard design/development. Regulatory Awareness/Compliance: uphold regulatory/compliance requirements relevant to your role, escalating areas more »
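As a minimal illustration of the SQL-for-data-manipulation skill the listing above asks for, here is a small aggregation query run through Python's built-in sqlite3 module (the table and column names are invented for the example, not taken from any listing):

```python
import sqlite3

# In-memory database with a small, invented trades table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("ABC", 100, 9.5), ("ABC", 50, 10.0), ("XYZ", 200, 4.0)],
)

# Aggregate notional value per symbol -- a typical querying task.
rows = conn.execute(
    "SELECT symbol, SUM(qty * price) AS notional "
    "FROM trades GROUP BY symbol ORDER BY symbol"
).fetchall()
print(rows)  # [('ABC', 1450.0), ('XYZ', 800.0)]
```

The same GROUP BY/aggregate pattern carries over directly to warehouse engines and to SQL-on-Hadoop tools such as Hive.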
experience, Experience with cloud computing platforms such as AWS, Azure, or GCP (Google Cloud Platform). Familiarity with big data technologies such as Apache Hadoop, Spark, or Kafka. Experience deploying machine learning models in production environments. Contributions to open-source machine learning projects or research publications in relevant conferences more »
software engineer in a globally distributed team working with the Scala and Java programming languages (preferably both). Experience with big-data technologies Spark/Databricks and Hadoop/ADLS is a must. Experience with any one of the cloud platforms: Azure (preferred), AWS or Google. Experience building data lakes and data more »
data solutions (AWS, Azure or GCP), engineering languages including Python, SQL and Java, and pipeline management tools, e.g. Apache Airflow. Familiarity with big data technologies such as Hadoop or Spark. If this opportunity is of interest, or you know anyone who would be interested in this role, please send your CV and more »
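The extract-transform-load pattern behind the pipeline tooling mentioned above can be sketched in plain Python. The function names, sample records and the temperature-conversion transform are invented for illustration; in practice each step would be wired into an orchestrator such as Apache Airflow rather than called directly:

```python
import json

def extract():
    # Stand-in for reading from a source system (file, API, database).
    raw = '[{"city": "London", "temp_c": 14}, {"city": "Leeds", "temp_c": 11}]'
    return json.loads(raw)

def transform(records):
    # Derive a Fahrenheit field; silently drop malformed records.
    out = []
    for r in records:
        if "temp_c" in r:
            out.append({**r, "temp_f": r["temp_c"] * 9 / 5 + 32})
    return out

def load(records, sink):
    # Stand-in for writing to a warehouse table.
    sink.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0]["temp_f"])
```

An orchestrator adds what this sketch lacks: scheduling, retries, and dependency ordering between the three steps.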
solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g. Spark), and batch/streaming data pipelines (e.g. Beam, Flink etc.). Proven expertise in designing and constructing data lakes and data more »
relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases) • Experience with big data technologies such as: Hadoop, Hive, Spark, EMR, Snowflake, and Data Mesh principles • Team player • Proactive and resilient • A passion for social good Our Mission Statement: We are an more »
NumPy, Spark). Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Qualifications: Experience with distributed computing platforms (e.g., Hadoop, Apache Kafka). Familiarity with cloud computing services (e.g., AWS, GCP, Azure). Knowledge of financial markets and trading concepts. Previous exposure to DevOps more »
experience in ETL technical design, automated data quality testing, QA, documentation, data warehousing, data modelling, and data wrangling. Proficiency in RDBMS, ETL pipelines, Python, Hadoop, SQL, and a solid grasp of modern code development practices. Ability to manage multiple data and analytic systems with an awareness of decentralised data more »
a day of applied data science classroom training. Involvement with diverse Big Data tools such as Spark/Kafka/NiFi/HDFS/Hadoop/Docker/ELK/AWS. Great work environment. Assignments to enhance your classroom learning. Personalized coaching from our SMEs. Real-life scenarios. Industry more »
/Kotlin. Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and provisioning shouldn't more »
and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau more »
a sense of trust with stakeholders. Preferred qualifications, capabilities and skills Experience with deep learning frameworks (PyTorch, TensorFlow) Experience with big-data technologies (Spark, Hadoop) or distributed computation frameworks (Dask, Modin) Hands-on experience with Natural Language Processing (NLP) and Large Language Models (LLMs) Experience of creating and deploying more »
Greater London, England, United Kingdom Hybrid / WFH Options
Oliver Bernard
essential: -Proven experience as an Architect and excellent knowledge of Big Data -Great understanding of Cloud e.g. Azure and/or AWS -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark etc -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases • Programming with Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits : • Base Salary: £45,000 - £75,000 (DoE) • Discretionary Bonus: circa 10% per annum • DV Bonus: circa £5,000 • Flex Fund: £5,000 • Health: Private more »
The following skills/experience is essential: -Proven experience as a Lead Big Data Engineer with excellent knowledge of Big Data -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark etc -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
Greater London, England, United Kingdom Hybrid / WFH Options
Oliver Bernard
following skills/experience is essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent experience across Azure -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark etc -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
within a typical retail trading environment is key. Experience required: A background in leveraging hands-on skills using tools such as Python, R, Spark, Hadoop, SQL and cloud-based platforms such as GCP, Azure and AWS to manipulate and analyse various data sets in large volumes Background in data more »
scalable systems. Deep knowledge of distributed and scalable systems, including proficiency with PostgreSQL, Ray, RabbitMQ, and Cassandra. Familiarity with big data technologies such as Hadoop, Spark, or Kafka. Experience with CI/CD. Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems. Excellent more »
scalability, and extensibility. The following skills/experience is essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark etc -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
decision-making skills Exposure to working with REST APIs Any of the following skills would be an added bonus: Has run code across Hadoop/MapReduce clusters Has code running in a production environment Used SAS before (or at least can decipher SAS code) Worked with very large more »
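Running code across Hadoop/MapReduce clusters, as the bullet above mentions, is commonly done via Hadoop Streaming, where the mapper and reducer are plain scripts. A minimal word-count sketch of the two phases in pure Python (the cluster submission command and the shuffle phase are omitted; the sample input is invented):

```python
from collections import defaultdict

def mapper(lines):
    # Emit (word, 1) pairs, as a streaming mapper would write to stdout.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Sum counts per key; on a real cluster the shuffle phase
    # groups pairs by key before they reach the reducer.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

result = reducer(mapper(["SAS code runs", "code runs fast"]))
print(result)  # {'sas': 1, 'code': 2, 'runs': 2, 'fast': 1}
```

On an actual cluster the same logic would be split into two stdin/stdout scripts and passed to `hadoop jar hadoop-streaming.jar -mapper ... -reducer ...`.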