build solutions that offer a best-in-class experience to our risk managers and analysts. Our technology stack also includes Sybase IQ, MongoDB, Presto on Hadoop clusters, Elasticsearch, Kafka, Angular, React and Tableau. We do extensive domain modelling using the firm-developed open-source platform called Alloy. more »
would be an advantage. Data visualization – tools like Tableau. Master data management (MDM) – concepts and expertise in tools like Informatica & Talend MDM. Big data – Hadoop ecosystem, distributions like Cloudera/Hortonworks, Pig and Hive. Data processing frameworks – Spark & Spark Streaming. Hands-on experience with multiple databases like PostgreSQL more »
Technical Discipline. Technical Expertise: Proficiency in SQL and experience with cloud-based data pipelines (Azure, AWS, GCP). Familiarity with big data tools like Hadoop and Spark. Data Management Skills: Hands-on experience working with large data sets, data pipelines, workflow management tools, and Azure cloud services. Exposure to more »
learn). Understanding of database technologies (ETL) and SQL proficiency for data manipulation, data mining and querying. Knowledge of big data tools (Spark or Hadoop a plus). Power BI, dashboard design/development. Regulatory Awareness/Compliance: uphold regulatory/compliance requirements relevant to your role, escalating areas more »
experience, Experience with cloud computing platforms such as AWS, Azure, or GCP (Google Cloud Platform). Familiarity with big data technologies such as Apache Hadoop, Spark, or Kafka. Experience deploying machine learning models in production environments. Contributions to open-source machine learning projects or research publications in relevant conferences more »
software engineer in a globally distributed team working with the Scala and Java programming languages (preferably both). Experience with big-data technologies Spark/Databricks and Hadoop/ADLS is a must. Experience with at least one cloud platform: Azure (preferred), AWS or Google Cloud. Experience building data lakes and data more »
data solutions (AWS, Azure or GCP), engineering languages including Python, SQL, Java, and pipeline management tools e.g., Apache Airflow. Familiarity with big data technologies, Hadoop, or Spark. If this opportunity is of interest, or you know anyone who would be interested in this role, please send your CV and more »
limited to: Backend technology, Python. Databases like MSSQL. Front-end technology, Java. Cloud platform, AWS. Programming language, JavaScript (React.js) Big data technologies such as Hadoop, Spark, or Kafka. What We Need from You: Essential Skills: A degree in Computer Science, Engineering, or a related field, or equivalent experience. Proficiency more »
solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Highly desirable: hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink, etc.). Proven expertise in designing and constructing data lakes and data more »
relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases) • Experience with big data technologies such as: Hadoop, Hive, Spark, EMR, Snowflake, and Data Mesh principles • Team player • Proactive and resilient • A passion for social good Our Mission Statement: We are an more »
capabilities such as GitHub, Jenkins and UrbanCode methodologies, demonstrating engineering excellence and a passion for automation Additionally, data domain technology experience including: Teradata, Oracle, Hadoop, DB2 and familiarity with NoSQL databases such as Cassandra and HBase Expertise in Cloud Native technologies including networking & security is a plus Understanding how more »
automation & configuration management; Ansible (plus Puppet, SaltStack), Terraform, CloudFormation; NodeJS, React/Material UI (plus Angular), Python, JavaScript; Big data processing and analysis, e.g. Apache Hadoop (CDH), Apache Spark; RedHat Enterprise Linux, CentOS, Debian or Ubuntu. Java 8, Spring framework (preferably Spring Boot), AMQP RabbitMQ, open-source technologies; Experience of more »
NumPy, Spark). Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Qualifications: Experience with distributed computing platforms (e.g., Hadoop, Apache Kafka). Familiarity with cloud computing services (e.g., AWS, GCP, Azure). Knowledge of financial markets and trading concepts. Previous exposure to DevOps more »
experience in ETL technical design, automated data quality testing, QA, documentation, data warehousing, data modelling, and data wrangling. Proficiency in RDBMS, ETL pipelines, Python, Hadoop, SQL, and a solid grasp of modern code development practices. Ability to manage multiple data and analytic systems with an awareness of decentralised data more »
/Kotlin. Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and provisioning shouldn't more »
London, England, United Kingdom Hybrid / WFH Options
Global Relay
Bitbucket, Maven, Helm Instrumentation and monitoring: Splunk, Prometheus, Grafana, ELK Languages and frameworks: Bash, Java, Groovy, Go, Python, React Big data technologies: Cassandra, ArangoDB, Hadoop, Kafka, MongoDB, MinIO, Ceph About You: You have an automation-first mindset. You enjoy sharing your knowledge and have a thirst for expanding your more »
the following platforms: MySQL or Cassandra. Experience of developing and deploying applications into AWS or a private cloud. Exposure to any of the following: Hadoop, JMS, Zookeeper, Spring, JavaScript, UI Development. Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our more »
and classification techniques, and algorithms Fluency in a programming language (Python, C,C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau more »
Bedford, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
PyTorch etc.) MLOps experience Nice to have: Familiarity with Git or other Version Control Systems Computer Vision Library exposure Understanding of Big Data Technologies (Hadoop, Spark etc) Experience with Cloud platforms (AWS, GCP or Azure) This is a fully remote role, but may require very occasional travel (once a more »
Northampton, Northamptonshire, East Midlands, United Kingdom Hybrid / WFH Options
Dupen Ltd
Linux, APIs, infrastructure design (load balancing, VMs), PostgreSQL, vector DBs. Senior ML Learning Engineer desirable skills: version control (Git), computer vision libraries, Big Data (Hadoop, Spark), cloud (AWS, Google Cloud, Azure), and a knowledge of secure coding techniques (PCI-DSS, PA-DSS, ISO27001). This is a fantastic opportunity to more »
table and be open to expanding your skills further. With industrial experience in AWS/GCP/Azure and familiarity with data products like Hadoop, Spark, and PostgreSQL, you'll thrive in our data-driven environment. Your problem-solving skills and meticulous attention to detail will make you a more »
a sense of trust with stakeholders. Preferred qualifications, capabilities and skills Experience with deep learning frameworks (PyTorch, TensorFlow) Experience with big-data technologies (Spark, Hadoop) or distributed computation frameworks (Dask, Modin) Hands-on experience with Natural Language Processing (NLP) and Large Language Models (LLMs) Experience of creating and deploying more »
Jenkins and UrbanCode methodologies, demonstrating engineering excellence and a passion for automation Additionally, data domain technology experience including: Database technology such as Teradata, Oracle, Hadoop, DB2 and familiarity with NoSQL databases such as Cassandra and HBase Expertise in Container & Virtualization/Hypervisor technologies such as K8s, Firecracker/gVisor more »
Cheltenham, England, United Kingdom Hybrid / WFH Options
Ripjar
the nuances of dealing with structured and unstructured data, and be experienced in using databases (Mongo ideally) Experience with Linux Experience with Spark (Pyspark), Hadoop or other Big data technologies would be beneficial, but not required Benefits Why we think you'll enjoy it here: Base Salary of up more »