industry leader by exploring innovative technologies, languages, and techniques in a rapidly evolving world. Technologies We Use: Development languages/frameworks: Java/Scala, Apache Spark, Kafka, Vertica, JavaScript (React/Redux). Amazon: EMR, Step Functions, SQS, Lambda, and AWS cloud-native architectures. DevOps tools: Terraform or CloudFormation …
developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or Power BI. Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds …
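The ETL pipelines these roles describe boil down to three steps: extract raw data, transform it into a clean shape, and load it into a queryable store. A minimal sketch of that pattern in plain Python is below, using only the standard library; in practice each function would become a task in an orchestrator such as Apache Airflow. The CSV columns and table name are hypothetical.

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV text into a list of row dicts.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: normalise names and cast amounts to floats.
    return [(r["name"].strip().title(), float(r["amount"])) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    # Load: write the cleaned rows into a relational table.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

raw = "name,amount\n alice ,10.5\n BOB ,2\n"  # messy input, hypothetical
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In an Airflow deployment, each of these functions would typically be wrapped in its own task (for example via the TaskFlow API) so that retries, scheduling, and monitoring come for free.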
field (STEM). Technical proficiency in cloud-based data solutions (AWS, Azure, or GCP), engineering languages including Python, SQL, and Java, and pipeline management tools, e.g., Apache Airflow. Familiarity with big data technologies such as Hadoop or Spark. If this opportunity is of interest, or you know anyone who would be interested in …
Modelling. Experience with one or more of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark and Kafka. Understanding of critical-path approaches and how to iterate to build value, engaging stakeholders actively at all stages. Able to deal with …
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google Cloud …
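"SQL for data querying and transformation" in these listings usually means aggregating and reshaping rows inside the database rather than in application code. The sketch below shows one such transformation, a GROUP BY aggregation, run from Python against an in-memory SQLite database standing in for an enterprise DBMS or warehouse; the `orders` table and its values are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'EU', 100), (2, 'US', 250), (3, 'EU', 50);
""")

# A typical transformation query: aggregate order totals per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY region"
).fetchall()
```

The same query shape carries over almost unchanged to DB2, MS SQL Server, or BigQuery; only the connection library and minor dialect details differ.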
management and data governance open-source platform that we will teach you. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, big data, HTML5, AngularJS/ReactJS, IntelliJ, GitLab, Jira. Cloud Technologies: You'll be involved in building the next generation of finance systems …
Terraform/Docker/Kubernetes. Write software using Java, Scala, or Python. The following are nice to have, but not required: Apache Spark jobs and pipelines. Experience with any functional programming language. Database design concepts. Writing and analysing SQL queries. About VIOOH: Our recruitment team will …
spaCy, or Neo4j, and a good understanding of modeling platforms such as SageMaker, Databricks, and Dataiku. Experience with data management technologies such as Databricks, Apache Spark, Hadoop, and Kafka. Experience developing and deploying machine learning solutions on cloud platforms (e.g., AWS, Azure, or GCP); AWS preferred. Experience containerizing analytical …