code review). * Big Data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc, or GCP Cloud Data Fusion. * NoSQL databases: DynamoDB, Neo4j, Elasticsearch, Google Cloud Datastore. * Snowflake data warehouse/platform. * Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub, and Spark Streaming.
Staines-upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
Qualifications A minimum of 5 years in data engineering, with expertise in scalable architectures such as data lakes, graph, and vector databases (e.g., ADLS, Neo4j, Elasticsearch). Proficiency in developing data pipelines across diverse environments, leveraging Azure and other modern technologies. Proven ability to orchestrate complex data workflows.
Athena, Redshift, Glue, EMR); Java, Scala, Python, Spark, SQL. Experience of developing enterprise-grade ETL/ELT data pipelines. NoSQL databases: DynamoDB, Neo4j, Elasticsearch, Google Cloud Datastore. Snowflake data warehouse/platform. Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build.
Java; experience with machine learning libraries and frameworks. Experience with common data science tools such as Python, R, PyTorch, TensorFlow, Keras, NLTK, spaCy, or Neo4j, and a good understanding of modeling platforms such as SageMaker, Databricks, and Dataiku. Experience with data management technologies such as Databricks, Apache Spark, and Hadoop.
you have experience in the Hadoop ecosystem (Spark, Kafka, HDFS, Hive, HBase, …), Docker and orchestration platforms (Kubernetes, OpenShift, AKS, GKE, …), and NoSQL databases (MongoDB, Cassandra, Neo4j). • Any experience with cloud platforms such as AWS, Azure, and Google Cloud is a real asset. • You have experience in working closely with the business.
NLP algorithms, including transformers, graphical models, and information retrieval techniques. Python and scientific computing packages (PyTorch, NumPy, scikit-learn, TensorFlow). Database technologies including Elasticsearch, Neo4j, and SQL. Excellent interpersonal, verbal, and written communication skills. A flexible attitude with respect to work assignments and new learning. Ability to manage multiple
3.8+, as well as experience with .NET 6, HTML, JavaScript, and unit testing. Experience in any of the following is desirable but not essential: Neo4j and graph database concepts; SQL Server relational databases; Kafka; Kettle; Azure DevOps source control; Git source control; Angular 10 or greater; using Docker containers.
and managing data storage. Requirements Strong understanding of AWS, GCP, Azure, or Terraform. Good understanding of database fundamentals. Some familiarity with PostgreSQL, Neo4j, RabbitMQ, Celery, and Redis. Prior use of Python, C/C++, and Windows build environments is beneficial but not required. You can communicate effectively.