e.g., Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g., TensorFlow, MXNet, scikit-learn) Software engineering practices (coding standards, unit testing, version control, code review) Hadoop distributions (Cloudera, Hortonworks), NoSQL databases (Neo4j, Elastic), streaming technologies (Spark Streaming) Data manipulation and wrangling techniques Development and deployment technologies (virtualisation, CI tools like Jenkins, configuration management with Ansible, containerisation with Docker, Kubernetes) Data visualization …
to AI/ML workflows. • Database Management (SQL & NoSQL): Practical understanding of SQL and NoSQL database technologies, including relational (e.g., PostgreSQL, MySQL, Oracle) and NoSQL databases (e.g., MongoDB, Cassandra, Neo4j). Experience with projects involving data modeling, query optimization, and database administration. Familiarity with graph databases (e.g., Neo4j) and search/analytics platforms like the ELK stack is …
dynamics, following business strategy frameworks (SWOT, Porter's Five Forces, BCG Matrix, etc.) Create relationship taxonomies that capture complex strategic dependencies into formal knowledge structures Implement ontology schemas in Neo4j or similar graph database systems. Create graph algorithms and queries to identify strategic patterns and insights from data Build data pipelines to extract, transform, and load strategic data from … Computer Science, Data Science, Information Science, or related field 4-5 years of hands-on experience in Machine Learning, Data Science or Software Development Experience with graph database technologies (Neo4j preferred) Strong programming skills in Python Demonstrated interest in knowledge representation, ontologies, or semantic technologies Familiarity with large language models and prompt engineering Ability to translate conceptual frameworks into … or ontology projects Background in semantic web technologies or linked data principles Technical Skills Programming Languages : Proficiency with Python is mandatory. Knowledge of JavaScript is also beneficial. Graph Technologies : Neo4j, Cypher, GraphQL Data Engineering : ETL pipelines, data integration patterns Machine Learning : NLP, embedding models, text classification LLM Integration : Prompt engineering, context management Visualization : Graph visualization tools and techniques Containerization …
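Responsibilities like these (relationship taxonomies, ontology schemas, graph queries for strategic patterns) can be prototyped without a graph database at all. The sketch below is a minimal, stdlib-only Python illustration of a triple-based knowledge structure with a reachability query; every entity name and relationship type in it is a made-up assumption, not drawn from any listing:

```python
from collections import deque

# Tiny in-memory knowledge graph of strategic relationships,
# stored as (subject, RELATIONSHIP, object) triples.
# All names here are illustrative assumptions.
TRIPLES = [
    ("AcmeCorp", "SUPPLIES", "GlobexInc"),
    ("GlobexInc", "PARTNERS_WITH", "Initech"),
    ("Initech", "COMPETES_WITH", "AcmeCorp"),
    ("GlobexInc", "SUPPLIES", "UmbrellaLtd"),
]

def neighbours(entity, rel_types=None):
    """Entities directly linked from `entity`, optionally filtered by relationship type."""
    return [o for s, r, o in TRIPLES
            if s == entity and (rel_types is None or r in rel_types)]

def downstream(entity, rel_types=None):
    """All entities reachable from `entity` via the given relationship types (BFS)."""
    seen, queue = set(), deque([entity])
    while queue:
        current = queue.popleft()
        for nxt in neighbours(current, rel_types):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Which entities does AcmeCorp depend on, directly or transitively,
# through supply and partnership links?
print(sorted(downstream("AcmeCorp", {"SUPPLIES", "PARTNERS_WITH"})))
# → ['GlobexInc', 'Initech', 'UmbrellaLtd']
```

In a production system the triples would live in Neo4j and the traversal would be a Cypher variable-length path query; the in-memory version is only a way to pin down the relationship taxonomy before committing to a schema.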
Databricks and Snowflake for data engineering and analytics Experience working with Big Data technologies (e.g., Hadoop, Apache Spark) Familiarity with NoSQL databases (e.g., columnar or graph databases like Cassandra, Neo4j) Research experience with peer-reviewed publications Certifications in cloud-based machine learning services (AWS, Azure, GCP) What's in it for you? Care: your mental and physical health is …
About Neo4j: Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along … with a vibrant community of 250,000+ developers, data scientists, and architects across the globe. At Neo4j, we're proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, broke the Panama Papers for the ICIJ, and are helping Transport for London to cut congestion by … and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart. Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j's cloud offering …
Factory). Experience with DevOps principles (CI/CD, Git, automated testing). Familiarity with data governance tools (e.g., Data Lineage, Data Quality). Experience with tools like Redis, Neo4j, or Apache Arrow. Power BI experience for data visualisation is a plus. What You Bring: Strong problem-solving and analytical skills. Excellent communication and teamwork abilities. Ability to …
McLean, Virginia, United States Hybrid / WFH Options
MITRE
Google Cloud, or Microsoft Azure. • Experience applying various machine learning approaches (e.g., random forest, neural networks, support vector machines). • Experience working with databases (e.g., PostgreSQL, Oracle, MySQL, MongoDB, Neo4J). • Experience using version control (e.g., Git, Mercurial, SVN) to support collaborative development. • Experience utilizing notebooks (e.g., Jupyter, R Markdown, Zeppelin). • Experience developing interactive data visualizations using open …
desirable): Knowledge of streaming services – Flink, Kafka Knowledge of Dimensional Modelling Knowledge of NoSQL DBs (DynamoDB, Cassandra) Knowledge of node-based architecture, graph databases and languages – Neptune, Neo4j, Gremlin, Cypher Experience 5+ years of experience with Databricks, Spark, Scala, PySpark, Python 5+ years of experience in SQL and database technologies like Snowflake or equivalent. 3+ years of …
plus, especially within multi-disciplinary teams Deep knowledge of database technologies: Distributed systems (e.g., Spark, Hadoop, EMR) RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL) NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j) Solid understanding of software engineering best practices - code reviews, testing frameworks, CI/CD, and code maintainability Experience deploying applications into production environments, including packaging, monitoring, and release management …
React, or Vue.js. Familiarity with CI/CD principles and technologies, including experience with GitHub Actions or similar. Experience working with relational and NoSQL databases such as Postgres, Redis, Neo4j, Milvus, or MongoDB, and a good understanding of data consistency trade-offs. Proven knowledge of cloud platforms (e.g., AWS, Azure, or GCP). A Bonus: Experience with graph databases … such as Neo4j, or vector databases such as Pinecone or Milvus. Experience building native desktop apps. Experience with NLP libraries and frameworks, such as spaCy or Transformers. Familiarity with machine learning concepts and the ability to work with NLP datasets. Why Join Us: Join a pioneering joint venture at the intersection of AI and industry transformation. Work with a diverse and collaborative team of …
London, England, United Kingdom Hybrid / WFH Options
European Bioinformatics Institute | EMBL-EBI
of research and development experience Strong background in data integration Experience with fast prototyping languages (e.g., Python) and strong Unix skills. Experience with linked data, RDF, MongoDB, graph databases (e.g., Neo4j), triple store technologies. Relational databases and SQL (Oracle and PostgreSQL experience desirable) R, Python, Java, JavaScript, and common libraries for data science and visualisation (Pandas, ggplot2, etc.) Familiarity with …
Kubernetes deployment) in multiple environments (AWS, Azure, GCP). Operationalization of ML solutions to production. •Experience in Microservices development, API backend development using FastAPI •Relational DB (SQL), Graph DB (Neo4j) and Vector DB (Pinecone, Weaviate, Qdrant) •Guide team to debug issues with pipeline failures •Engage with business stakeholders with status updates on development progress and issue fixes … backend development using FastAPI •MLOps (model/component dockerization, Kubernetes deployment) in multiple environments (AWS, Azure, GCP). Operationalization of AI solutions to production. •Relational DB (SQL), Graph DB (Neo4j) and Vector DB (Pinecone, Weaviate, Qdrant) •Experience designing and implementing ML Systems & pipelines, MLOps practices •Exposure to event-driven orchestration, online model deployment •Hands-on experience working with …
London, England, United Kingdom Hybrid / WFH Options
Derisk360
Architect and develop end-to-end data pipelines on Google Cloud Platform (GCP), integrating structured, semi-structured, and unstructured data sources. Design and implement advanced Graph Database solutions using Neo4j, Cypher queries, and GCP-native integrations. Create ETL/ELT workflows leveraging GCP services including Dataflow, Pub/Sub, BigQuery, and Cloud Storage. Model real-world use cases in … Neo4j such as fraud detection, knowledge graphs, and network analysis. Optimize graph database performance, ensure query scalability, and maintain system efficiency. Manage ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments. Implement metadata management, security, and data governance using Data Catalog and IAM. Collaborate with cross-functional teams and clients across diverse EMEA time … zones and domains. What You Bring 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform. Proficiency in SQL, Python, and Cypher query language. Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage. Expertise in graph theory, graph schema modeling, and data relationship mapping. Bachelor’s degree in …
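The fraud-detection use case mentioned above typically reduces to finding accounts that share an identifier such as a device, address, or card. Below is a minimal, stdlib-only Python sketch of that pattern; all account and device IDs are hypothetical, and the Cypher shown in the docstring is an assumed equivalent under a made-up schema, not any real deployment:

```python
from collections import defaultdict

# Accounts linked to the devices they have logged in from.
# All IDs are made up for illustration.
LOGINS = [
    ("acct1", "deviceA"),
    ("acct2", "deviceA"),
    ("acct3", "deviceB"),
    ("acct4", "deviceA"),
]

def shared_device_rings(logins, min_size=2):
    """Group accounts by shared device; groups of >= min_size are candidate fraud rings.

    Roughly equivalent to this Cypher pattern under an assumed schema:
        MATCH (a:Account)-[:USED]->(d:Device)<-[:USED]-(b:Account)
        RETURN d, collect(DISTINCT a)
    """
    by_device = defaultdict(set)
    for account, device in logins:
        by_device[device].add(account)
    return {d: accts for d, accts in by_device.items() if len(accts) >= min_size}

print(shared_device_rings(LOGINS))
# deviceA links acct1, acct2, and acct4: a candidate fraud ring
```

At scale the grouping would run inside the graph database (or in Dataflow/Spark before loading), but the logic is the same: pivot edges on the shared node and flag unusually dense neighbourhoods.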
to support advanced analytical use cases and ML, AI opportunities. Experience with containerisation technologies ( Docker, Kubernetes ) for scalable data solutions. Experience with vector databases and graph databases (e.g., Pinecone, Neo4j, AWS Neptune ). Understanding of data mesh-fabric approaches and modern data architecture patterns . Familiarity with AI/ML workflows and their data requirements. Experience with API specifications …
of others, actively mentoring individuals, and helping build technical communities across the organization Skills & Experience Extensive data modeling expertise Recent, proficient experience with Snowflake, knowledge graphs including Neo4j, and several RDS and NoSQL databases in AWS, with a solid understanding of design principles, performance tuning and observability Proven track record in architecting and operating high-availability, fault-tolerant …
business information into technical specifications Skills Required (desirable): Knowledge of streaming services - Flink, Kafka Familiarity with Dimensional Modelling concepts Knowledge of node-based architecture, graph databases and languages - Neptune, Neo4j, Gremlin, Cypher Experience 8+ years of experience with Databricks product suite, Spark, Scala, PySpark, Python 8+ years of experience in SQL and database technologies like Snowflake or equivalent. 5+ …
Guildford, England, United Kingdom Hybrid / WFH Options
Allianz
intelligence would be advantageous. Familiarity with Azure cloud platform and distributed computing frameworks (e.g., Apache Spark) is a plus. Experience in building and working with knowledge graphs, such as Neo4j, or GraphDB would be desirable. What We Will Offer You Recognised and rewarded for a job well done, we have a range of flexible benefits for you to choose …
ability to translate business information into technical specifications Knowledge of streaming services – Flink, Kafka Familiarity with Dimensional Modelling concepts Knowledge of node-based architecture, graph databases and languages – Neptune, Neo4j, Gremlin, Cypher Experience 8+ years of experience with Databricks product suite, Spark, Scala, PySpark, Python 8+ years of experience in SQL and database technologies like Snowflake or equivalent. 5+ …
pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization Familiarity with graph databases (e.g., Neo4j, Memgraph) or search platforms (e.g., Elasticsearch, OpenSearch) to support complex data relationships and querying needs Solid understanding of cloud infrastructure, particularly AWS, with practical experience using Docker, Kubernetes, and …
experience as a Solution Architect or Data Architect in large-scale, enterprise environments. Rail industry experience or deep understanding of transportation infrastructure and operational data. Strong knowledge of graph databases (e.g., Neo4j, Amazon Neptune) and graph-based modelling for network analysis and relationship mapping. Understanding of network management systems, topology modelling, and asset interconnectivity in physical or digital rail infrastructure. Solid experience with …