London (City of London), South East England, United Kingdom
Luxoft
AWS, Azure, GCP). Strong communication and stakeholder engagement skills - translating complex technical concepts into business outcomes. 💡 Bonus Points For: Experience with semantic knowledge graphs, RDF/SPARQL, and ontology/taxonomy modeling. Familiarity with metadata-driven AI/ML enrichment. Knowledge of financial mathematics or capital markets. A “can-do” mindset and hunger for continuous learning. 🌍 Why …
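The knowledge-graph and RDF/SPARQL skills listed above amount to representing data as triples and querying them declaratively. A minimal Python sketch using rdflib; the namespace, class, and instrument data below are invented purely for illustration:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical namespace for a small financial-instruments taxonomy
EX = Namespace("http://example.org/finance#")

g = Graph()
g.bind("ex", EX)

# A tiny taxonomy: a Bond class and one instrument typed against it
g.add((EX.Bond, RDF.type, RDFS.Class))
g.add((EX.bond_123, RDF.type, EX.Bond))
g.add((EX.bond_123, RDFS.label, Literal("10Y Treasury Bond")))

# SPARQL: list every instrument typed as a Bond, together with its label
query = """
    PREFIX ex: <http://example.org/finance#>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?instrument ?label WHERE {
        ?instrument a ex:Bond ;
                    rdfs:label ?label .
    }
"""
for instrument, label in g.query(query):
    print(instrument, label)
```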
Degree in Life Sciences, Data Science, Computer Science, or a related field. 2+ years' experience in data curation, data management or data governance. Working knowledge of data standards, models, ontologies and taxonomies. Experience with tools such as Semaphore, Collibra, Protégé, or similar. Strong communication and stakeholder-management skills. …
/Principal levels) Extensive knowledge of Drug Discovery, Development/Manufacturing or similar Life Science domain. Fluency in Python, Data Modelling, Engineering, Analysis and Visualisation (tabular & JSON, SQL, NoSQL, Ontologies, Streamlit, Plotly, Holoviews). A track record of architecting productionised scientific solutions, integrated with AI/ML and APIs for Biopharma end users. Strong communication skills to engage across leadership, scientific …
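For the Streamlit/Plotly visualisation stack named above, a minimal sketch of an interactive dashboard; the DataFrame columns and assay values are made up for illustration, and the script would be saved as app.py and launched with streamlit run app.py:

```python
import pandas as pd
import plotly.express as px
import streamlit as st

# Hypothetical assay results; compound IDs and IC50 values are invented.
df = pd.DataFrame(
    {
        "compound": ["CMP-001", "CMP-002", "CMP-003", "CMP-004"],
        "ic50_nm": [12.5, 48.0, 3.2, 150.0],
        "assay": ["kinase", "kinase", "cell", "cell"],
    }
)

st.title("Assay results explorer")
assay = st.selectbox("Assay type", sorted(df["assay"].unique()))
subset = df[df["assay"] == assay]

# One bar per compound for the selected assay type
fig = px.bar(subset, x="compound", y="ic50_nm", title=f"IC50 (nM) for {assay} assay")
st.plotly_chart(fig)
```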
BSc, MSc, PhD) Experience with ML/NLP frameworks (e.g., PyTorch, TensorFlow, HuggingFace, Scikit-learn). Strong Python skills and familiarity with additional languages (e.g., Java, C++). Understanding of biomedical ontologies, knowledge graphs, or causal inference is a plus. Familiarity with cloud platforms (AWS, Azure, GCP) and Linux environments. Bonus Experience: Prior work in biomedical NLP, literature mining, or clinical informatics …
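As a rough illustration of the ML/NLP framework experience listed above, a minimal Hugging Face pipeline for named-entity recognition; the checkpoint below is a general-purpose placeholder, and a biomedical token-classification model would normally be substituted:

```python
from transformers import pipeline

# Generic NER pipeline; the model name is an assumption for illustration only.
ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",
    aggregation_strategy="simple",
)

text = "Aspirin reduced inflammation in patients with rheumatoid arthritis."
for entity in ner(text):
    # Each result carries the entity type, surface form, and confidence score
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```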
London (City of London) / Slough, South East England, United Kingdom Hybrid / WFH Options
Higher - AI recruitment
for automated enrichment and agentic features. Develop entity-matching algorithms (potentially using ML) to link disparate data points and resolve entities. Work with domain experts to formalise a comprehensive ontology of the chemical and energy supply chain. Build agent-based systems that perform complex automated tasks, updating the digital twin based on real-time data. Establish the foundations for MLOps …
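Entity matching of the kind described in this listing can start from simple pairwise string similarity before any ML is involved. A standard-library-only sketch with invented supplier records:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Toy supplier records with inconsistent naming; purely illustrative data.
records = [
    {"id": 1, "name": "Acme Chemicals Ltd"},
    {"id": 2, "name": "ACME Chemicals Limited"},
    {"id": 3, "name": "Borealis Energy GmbH"},
]

def similarity(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Pairs scoring above the threshold become candidate matches for the same entity.
THRESHOLD = 0.8
for left, right in combinations(records, 2):
    score = similarity(left["name"], right["name"])
    if score >= THRESHOLD:
        print(f"candidate match: {left['id']} <-> {right['id']} (score={score:.2f})")
```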
London (City of London) / Slough, South East England, United Kingdom Hybrid / WFH Options
Immersum
data integrity. Work with Python to build scalable data solutions. Introduce and adopt new technologies such as Kafka, Docker, Airflow, and AWS. Define and enforce data hygiene practices (ontology, storage, artifacts, version control). Reduce engineering load per person through automation and efficient design. Collaborate closely with a small, ambitious team to deliver end-to-end data solutions. Support …
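The Airflow adoption and data-hygiene enforcement mentioned above typically take the shape of small, explicit DAGs. A minimal Airflow 2.x-style sketch with placeholder tasks; the DAG id and task bodies are invented for illustration:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables; a real pipeline would pull from Kafka / S3 and
# validate records against the agreed ontology before writing to storage.
def extract():
    print("extracting raw events")

def validate():
    print("checking schema, ontology terms, and version metadata")

with DAG(
    dag_id="data_hygiene_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)

    # Validation only runs once extraction has succeeded
    extract_task >> validate_task
```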
Deployed Engineer, Palantir FDE. Strong programming skills in Python, SQL, and optionally Java/Scala. Hands-on experience with Palantir Foundry tools, including: Code Repositories, Pipeline Builder, Code Workbook, Ontology Management, Contour, Solution Designer, Data Lineage, Data Health, Data Connections, Egress Policies. Experience in end-to-end solution development, from planning to scaling. Strong understanding of data architecture, including data …
non-technical stakeholders, facilitating discussions with multidisciplinary teams, including military users. Data Integration – build clean, reliable, and compliant pipelines in Foundry using Pipeline Builder and Python. Design and manage ontologies to structure data effectively, leveraging Link Types and Actions for optimal use in Foundry and Workshop. Application Development – create full-stack applications with Foundry tools to deliver intuitive interfaces and …
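For context on the Foundry pipeline work described above, a rough sketch of a Python transform in the decorator style used in Foundry Code Repositories; the dataset paths and column names are assumptions for illustration, not real datasets:

```python
from pyspark.sql import functions as F
from transforms.api import Input, Output, transform_df

@transform_df(
    Output("/Example/clean/assets"),   # hypothetical output dataset path
    raw=Input("/Example/raw/assets"),  # hypothetical input dataset path
)
def clean_assets(raw):
    """Drop rows without an identifier and normalise a status column."""
    return (
        raw.filter(F.col("asset_id").isNotNull())
           .withColumn("status", F.lower(F.col("status")))
    )
```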