Knowledge of NoSQL databases (MongoDB, Cassandra, DynamoDB), message queueing systems (Kafka, RabbitMQ), and version control systems (Git). Preferred Skills: Experience with natural language processing libraries such as NLTK, spaCy, or Hugging Face Transformers. Familiarity with time-series databases and analysis tools. Knowledge of AI model serving frameworks like TensorFlow Serving or ONNX Runtime. Experience with AI ethics and bias …
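By way of illustration of the NoSQL experience listed above, a minimal sketch of writing and reading documents with MongoDB via pymongo; the connection string, database, and collection names are hypothetical and assume a locally running MongoDB instance:

```python
from pymongo import MongoClient

# Hypothetical local instance and names, purely for illustration.
client = MongoClient("mongodb://localhost:27017")
db = client["jobs_demo"]

# Insert a document, then query it back by an element of its skills array.
db.candidates.insert_one({"name": "Ada", "skills": ["Python", "NLP"]})
match = db.candidates.find_one({"skills": "NLP"})
print(match["name"])  # -> Ada
```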
applications, especially in NLP or classification problems. Hands-on experience with ML tools like TensorFlow, PyTorch, etc. Experience with data science libraries such as NLTK, scikit-learn, SciPy, spaCy/scispaCy, etc. Excellent problem-solving and programming skills in Python. Excellent communication skills. Preferred Qualifications & Skills: If you have the following characteristics, it would be a plus: Master's or PhD …
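For the NLP and classification work these roles describe, a minimal scikit-learn text-classification sketch; the toy sentences and labels are invented for illustration only:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled corpus (1 = positive, 0 = negative) standing in for real data.
texts = [
    "great product, works well",
    "terrible, broke after a day",
    "excellent support team",
    "awful experience, never again",
]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)
print(model.predict(["the support was excellent"]))  # -> [1]
```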
Machine learning experience: Familiarity with machine learning frameworks and libraries such as TensorFlow, PyTorch, or scikit-learn. Natural Language Processing (NLP): Experience with NLP techniques and tools, such as spaCy or NLTK. Distributed systems: Knowledge of distributed systems and experience with tools like Kubernetes or Docker. Cloud services: Experience with cloud platforms like AWS, GCP, or Azure. Open source contributions …
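As a small example of the spaCy tooling named above, a named-entity recognition sketch; it assumes the en_core_web_sm model has already been downloaded (python -m spacy download en_core_web_sm):

```python
import spacy

# Load the small English pipeline (assumed to be installed separately).
nlp = spacy.load("en_core_web_sm")

doc = nlp("Acme Corp hired three NLP engineers in London last March.")
for ent in doc.ents:
    # Print each detected entity with its predicted label (ORG, GPE, DATE, ...).
    print(ent.text, ent.label_)
```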
relationships with cross-functional teams. Ability to clearly communicate and present to internal and external stakeholders. Nice to have, but not essential: NLP/deep learning experience (e.g. Hugging Face, spaCy); deep learning framework experience (preferably PyTorch); MLOps experience (e.g. data and model versioning, model deployment CI/CD, MLflow/DVC); cloud platform experience, especially from an ML standpoint (AWS …
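For the MLOps side mentioned above, a minimal experiment-tracking sketch with MLflow; the run name, parameters, and metric value are hypothetical:

```python
import mlflow

# By default this logs to a local ./mlruns directory.
with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model", "logreg")        # hypothetical parameter
    mlflow.log_param("max_features", 5000)     # hypothetical parameter
    mlflow.log_metric("f1", 0.87)              # hypothetical score
```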
part of enterprise technology stacks. DataOps/MLOps experience would be valued. Strong Python programming for modelling and/or data analysis is essential, preferably with experience using the spaCy NLP framework and BERT. Knowledge of database design as well as strong experience with SQL queries is desirable. Familiarity with R and other common languages and tools would be beneficial.
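As an illustration of the BERT experience mentioned above, a short sketch that turns a sentence into a fixed-size vector with the Hugging Face transformers library; the model choice and the mean-pooling step are assumptions made for the example:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Strong Python and SQL skills required.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings into a single sentence vector.
sentence_vec = outputs.last_hidden_state.mean(dim=1)
print(sentence_vec.shape)  # torch.Size([1, 768])
```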
fairness, and compliance. Help shape the data pipeline and MLOps practices for handling sensitive legal content securely. Required Experience: Solid experience with Python and ML/NLP libraries (e.g., spaCy, Hugging Face, TensorFlow/PyTorch). Experience building NLP or document analysis solutions, ideally for legal or compliance use cases. Strong understanding of legal language, structures, and workflows (e.g., contracts …
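As a rough sketch of document analysis over legal text, a zero-shot clause-classification example using the Hugging Face pipeline API; the clause and label set are invented, and a production system handling sensitive legal content would likely use a fine-tuned, locally hosted model instead:

```python
from transformers import pipeline

# Zero-shot classification as a stand-in for a fine-tuned legal model.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

clause = "The supplier shall indemnify the customer against all third-party claims."
labels = ["indemnification", "termination", "confidentiality"]
result = classifier(clause, candidate_labels=labels)
print(result["labels"][0])  # highest-scoring label
```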
custom domains via fine-tuning or RLHF. Experience in information retrieval systems for document question answering. Experience in day-to-day NLP for industry using Python and related toolchains (spaCy, Hugging Face, NLTK, etc.). Published research in areas of machine learning at major conferences and/or journals. If some of the above doesn't line up perfectly with your …
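For the information-retrieval experience mentioned above, a minimal passage-retrieval sketch for document question answering, using TF-IDF and cosine similarity from scikit-learn; the mini-corpus and query are made up for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-corpus; a real system would index far more passages.
passages = [
    "Employees accrue 25 days of annual leave per year.",
    "The notice period for senior staff is three months.",
    "Remote work is permitted up to three days per week.",
]
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(passages)

# Score the query against every passage and return the best match.
query = "How much annual leave do employees get?"
scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
print(passages[scores.argmax()])
```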
such as ACL/NIPS/EMNLP/NAACL. Strong programming skills, e.g. Python, C++. Experience with deep learning/NLP/machine learning frameworks such as PyTorch, TensorFlow, spaCy, CoreNLP.
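As a small example of the PyTorch experience listed above, a minimal training loop for a toy classifier over random data; the architecture and hyperparameters are arbitrary:

```python
import torch
import torch.nn as nn

# Random features and binary labels standing in for a real dataset.
X = torch.randn(64, 10)
y = torch.randint(0, 2, (64,))

model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```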
system integrations across the data stack. Design and support secure, scalable systems using network protocols (TCP/IP, OSI). Enable machine learning and AI workflows through tools like Jupyter, spaCy, Transformers, and NLTK. Implement and support BI tools (Tableau, Power BI, Kibana) to drive actionable insights from complex data sets. If you are interested in this Big Data Engineer role …
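As an illustration of the NLTK tooling mentioned above, a short sentiment-scoring sketch using the bundled VADER analyzer; the example sentence is invented:

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

# One-time lexicon download; cached locally afterwards.
nltk.download("vader_lexicon", quiet=True)

sia = SentimentIntensityAnalyzer()
print(sia.polarity_scores("The dashboard makes the data easy to explore."))
```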