Vite. GenAI Frameworks: Familiarity with tools like LangChain, LangGraph, CrewAI, or Vercel AI (in Python or JavaScript). LLM Experience: Practical experience with Large Language Models (LLMs) such as OpenAI, Anthropic, Mistral, or Llama, and tools like Ollama. Cloud Platforms: Proficiency in AWS or Azure for hosting and deployment. Containerization and Orchestration: Hands-on experience with Docker, Kubernetes, and Docker …
understanding of the Azure ecosystem. The ideal candidate will bring hands-on expertise in designing and building AI-driven solutions using Azure-native tools and frameworks such as Azure OpenAI, Prompt Flow, Semantic Kernel, Azure AI SDKs (including azure-ai-projects, azure-ai-inference), and open-source frameworks like LangChain. Familiarity with traditional Azure AI services (e.g. Document Intelligence, Vision) … and deploy AI solutions using Azure Machine Learning services. Have knowledge of top ML libraries that scale across multiple CPUs and GPUs, such as TensorFlow, PyTorch, Keras, OpenAI, and XGBoost, and have used these in a commercial setting. Troubleshoot and resolve issues related to data processing and AI model performance. Collaborate with cross-functional teams to understand data requirements.
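For illustration only, here is a minimal sketch of calling an Azure OpenAI chat deployment through the openai Python SDK, the kind of building block the listing's Azure-native stack implies. The endpoint, API version, and deployment name are placeholder assumptions, not values from the listing.

```python
import os
from openai import AzureOpenAI  # openai>=1.x ships an Azure-specific client

# Placeholder configuration: endpoint, key, API version, and deployment name are assumptions.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # use whichever API version your resource supports
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the Azure *deployment* name, not necessarily the raw model name
    messages=[
        {"role": "system", "content": "You are a document-processing assistant."},
        {"role": "user", "content": "Summarise the key obligations in this contract clause: ..."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```

The same call could be wrapped in LangChain or Semantic Kernel; this sketch only shows the underlying SDK interaction.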
The core of our platform is a RAG-based architecture, purpose-built for finance. It’s designed to replicate the kind of deep reasoning you see in leading models like OpenAI’s o1: capable of parsing nuanced questions, retrieving domain-relevant knowledge, and producing context-aware answers with minimal hallucination. You’ll work with state-of-the-art AI stacks, combining …
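As a rough sketch of the RAG flow described above (not the platform's actual implementation): embed the question, retrieve the most similar domain documents, and ground the model's answer in that retrieved context. The embed_texts and generate_answer helpers are hypothetical placeholders for whatever embedding model and LLM the stack uses.

```python
import numpy as np

def embed_texts(texts: list[str]) -> np.ndarray:
    """Hypothetical placeholder: call an embedding model, return one vector per text."""
    raise NotImplementedError

def generate_answer(prompt: str) -> str:
    """Hypothetical placeholder: call the LLM with the grounded prompt."""
    raise NotImplementedError

def retrieve(question: str, docs: list[str], doc_vecs: np.ndarray, k: int = 3) -> list[str]:
    """Return the k documents whose embeddings are most similar to the question."""
    q = embed_texts([question])[0]
    # Cosine similarity between the question vector and every document vector.
    sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q) + 1e-9)
    top = np.argsort(sims)[::-1][:k]
    return [docs[i] for i in top]

def answer(question: str, docs: list[str], doc_vecs: np.ndarray) -> str:
    """Ground the answer in retrieved context to keep hallucination low."""
    context = "\n\n".join(retrieve(question, docs, doc_vecs))
    prompt = (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return generate_answer(prompt)

# Usage sketch: docs would be chunked filings, research notes, etc.,
# and doc_vecs = embed_texts(docs) would be computed once at indexing time.
```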