Sunday, March 31, 2024
LlamaIndex Upgrade to 0.10.x Experience
I explain key points you should keep in mind when upgrading to LlamaIndex 0.10.x.
Labels:
LlamaIndex,
LLM,
RAG
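The headline change in 0.10.x was the move of core imports from the `llama_index` namespace to `llama_index.core`, with integrations split into separate pip packages. As a rough illustration of that rename (a minimal sketch - the helper function is my own, not part of LlamaIndex):

```python
import re

# LlamaIndex 0.10.x moved most core imports from `llama_index`
# to `llama_index.core`; integrations (LLMs, embeddings, readers)
# now live in their own packages, e.g. `pip install llama-index-llms-ollama`.
def migrate_imports(source: str) -> str:
    # Rewrite "from llama_index import X" -> "from llama_index.core import X".
    # Integration imports need the matching package installed and are
    # not handled by this simple sketch.
    return re.sub(
        r"\bfrom llama_index import\b",
        "from llama_index.core import",
        source,
    )

old = "from llama_index import VectorStoreIndex, SimpleDirectoryReader"
print(migrate_imports(old))
# from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
```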
Monday, March 25, 2024
LLM Structured Output for Function Calling with Ollama
I explain how function calling works with LLMs. This concept is often confused: the LLM doesn't call a function itself - it returns a JSON response with the values to be used for a function call in your environment. In this example I'm using a Sparrow agent to call a function.
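The flow can be sketched in a few lines: the LLM produces structured JSON naming a function and its arguments, and your own code dispatches the call. The function name and JSON shape below are illustrative, and the LLM response is simulated so the sketch runs offline:

```python
import json

# A hypothetical local function the LLM can "call". The LLM never
# executes this itself - it only selects the name and arguments.
def get_invoice_total(invoice_id: str) -> float:
    # Stub for illustration; a real implementation would look up the invoice.
    return 123.45

TOOLS = {"get_invoice_total": get_invoice_total}

# Simulated LLM output: JSON with the chosen function and its arguments.
llm_response = '{"function": "get_invoice_total", "arguments": {"invoice_id": "INV-001"}}'

def dispatch(response_text: str):
    # The actual function call happens here, in your environment.
    call = json.loads(response_text)
    fn = TOOLS[call["function"]]
    return fn(**call["arguments"])

print(dispatch(llm_response))  # 123.45
```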
Sunday, March 17, 2024
FastAPI File Upload and Temporary Directory for Stateless API
I explain how to handle file uploads with FastAPI and how to process the uploaded file using a Python temporary directory. Files placed into the temporary directory are automatically removed once the request completes, which is very convenient for a stateless API.
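The core of the pattern is Python's `tempfile.TemporaryDirectory` context manager. A framework-free sketch (in a FastAPI endpoint the bytes would come from an `UploadFile`; here they are passed in directly):

```python
import os
import tempfile

def process_upload(filename: str, contents: bytes) -> str:
    # TemporaryDirectory removes the directory and everything in it
    # when the `with` block exits - i.e. when the request completes.
    with tempfile.TemporaryDirectory() as tmp_dir:
        path = os.path.join(tmp_dir, filename)
        with open(path, "wb") as f:
            f.write(contents)
        # ... process the file here (OCR, parsing, etc.) ...
        return path  # the path is only valid inside the `with` block

saved_path = process_upload("receipt.png", b"fake image bytes")
# Once the context manager exits, the file is gone:
print(os.path.exists(saved_path))  # False
```

No cleanup code is needed, which keeps the API stateless by construction.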
Sunday, March 10, 2024
Optimizing Receipt Processing with LlamaIndex and PaddleOCR
The LlamaIndex Text Completion function allows you to execute an LLM request combining custom data and the question, without using a Vector DB. This is very useful when processing output from OCR, as it simplifies the RAG pipeline. In this video I explain how OCR can be combined with an LLM to process image documents in Sparrow.
Labels:
LlamaIndex,
LLM,
RAG
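The idea above - injecting OCR output straight into the prompt instead of retrieving from a Vector DB - can be sketched as simple prompt assembly. The OCR text below is simulated, and the actual LLM call (e.g. an LLM's `complete` method in LlamaIndex) is left as a comment so the sketch runs offline:

```python
# Simulated OCR output - in Sparrow this would come from PaddleOCR.
ocr_text = """INVOICE #1001
Vendor: ACME Corp
Total: 57.30 EUR"""

question = "What is the invoice total?"

# Text completion skips retrieval: the full OCR output is combined
# directly with the question in a single prompt.
prompt = (
    "Use the following document text to answer the question.\n\n"
    f"Document:\n{ocr_text}\n\n"
    f"Question: {question}\nAnswer:"
)

# The assembled prompt would then be sent to the LLM,
# e.g. `response = llm.complete(prompt)` in LlamaIndex.
print(prompt)
```

Because receipts and invoices are short, the whole document fits in the context window, making the Vector DB step unnecessary.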
Sunday, March 3, 2024
LlamaIndex Multimodal with Ollama [Local LLM]
I describe how to run LlamaIndex Multimodal with a local LLaVA LLM through Ollama. The advantage of this approach is that you can process image documents with the LLM directly, without running them through OCR, which should lead to better results. This functionality is integrated into Sparrow as a separate LLM agent.
Labels:
LlamaIndex,
LLM,
RAG
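Under the hood, Ollama accepts images as base64 strings in its `/api/generate` request body. A sketch of building such a request for LLaVA (the payload is constructed but not sent, so this runs offline; the image bytes are fake):

```python
import base64

# Build a request body for Ollama's /api/generate endpoint with the
# LLaVA multimodal model. The image is passed as base64 - no OCR step.
def build_llava_payload(prompt: str, image_bytes: bytes) -> dict:
    return {
        "model": "llava",
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("utf-8")],
        "stream": False,
    }

payload = build_llava_payload(
    "Extract the total amount from this receipt.",
    b"fake image bytes",  # would be the raw bytes of the document image
)
# This dict would be POSTed to http://localhost:11434/api/generate.
print(payload["model"])  # llava
```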