Blog about Oracle, Full Stack, Machine Learning and Cloud
Monday, March 25, 2024
LLM Structured Output for Function Calling with Ollama
I explain how function calling works with an LLM. This is an often confused concept: the LLM doesn't call a function itself - it returns a JSON response with the function name and argument values, and your own environment executes the actual call. In this example I'm using the Sparrow agent to call a function.
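The mechanism can be sketched in a few lines. This is a minimal illustration of the dispatch pattern, not Sparrow's or Ollama's actual API: the `llm_response` string stands in for what the model would return, and `get_invoice_total` is a hypothetical local function.

```python
import json

# Hypothetical local function the LLM can "call" by name.
# In a real setup this might be a Sparrow query over a document.
def get_invoice_total(invoice_id: str) -> float:
    invoices = {"INV-001": 1250.50}  # stand-in data
    return invoices[invoice_id]

# Registry mapping function names the model knows about to real callables
FUNCTIONS = {"get_invoice_total": get_invoice_total}

# The LLM never executes anything. It only returns JSON like this,
# naming the function and the argument values your code should use.
llm_response = '{"function": "get_invoice_total", "arguments": {"invoice_id": "INV-001"}}'

def dispatch(response_text: str):
    # Parse the model's JSON and run the matching local function
    call = json.loads(response_text)
    fn = FUNCTIONS[call["function"]]
    return fn(**call["arguments"])

print(dispatch(llm_response))  # 1250.5
```

The key point is that the function call happens entirely in your environment; the model only supplies structured output describing which call to make.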