
Function Calling with LLM using LangChain + Ollama

6.7K views
Nov 7, 2024
17:46

Function calling in Large Language Models is essential for giving them access to real-time data, letting them take actions on your behalf, and performing computation. Learn how to use function calling in your LLM application with LangChain, Ollama, and Streamlit!

---

🔨 Tools:
- LangChain: https://python.langchain.com/docs/introduction/
- Ollama: https://ollama.dev/
- Streamlit: https://docs.streamlit.io/

---

Code Example: https://github.com/yankeexe/llm-function-calling-demo
Ollama Blog (Tool/Function Calling): https://ollama.com/blog/tool-support

---

⚡️ Follow me:
- GitHub: https://github.com/yankeexe
- LinkedIn: https://www.linkedin.com/in/yankeemaharjan
- Twitter (X): https://x.com/yankexe
- Website: https://yankee.dev

---

🎞️ Chapters
0:00 Intro
1:33 Project setup
2:05 Coding
14:55 Internal: LLM Output with Tool
15:43 Internal: Function Context - JSON Schema
17:30 Outro
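The core idea covered in the video (the model emits a JSON tool call, your code looks up and runs the matching function) can be sketched without any dependencies. In the actual demo this wiring is done through LangChain's tool support and an Ollama-served model; the snippet below is only a stdlib illustration of the dispatch step, and `get_weather` is a hypothetical example tool, not one from the video's repo.

```python
import json

# Hypothetical example tool, standing in for a real API call.
def get_weather(city: str) -> str:
    """Return a canned weather report for a city (stub)."""
    return f"Sunny, 22°C in {city}"

# Registry mapping tool names to functions; LangChain builds an
# equivalent mapping from the tools you register with the model.
TOOLS = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    """Invoke the tool the model requested with its arguments."""
    func = TOOLS[tool_call["name"]]
    return func(**tool_call["arguments"])

# A tool call in the {"name": ..., "arguments": {...}} shape that
# tool-calling models emit (see the Ollama blog post linked above).
raw = '{"name": "get_weather", "arguments": {"city": "Kathmandu"}}'
print(dispatch(json.loads(raw)))  # Sunny, 22°C in Kathmandu
```

In the full application, the result of `dispatch` is sent back to the model as a tool message so it can compose the final answer for the user.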

