LLM Session Management with Redis
Developers building AI applications need a way to store the conversation history between an LLM and a user. This history provides context, letting the application reuse previously discussed details to improve the quality and accuracy of responses. In LangChain, this is implemented with the chat message history construct.

In this video, Ricardo Ferreira, Developer Advocate at Redis, shows how to implement a chat message history using LangChain, and how to integrate that history with an LLM powered by OpenAI so the conversation can reuse earlier answers.

00:00 What is the use case?
01:25 Setting up Redis
03:50 Manual chat history
07:20 Chat history with LLMs
15:56 Deleting the data

🧑🏻‍💻 GitHub repository:
▪️ LangChain apps with Redis: https://github.com/redis-developer/langchain-apps-with-redis

💡 Creating an OpenAI API key:
▪️ https://platform.openai.com/docs/quickstart
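For the "Setting up Redis" step, one common way to get a local server running is with Docker (the container name and image tag below are assumptions; the video's exact setup may differ):

```shell
# Start a local Redis server on the default port 6379
docker run --name redis-langchain -d -p 6379:6379 redis:latest

# Verify the server responds
docker exec redis-langchain redis-cli ping   # expects: PONG
```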
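The chat-history pattern covered in the video can be sketched in plain Python. This is a stand-in for LangChain's Redis-backed chat message history, not the video's actual code: the class name and the in-process dict are illustrative assumptions, whereas with Redis each session's messages would live in a server-side list.

```python
from collections import defaultdict


class SimpleChatMessageHistory:
    """Minimal sketch of a per-session chat message history.

    A Redis-backed implementation would append each turn to a
    per-session list on the server instead of this in-process dict.
    """

    _store = defaultdict(list)  # session_id -> list of (role, content)

    def __init__(self, session_id):
        self.session_id = session_id

    def add_user_message(self, content):
        self._store[self.session_id].append(("human", content))

    def add_ai_message(self, content):
        self._store[self.session_id].append(("ai", content))

    @property
    def messages(self):
        # Full conversation for this session, in insertion order
        return list(self._store[self.session_id])

    def clear(self):
        # Analogous to deleting the session's key ("Deleting the data")
        self._store.pop(self.session_id, None)


history = SimpleChatMessageHistory("user-42")
history.add_user_message("What is Redis?")
history.add_ai_message("Redis is an in-memory data store.")
print(len(history.messages))  # 2 messages recorded for this session
history.clear()
print(len(history.messages))  # 0 after deletion
```

The stored history can then be prepended to each prompt sent to the LLM so earlier answers stay in context.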