
Using Ollama with Agents in Langflow

8.2K views
Feb 6, 2025
7:35

Join David Jones-Gilardi as he guides you through using local Ollama models in your agents. Ollama empowers you to run models locally on your machine, offering a secure alternative to public providers like OpenAI. In this tutorial, discover how to set up and use Alibaba's Qwen 2.5 model with Langflow, enabling seamless integration with your applications while ensuring data privacy. Perfect for developers looking to enhance their AI workflows securely!

Resources:
- Langflow (Open Source): http://www.langflow.org
- Ollama (local models): https://ollama.com

Additional Resources:
- DataStax Developer Hub: https://dtsx.io/devhub
- DataStax Blog: https://dtsx.io/howto
- Try Langflow: https://dtsx.io/trylangflow
- Try Astra DB: https://dtsx.io/40kQpI6

Stay in touch:
- Join our Discord Community: https://discord.gg/datastax
- Follow us on X: https://x.com/DataStaxDevs

Chapters:
00:00:00 | 👋 Introduction to Tool Calling with Ollama
00:00:12 | 🤔 Why Use Ollama?
00:00:39 | 🔧 Setting Up Qwen 2.5 Model
00:01:54 | 🌐 Using Langflow with Ollama
00:03:12 | 🔄 Converting OpenAI to Ollama
00:03:25 | ⚙️ Configuring Custom Model in Langflow
00:05:18 | 🚀 Running Queries with Qwen 2.5
00:05:57 | ⏱️ Performance Considerations for Local Models
00:07:12 | 🔑 Final Tips and API Integration

#Ollama #LocalModels #Langflow #AI #DataPrivacy #ToolCalling #QwenModel #DeveloperTutorial #SecureAI
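The tutorial walks through pointing Langflow at a locally served Qwen 2.5 model instead of OpenAI. As a minimal sketch of what happens under the hood (assuming Ollama is installed, the model has been pulled with `ollama pull qwen2.5`, and the server is running on its default port 11434), the same chat request Langflow issues can be built and sent by hand; `send` only works against a live server, so it is defined but not called here:

```python
import json
import urllib.request

# Ollama's default local chat endpoint
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(prompt: str, model: str = "qwen2.5") -> dict:
    """Build a request payload for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one complete response instead of streamed chunks
    }

def send(payload: dict) -> dict:
    """POST the payload to a locally running Ollama server (ollama serve)."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_chat_request("Why run models locally?")
print(json.dumps(payload, indent=2))
```

In Langflow itself you don't write this request by hand: the Ollama component takes the base URL (http://localhost:11434) and the model name, and builds the call for you.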

