
LangChain + MCP Server(s)

1.8K views
Jul 20, 2025
18:28

Showcasing two different ways of implementing MCP servers with LangChain. The focus is on a completely local setup, so Ollama models are also demonstrated. Note: usage may be very slow going this route, so if (generally) higher-quality and faster responses are preferred, swap in your hosted model of choice. The MCP servers used were Neo4j's MCP servers connected to a locally running graph database. Examples of using the base sample code in an interactive CLI, a FastAPI server, and a Streamlit app are also presented.

Links:
Demo code repo: https://github.com/jalakoo/langchain-ollama-neo4j-mcp
LangChain MCP Adapters repo: https://github.com/langchain-ai/langchain-mcp-adapters
Official Python MCP SDK: https://github.com/modelcontextprotocol/python-sdk
Ollama models w/ tool calling: https://ollama.com/search?c=tools
Neo4j MCP Servers: https://github.com/neo4j-contrib/mcp-neo4j
Neo4j: https://neo4j.com
FastAPI: https://fastapi.tiangolo.com
Streamlit: https://streamlit.io

Timestamps:
00:00 Intro
00:15 TLDR - Cloning and running demo code
01:45 Streamlit app walkthrough
02:56 Simple single MCP setup
07:08 Multiple MCP server setup
09:36 Interactive CLI option
11:12 FastAPI option
12:32 Streamlit code walkthrough
15:26 Simple tests
18:03 Wrap up
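The single-server setup described above can be sketched in a few lines with the LangChain MCP adapters. This is a minimal sketch, not the video's exact code: it assumes recent versions of langchain-mcp-adapters, langchain-ollama, and langgraph, and the Neo4j URI, credentials, and model name are placeholder values for a local instance.

```python
# Minimal sketch: expose Neo4j's Cypher MCP server as LangChain tools and
# hand them to a ReAct agent backed by a local Ollama model.
# Placeholder config: one MCP server per entry, launched as a stdio
# subprocess. Swap in your own Neo4j URI/credentials.
MCP_SERVERS = {
    "neo4j-cypher": {
        "command": "uvx",
        "args": ["mcp-neo4j-cypher"],
        "env": {
            "NEO4J_URI": "bolt://localhost:7687",
            "NEO4J_USERNAME": "neo4j",
            "NEO4J_PASSWORD": "password",
        },
        "transport": "stdio",
    },
}


async def main() -> None:
    # Imports deferred so the config above is inspectable without the
    # third-party packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langchain_ollama import ChatOllama
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(MCP_SERVERS)
    tools = await client.get_tools()  # MCP tools wrapped as LangChain tools

    # Any Ollama model tagged with tool-calling support works here; a hosted
    # model can be swapped in for faster, higher-quality responses.
    agent = create_react_agent(ChatOllama(model="qwen2.5"), tools)
    result = await agent.ainvoke(
        {"messages": [("user", "How many nodes are in the graph?")]}
    )
    print(result["messages"][-1].content)
```

Run with `asyncio.run(main())` once Ollama (with a tool-calling model pulled) and a local Neo4j database are up. Adding a second server, as in the multi-server section of the video, is just another entry in the config dict.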

