I Stopped Sending My Code to the Cloud - Full Local AI Dev Team with Ollama
In this video I rebuild my entire OpenCode multi-agent workflow to run on Ollama, so everything is local, private, and free after the initial download. No Copilot dependency, no API keys, and your code never leaves your machine.

We'll go step by step:
- What Ollama is and why local models matter for privacy and cost
- How to find models on ollama.com and pull them (llama3.2, qwen, and friends)
- The difference between fully local and :cloud Ollama models, and when to use each
- Configuring OpenCode to talk to Ollama through an OpenAI-compatible endpoint
- Assigning a different model to each agent (planner, implementor, reviewer, tester, etc.)
- BONUS: wiring in Serena as an MCP server so agents can query your codebase by symbol instead of grepping files
- BONUS: adding a dedicated security reviewer agent that audits every change before tests run
- Full end-to-end demo: planner clarification → implementation → security review → tests → linter → final commit message

If you work with sensitive code, are tired of sending everything to hosted APIs, or just want a serious local AI dev setup, this video gives you a complete starting point you can copy into your own repo.

⏱ Chapters
0:00 Intro
0:54 What is Ollama?
2:28 Finding models on ollama.com
4:00 Pulling models locally
6:30 Local vs :cloud models
8:39 Connecting OpenCode to Ollama
10:30 Per-agent model choices
12:09 BONUS: Serena MCP
14:27 BONUS: Security reviewer agent
17:15 Full workflow demo
18:28 Why this stack works
18:15 What's next (Copilot CLI vs OpenCode)

🔗 Links
Ollama model library: https://ollama.com/library
OpenCode: https://opencode.ai
Serena MCP: https://github.com/oraios/serena
uv (needed for Serena): https://docs.astral.sh/uv
Example config & agent files (repo shown in the video): https://github.com/saeid-rez/Multi-Agent-Setup/blob/main/README-LOCAL-AI.md

💬 Tell me in the comments
Did you try this workflow? What models did you pick for the planner and implementor?
What tools are you using for AI coding today: Copilot, OpenCode, Cursor, Claude Code, something else? In the next video I'll compare GitHub Copilot CLI vs OpenCode. What do you want me to focus on: performance, privacy, setup complexity, cost, or something else? If this was helpful, a like or subscribe really helps the channel and tells me to keep making deep-dive, real-world dev setups instead of surface-level demos.
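⚡ Quick start
If you want to skim before watching: pull a model with `ollama pull llama3.2` (Ollama then serves an OpenAI-compatible API at http://localhost:11434/v1), and point OpenCode at it. Below is a minimal sketch of the kind of `opencode.json` provider block involved; the exact model names and the `@ai-sdk/openai-compatible` package choice are assumptions based on OpenCode's documented custom-provider setup, so check the example repo linked above for the real files used in the video.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama3.2": {},
        "qwen3": {}
      }
    }
  }
}
```

Once a provider like this is registered, each agent can be pointed at a different model from the `models` map, which is how the per-agent planner/implementor/reviewer split works.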