Obsidian plugin that wires Ollama / OpenAI into your notes. Inline chat, summarize, prompt templates.
Editorial verdict: “Best Obsidian plugin for local LLM in your notes. Pair with Smart Connections for RAG.”
Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"
Obsidian Copilot is an Obsidian community plugin that connects to Ollama or any OpenAI-compatible endpoint. It offers inline chat, summarize-selection, prompt templates, and vault-wide search-and-ask. Pairs well with the Obsidian Smart Connections plugin for full RAG over your notes.
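To make "OpenAI-compatible endpoint" concrete: a minimal sketch of the kind of request such a plugin sends for summarize-selection. The base URL, model name, and function names here are illustrative assumptions, not the plugin's actual code — Ollama's OpenAI-compatible API is served at `http://localhost:11434/v1` by default, but check your own setup.

```python
import json
import urllib.request

# Hypothetical sketch of a summarize-selection call against an
# OpenAI-compatible chat endpoint (e.g. a local Ollama server).
# Endpoint path and payload shape follow the OpenAI chat-completions API.

def build_summarize_request(base_url: str, model: str, selection: str):
    """Return the (url, payload) pair for a summarize-selection call."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "Summarize the user's note selection."},
            {"role": "user", "content": selection},
        ],
    }
    return url, payload

def summarize(base_url: str, model: str, selection: str) -> str:
    """POST the request and return the model's summary text."""
    url, payload = build_summarize_request(base_url, model, selection)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Usage (assumes a running Ollama instance and an example model name): `summarize("http://localhost:11434/v1", "llama3.2", "Long note text...")`. Because the endpoint speaks the OpenAI wire format, the same code points at OpenAI or any other compatible server by swapping the base URL and adding an `Authorization` header where required.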
Category: editor plugin — VS Code, JetBrains, Vim, Obsidian, etc.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.