The default chat UI for solo Ollama users. Multi-model, built-in RAG, web search, Docker-friendly.
Editorial verdict: “Best default chat UI for solo Ollama users. Pick this first; switch only if you outgrow it.”
Which runtime + OS combos this app works with. Source of truth for "will it run on my setup?"
Open WebUI started life as "Ollama WebUI" and now speaks to Ollama, OpenAI-compatible endpoints, and external API providers from a single config. Out of the box: multiple persistent conversations, built-in RAG over uploaded files, web-search hooks, and image-generation integration via ComfyUI. It has been the most popular path from "I installed Ollama" to "I'm using a real chat UI" through 2024-2026.
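What "OpenAI-compatible" means in practice: Ollama serves the same /v1/chat/completions request shape as OpenAI's API, so a UI like Open WebUI can point one connection config at either. A minimal sketch, assuming Ollama is running on its default port 11434; the model name "llama3" is a placeholder for whatever you've actually pulled.

```python
# Minimal sketch: hit Ollama's OpenAI-compatible endpoint directly.
# Assumes Ollama on its default port (11434); "llama3" is a placeholder
# for any model you've fetched with `ollama pull`.
import requests

resp = requests.post(
    "http://localhost:11434/v1/chat/completions",
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": "Say hello from my local runtime."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Open WebUI's connection settings take the same base URL (http://localhost:11434); swap in another OpenAI-compatible address and the chat keeps working.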
Web or desktop chat client that connects to your local runtime.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
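All five runtimes expose (or can expose) an OpenAI-compatible HTTP server, which is what lets one chat UI sit in front of any of them. Below is a sketch that probes each project's usual default port; these ports are assumptions about common defaults (ollama serve, vllm serve, llama-server, mlx_lm.server, LM Studio's local server), not guarantees for your install.

```python
# Probe common OpenAI-compatible endpoints exposed by local runtimes.
# Ports are upstream defaults and may differ on your machine.
import requests

DEFAULT_ENDPOINTS = {
    "ollama":    "http://localhost:11434/v1",  # ollama serve
    "vllm":      "http://localhost:8000/v1",   # vllm serve
    "llama.cpp": "http://localhost:8080/v1",   # llama-server
    "mlx":       "http://localhost:8080/v1",   # mlx_lm.server (same default port as llama-server)
    "lm-studio": "http://localhost:1234/v1",   # LM Studio local server
}

def list_models(base_url: str) -> list[str]:
    """Ask an OpenAI-compatible server which model IDs it is serving."""
    resp = requests.get(f"{base_url}/models", timeout=5)
    resp.raise_for_status()
    return [m["id"] for m in resp.json()["data"]]

for name, url in DEFAULT_ENDPOINTS.items():
    try:
        print(f"{name}: {list_models(url)}")
    except requests.RequestException:
        print(f"{name}: nothing answering at {url}")
```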
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.