Open-source clone of the ChatGPT UI with multi-provider routing. Local + cloud in one interface.
Editorial verdict: “Best if you mix local + cloud models in the same workflow. Strong team features.”
Which runtime + OS combos this app runs against. The source of truth for "will it run on my setup?"
LibreChat is the closest visual match to ChatGPT, with the same prompt-library, custom-instructions, and chat-folder UX. The selling point is multi-provider routing: drop in Ollama, Anthropic, OpenAI, Gemini, and Together keys, then choose per conversation which one answers. Good for teams that want one UI for both local and cloud models.
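As a sketch of how the local side of that routing is wired up: LibreChat reads custom OpenAI-compatible endpoints from a `librechat.yaml` file, while cloud keys live in `.env`. The fragment below assumes Ollama is serving its OpenAI-compatible API on the default port; the model name and endpoint label are illustrative, so check LibreChat's own config docs for the exact schema your version expects.

```yaml
# librechat.yaml — register a local Ollama endpoint alongside cloud providers
# (illustrative values; verify against your LibreChat version's config reference)
endpoints:
  custom:
    - name: "Ollama"                          # label shown in the model picker
      apiKey: "ollama"                        # placeholder; Ollama ignores the key
      baseURL: "http://localhost:11434/v1/"   # Ollama's OpenAI-compatible API
      models:
        default: ["llama3"]                   # fallback list if fetch fails
        fetch: true                           # pull the live model list from Ollama
```

Cloud providers (OpenAI, Anthropic, Gemini) are typically enabled by setting their API keys in `.env` rather than here; once both layers are configured, the per-conversation model picker lists them side by side.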
Web or desktop chat client that connects to your local runtime.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.