Browser sidebar that talks to your local Ollama. Summarize pages, chat, and use vision models.
Editorial verdict: “Best 'sidebar AI' browser extension that's truly local-first.”
Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"
Page Assist is a Chrome / Firefox extension that opens a sidebar that talks to your local Ollama. Summarize the current page, paste an image into a vision model, or build a small knowledge base from URLs. The cleanest 'browse with my local LLM' integration.
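Under the hood this kind of sidebar is just making calls against Ollama's standard local HTTP API. A minimal sketch of what a "summarize this page" request presumably looks like, assuming Ollama's default port (11434); the model name and the `pageText` parameter are illustrative, not Page Assist internals:

```ts
// Sketch: summarize the current tab's text via Ollama's /api/chat endpoint.
// Assumes Ollama is running locally on its default port; model name is an example.
async function summarizePage(pageText: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1",   // any model you've pulled locally
      stream: false,       // single JSON response instead of a token stream
      messages: [
        { role: "user", content: `Summarize this page:\n\n${pageText}` },
      ],
    }),
  });
  const data = await res.json();
  return data.message.content; // Ollama returns { message: { role, content }, ... }
}
```

Vision works the same way: Ollama's chat messages accept an `images` array of base64-encoded images alongside the text content.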
Browser extension that uses your local model.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
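Beyond Ollama's native API, every runtime in that list can expose an OpenAI-compatible chat endpoint (vLLM and LM Studio out of the box, llama.cpp via `llama-server`, MLX via `mlx_lm.server`, Ollama via its `/v1` compatibility layer), so a client only needs a base URL swap. A sketch under that assumption; the ports below are the common defaults, adjust to your setup:

```ts
// Sketch: one client, many runtimes, all via the OpenAI-compatible chat endpoint.
// Base URLs use the usual default ports; these are assumptions, not guarantees.
const BASES = {
  ollama:   "http://localhost:11434/v1", // Ollama's OpenAI compatibility layer
  vllm:     "http://localhost:8000/v1",
  llamacpp: "http://localhost:8080/v1",  // llama-server
  lmstudio: "http://localhost:1234/v1",
};

async function chat(base: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${base}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // standard OpenAI response shape
}
```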
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.