Terminal coding agent that edits files via your local model. Git-aware, surgical, fast.
Editorial verdict: “Best terminal-native coding agent for local models. Qwen 2.5 Coder 32B is its sweet spot.”
Which runtime + OS combos this app works against. Source of truth for "will it run on my setup?"
Aider runs in your terminal, reads your code, and proposes edits as git diffs. It works against any OpenAI-compatible endpoint, including Ollama and llama.cpp running locally. The killer feature is the surgical edit format: Aider gets the model to emit small, targeted diffs that almost always apply cleanly, even with mid-tier local models like Qwen 2.5 Coder 32B.
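A minimal config sketch for pointing Aider at a local Ollama endpoint (the model tag `qwen2.5-coder:32b` and the default Ollama port are assumptions; adjust both for your setup):

```shell
# Tell Aider where the local Ollama server lives (default port assumed)
export OLLAMA_API_BASE=http://127.0.0.1:11434

# Launch Aider against a local model; accepted edits land as git commits
# in the current repo, so git log / git revert give a full undo history
aider --model ollama_chat/qwen2.5-coder:32b
```

Run it from inside a git repo; Aider is git-aware and will refuse nothing but will warn outside one.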
Editor-integrated or CLI agent that edits code via your model.
Best self-hosted server for teams. SSO + audit logs make it the IT-friendly pick.
Best minimal-surface Copilot-replacement that's been Ollama-native since day one.
Best IDE-integrated agent that fully respects 'all local' as a first-class option.
Best Copilot replacement that defaults to local. Configurable; pair with Qwen 2.5 Coder.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.