Twinny

Fully offline

Free, lightweight VS Code copilot that runs entirely on Ollama. Strong on autocomplete.

Editorial verdict: “Best minimal-surface Copilot replacement that's been Ollama-native since day one.”

Coding agent
Free
MIT
4.2 / 5
GitHub ★ 3,500

Compatibility at a glance

Which runtime and OS combos this app works with. The source of truth for "will it run on my setup?"

§ Runtimes supported
Ollama
§ OS / platform
macOS · Linux · Windows
§ Hardware + model hint
Minimum VRAM: 8 GB
Recommended starter model: DeepSeek Coder 6.7B Q4_K_M or Qwen 2.5 Coder 7B
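
If you want to sanity-check a setup before installing the extension, a short preflight like the sketch below (TypeScript on Node 18+, assuming Ollama's default port 11434; the model tags are illustrative, so confirm exact names in the Ollama library) asks Ollama's /api/tags endpoint which models are installed locally:

    // Preflight sketch: is Ollama serving locally, and is a starter model pulled?
    const OLLAMA = "http://localhost:11434";
    const CANDIDATES = ["deepseek-coder:6.7b", "qwen2.5-coder:7b"]; // illustrative tags

    async function main(): Promise<void> {
      // GET /api/tags lists the models installed in the local Ollama instance.
      const res = await fetch(`${OLLAMA}/api/tags`);
      if (!res.ok) throw new Error(`Ollama not reachable: HTTP ${res.status}`);
      const { models } = (await res.json()) as { models: { name: string }[] };
      const match = CANDIDATES.find((tag) =>
        models.some((m) => m.name.startsWith(tag.split(":")[0]))
      );
      console.log(match
        ? `Starter model available: ${match}`
        : "No starter model pulled yet (try `ollama pull <model>`).");
    }

    main().catch((err) => { console.error(err); process.exit(1); });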

What it is

Twinny is a no-nonsense VS Code extension purpose-built for Ollama, covering autocomplete, inline chat, and symbol explanation. It has a smaller surface area than Continue, but tighter integration and lower latency for the autocomplete-only use case. A good pick if you want 'Copilot, but local.'

✓ Strengths

  • Tiny config — works out of the box with Ollama running locally
  • Lower latency than Continue for autocomplete (a rough timing sketch follows this list)
  • MIT-licensed, fully open
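
The latency claim is easy to spot-check. The sketch below (same assumptions as the preflight above: Node 18+, Ollama on its default port, an illustrative model tag) times a single short, non-streaming completion through Ollama's /api/generate endpoint. It measures a one-shot round trip rather than Twinny's own request path, so treat it as a rough floor:

    // Rough latency smoke test: time one short, non-streaming completion.
    async function timeCompletion(): Promise<void> {
      const start = Date.now();
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "qwen2.5-coder:7b", // assumption: use whichever model you pulled
          prompt: "def fib(n):",
          stream: false,
          options: { num_predict: 32 }, // cap tokens so timing reflects short completions
        }),
      });
      const { response } = (await res.json()) as { response: string };
      console.log(`${Date.now() - start} ms ->`, response.trim());
    }

    timeCompletion().catch(console.error);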

△ Caveats

  • No JetBrains support
  • Fewer features than Continue (no agentic edit mode)