Codeium's self-hosted enterprise backend lets the popular IDE plugins run fully on your own hardware.
Editorial verdict: “Best 'enterprise Copilot' replacement when self-hosting is mandatory. Paid tier.”
Which runtime + OS combos this app works with. Source of truth for "will it run on my setup?"
Codeium is a popular cloud autocomplete service. Its enterprise tier ships a self-hosted backend that you run on your own GPUs. The IDE plugins (VS Code, JetBrains, Neovim, etc.) are identical to cloud mode but point at your server instead. Useful when a team wants the Codeium UX with no code leaving its network.
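In practice, "pointing at your server" is a plugin-side setting. As a rough sketch only — the exact setting names vary by plugin and version, and the keys and URL below are illustrative assumptions, not the documented API — a VS Code `settings.json` might look something like:

```json
{
  // Hypothetical keys: check your enterprise plugin's docs for the real names.
  // The URL is a placeholder for your self-hosted backend's address.
  "codeium.enterprisePortalUrl": "https://codeium.internal.example.com",
  "codeium.enableTelemetry": false
}
```

The important point is architectural: the autocomplete plugin is unchanged, and only its endpoint configuration moves traffic from Codeium's cloud to your backend.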
Plugin for VS Code, JetBrains, Vim, etc.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit a benchmark — it powers the model + hardware pages.