Terminal entry into Khoj's local AI assistant. Use grep, get answers, never leave the shell.
Editorial verdict: “Best terminal companion for note-summarization workflows. Pipe-friendly.”
Which runtime + OS combos this app runs on. The source of truth for "will it run on my setup?"
Khoj also ships a CLI you can pipe into: `grep -r 'foo' . | khoj 'summarize what we use foo for'` works. Built for shell power users who take notes and summarize local docs without leaving the terminal. Pairs with the Khoj server (also in this directory).
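A couple more pipes in the same vein, a sketch assuming (as in the one-liner above) that the `khoj` CLI reads piped text from stdin as context and takes the question as its argument; file paths here are hypothetical:

```shell
# Triage open TODOs scattered across a notes directory
grep -rn 'TODO' ~/notes | khoj 'group these TODOs by theme and flag the stale ones'

# Turn a day of shell history into a log entry
tail -n 200 ~/.bash_history | khoj 'summarize what I worked on today'
```

Anything that prints to stdout becomes context, which is the whole point of the pipe-friendly design.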
Note-taking, knowledge management, or workflow apps with AI.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.