Official Node + browser SDK for Ollama. ESM-first, typed, streaming.
Editorial verdict: “Foundational primitive for Node + browser apps against Ollama. ESM-native, typed.”
Which runtime + OS combos this app runs on. The source of truth for "will it run on my setup?"
Official JavaScript / TypeScript SDK for Ollama. Works in Node and in the browser when the Ollama server is configured to allow CORS (via the OLLAMA_ORIGINS environment variable). Streaming, typed responses, ESM-native. The right primitive for full-stack apps that want to talk to Ollama from the browser or a Node server.
Thin SDK / proxy / compatibility layer.
Pre-filled with this app's recommended use case + budget tier. Get the full rig + runtime + model picks.
The full directory — filter by category, runtime, OS, privacy posture, or VRAM.
What this app talks to: Ollama, vLLM, llama.cpp, MLX, LM Studio. The upstream layer.
Did this app work for you on a specific rig? Submit the benchmark — it powers the model + hardware pages.