
Ollama: connection refused on localhost:11434

Error: connect ECONNREFUSED 127.0.0.1:11434
By Fredoline Eruo · Last verified May 6, 2026

Cause

Your client (curl, an SDK, or another tool) is trying to reach the Ollama HTTP server on port 11434, but nothing is accepting TCP connections there. ECONNREFUSED means the connection was rejected outright, not that the server answered with an error. Common causes:

  • ollama serve was never started (most common on Linux without the systemd service)
  • Ollama is running but bound to a different port (e.g., the server was started with OLLAMA_HOST set to 0.0.0.0:8080 or another non-default address)
  • Firewall or security software blocking localhost (rare, but it happens with overzealous corporate endpoint tools)
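Before changing anything, it helps to confirm which case applies. A quick check, assuming a Linux host with ss available (on macOS, lsof -nP -iTCP:11434 reports the same thing):

```shell
# Is anything listening on the default Ollama port?
if ss -ltn 2>/dev/null | grep -q ':11434'; then
  echo "something is listening on 11434 - suspect OLLAMA_HOST or a firewall"
else
  echo "nothing is listening on 11434 - Ollama is not running here"
fi
```

If something is listening but curl still fails, look at the firewall; if nothing is listening, start the server as shown below.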

Solution

Start Ollama:

# macOS / Linux foreground
ollama serve
# In another terminal, verify (expects a JSON list of installed models):
curl http://localhost:11434/api/tags

Linux (run as a service):

sudo systemctl start ollama
sudo systemctl enable ollama  # auto-start on boot
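After starting the server, it can take a moment before the API answers. A small polling helper, as a sketch: the function name, URL, and retry count here are all illustrative.

```shell
# Poll the API until it responds, up to $2 attempts (default 10),
# one second apart. Returns 0 once the server is up, 1 on timeout.
wait_for_ollama() {
  url="${1:-http://localhost:11434}"
  tries="${2:-10}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -sf "$url/api/tags" >/dev/null 2>&1; then
      echo "Ollama is up at $url"
      return 0
    fi
    sleep 1
    i=$((i + 1))
  done
  echo "no response from $url after $tries attempts" >&2
  return 1
}
```

Useful in scripts that start the service and immediately pull or run a model.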

Windows: The Ollama installer registers a Windows service that should auto-start. Check Services (services.msc) for "Ollama" — set it to Automatic.

If the server runs on a non-default port, point your client there. The ollama CLI reads OLLAMA_HOST; note that the same variable also controls which address ollama serve binds to:

export OLLAMA_HOST=http://localhost:11500
# Or in your code, set the base URL explicitly
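A minimal sketch of the client side: honor OLLAMA_HOST when it is set and fall back to the default address otherwise (BASE_URL is an illustrative name, not something Ollama defines):

```shell
# Use OLLAMA_HOST if set, otherwise the default server address
BASE_URL="${OLLAMA_HOST:-http://localhost:11434}"
echo "talking to $BASE_URL"
```

The ollama CLI applies this fallback itself; SDKs typically also accept the base URL as an explicit constructor argument if you prefer not to rely on the environment.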

If running in Docker, make sure the port is published to the host (and remember that from inside another container, localhost refers to that container, not the Docker host):

docker run -d -p 11434:11434 --name ollama ollama/ollama


Did this fix it?

If your case was different, email hello@runlocalai.co with what you saw and we'll update the page. If it worked but took different commands on your platform, we want to know that too.