RUNLOCALAI · v38

Operator-grade instrument for local-AI hardware intelligence. Hand-written verdicts. Real benchmarks. Reproducible commands.


Tags: agent · open source · free

Bolt.diy

Open-source fork of StackBlitz's bolt.new — full-stack app generator that writes, runs, and iterates on web apps in a sandboxed WebContainer. Bolt.diy adds local-LLM support: point it at Ollama or any OpenAI-compatible local endpoint and prompt-to-app generation works entirely offline. The killer demo for showing non-developers what a local coding agent can do. Limited to web-app generation (Node/React/Vue/Svelte stacks) — not a general-purpose coding agent like Aider or Cline.

By Fredoline Eruo · Last verified May 13, 2026 · 14,000 GitHub stars
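"Point it at Ollama" is a one-time config step. A minimal sketch, assuming the `OLLAMA_API_BASE_URL` and `OPENAI_LIKE_*` variable names from the project's `.env.example` — verify against the current repo before use:

```shell
# .env.local — local-backend config for bolt.diy
# (variable names assumed from the project's .env.example; confirm before use)

# Ollama on its default port:
OLLAMA_API_BASE_URL=http://127.0.0.1:11434

# ...or any OpenAI-compatible server (llama.cpp's llama-server, LM Studio, vLLM):
OPENAI_LIKE_API_BASE_URL=http://127.0.0.1:8080/v1
OPENAI_LIKE_API_KEY=sk-local-placeholder   # many local servers accept any key
```

With the file in place, models served by that endpoint show up in bolt.diy's model picker; no cloud key is required.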

Pros

  • Visual full-stack code-gen — prompt to running app in browser
  • Local-LLM backend support is the headline differentiator from bolt.new
  • Sandboxed execution (WebContainer) is genuinely safe
  • Best 'show your friends what local AI can do' demo in the catalog

Cons

  • Web-app generation only — no terminal/Python/CLI projects
  • Smaller models (<14B) struggle with the full-stack reasoning needed
  • Active fork tree — pin a commit if you're in production
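On the last point, pinning takes one extra command at clone time. A hypothetical sketch — the SHA is a placeholder and the `pnpm` scripts are assumed from the repo's README, so check the current install docs:

```shell
# Clone and pin bolt.diy to a known-good commit instead of tracking main
git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy
git checkout <known-good-sha>   # placeholder — use a SHA you have tested
pnpm install && pnpm run dev    # assumed scripts; see package.json
```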

Compatibility

Operating systems: Linux, macOS, Windows
GPU backends: CUDA, ROCm, Metal, CPU
License: Open source · free

Runtime health

Operator-grade signals on how actively Bolt.diy is being maintained, how fresh its measurements are, and what failure classes operators have flagged. Every label below is anchored to a real date or count — we never infer maintainer activity we can't show.

Release cadence

Derived from the most recent editorial signal for this tool.

Active
Updated May 13, 2026

1 day since last refresh · source: lastUpdated

Benchmark freshness

How recent the editorial measurements on this runtime are.

0 editorial benchmarks

No editorial benchmarks for this runtime yet.

Community reproduction

Submissions that match an editorial measurement on similar hardware.

0 reproduced reports

No community reproductions on file yet.

Get Bolt.diy

Official site
https://bolt.diy/
GitHub
https://github.com/stackblitz-labs/bolt.diy

Frequently asked

Is Bolt.diy free?

Yes — Bolt.diy is free to download and use, and it is open source under a permissive license.

What operating systems does Bolt.diy support?

Bolt.diy supports Linux, macOS, and Windows.

Which GPUs work with Bolt.diy?

Bolt.diy itself runs in the browser and is backend-agnostic: CUDA, ROCm, Metal, and CPU all work, because GPU acceleration comes from the local inference server (e.g. Ollama) you point it at. CPU-only inference is also possible but slow.
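"OpenAI-compatible" concretely means the backend accepts the standard `/v1/chat/completions` request shape. A minimal Python sketch of the body a front-end like Bolt.diy sends (the model name is a hypothetical example):

```python
import json

def chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style /v1/chat/completions request body —
    the wire format Bolt.diy speaks to Ollama, llama.cpp, LM Studio, etc."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

# POST this as JSON to e.g. http://127.0.0.1:11434/v1/chat/completions
body = json.dumps(chat_payload("qwen2.5-coder:14b", "Scaffold a React todo app"))
```

Any server that answers this request can sit behind Bolt.diy — the GPU work happens in that server, not in Bolt.diy, which is why all four backends are in play.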

See something off? Report outdated · Suggest a correction. We read every submission; editorial review takes 1–7 days.

Reviewed by RunLocalAI Editorial. See our editorial policy for how we evaluate tools.

Related — keep moving

Compare hardware
  • RTX 3090 vs RTX 4090 →
Buyer guides
  • Best AI PC for developers →
  • Best GPU for Ollama (coding) →
When it doesn't work
  • Ollama running slow →
  • CUDA out of memory →
Recommended hardware
  • RTX 3090 (used 24 GB) →
Alternatives
AGiXT · Codex CLI · Cline · Devin · OpenCode · Kilo Code · OpenAI Codex · Droid (Factory)
Before you buy

Verify Bolt.diy runs on your specific hardware before committing money.

  • Will it run on my hardware? →
  • Custom hardware comparison →
  • GPU recommender (4 questions) →