LLM electricity calculator
What does it actually cost to run a local LLM on your hardware? Most "is local AI cheaper?" takes ignore that GPUs draw most of their power only under sustained load and that electricity prices vary 5-10× across regions. This one's explicit about both.
Pure client-side. No tracking. The math is open and citable.
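As a rough sketch of that math: the headline number is presumably just GPU power under sustained load, times active hours, times your electricity rate. The function below is an illustration of that formula under those assumptions; the names and the 300 W / $0.15/kWh example are mine, not the calculator's actual source.

```ts
// Headline estimate: active-load energy only (GPU watts under load × hours × your rate).
// Names and example values are illustrative, not the page's actual implementation.
function monthlyActiveCostUSD(
  gpuLoadWatts: number,   // e.g. ~300 W is a plausible inference draw for a high-end card
  hoursPerDay: number,    // hours of sustained load, not wall-clock uptime
  pricePerKWh: number,    // take this from your bill; it varies widely by region
  daysPerMonth = 30,
): number {
  const kWhPerMonth = (gpuLoadWatts / 1000) * hoursPerDay * daysPerMonth;
  return kWhPerMonth * pricePerKWh;
}

// Example: 300 W × 2 h/day × 30 days = 18 kWh → $2.70/mo at $0.15/kWh.
console.log(monthlyActiveCostUSD(300, 2, 0.15).toFixed(2));
```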
Inputs
Cost estimate
Caveats
GPU power is only part of total system draw; add roughly 120 W for CPU, RAM, motherboard, fans, and cooling. The second stat above includes this overhead.
Idle power isn't counted in the headline number — most modern GPUs idle at 10-40 W, but if the machine is on 24/7 that adds up (the third stat models always-on idle at 30 W).
Electricity prices vary 5-10× across regions. Hawaii is ~$0.40/kWh; Quebec is ~$0.07/kWh. Use the rate from your actual bill.
This is operating cost only. Hardware amortization (a $1,500 RTX 4090 over 3 years is ~$42/mo) typically dwarfs electricity; see the full local-AI cost breakdown. The sketch below folds all of these caveats into one monthly figure.
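Putting those caveats together, here is a hedged sketch that adds system overhead during load hours, always-on idle draw for the rest of the day, and hardware amortization. The constants mirror the ballpark numbers above (~120 W overhead, 30 W idle, $1,500 over 36 months); the function and its defaults are my own illustration, not the calculator's code.

```ts
// Monthly cost with the caveats folded in: system overhead on top of GPU load,
// always-on idle draw, and hardware amortization. Constants are the page's
// ballpark figures; the structure is an assumed sketch, not the calculator's source.
interface CostBreakdown {
  activeUSD: number;   // GPU + ~120 W system overhead during load hours
  idleUSD: number;     // always-on idle draw for the remaining hours
  hardwareUSD: number; // purchase price spread over the ownership period
  totalUSD: number;
}

function monthlyTotalCostUSD(
  gpuLoadWatts: number,
  hoursPerDay: number,
  pricePerKWh: number,
  opts = { systemOverheadWatts: 120, idleWatts: 30, hardwarePriceUSD: 1500, ownershipMonths: 36 },
): CostBreakdown {
  const daysPerMonth = 30;
  const activeKWh = ((gpuLoadWatts + opts.systemOverheadWatts) / 1000) * hoursPerDay * daysPerMonth;
  const idleKWh = (opts.idleWatts / 1000) * (24 - hoursPerDay) * daysPerMonth;
  const activeUSD = activeKWh * pricePerKWh;
  const idleUSD = idleKWh * pricePerKWh;
  const hardwareUSD = opts.hardwarePriceUSD / opts.ownershipMonths;
  return { activeUSD, idleUSD, hardwareUSD, totalUSD: activeUSD + idleUSD + hardwareUSD };
}

// Example: 300 W GPU, 2 h/day at $0.15/kWh → about $3.78 active + $2.97 idle + $41.67 hardware ≈ $48/mo.
console.log(monthlyTotalCostUSD(300, 2, 0.15));
```

At that usage, amortization (~$41.67/mo) is roughly ten times the electricity, which is the point of the last caveat.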
Embed this calculator
Link to this page from articles, READMEs, or community threads; screenshots are welcome, attribution appreciated.
Suggested citation: Calculator by RunLocalAI · runlocalai.co · CC-BY-4.0
Related: How much does local AI cost? · Local AI vs ChatGPT Plus · VRAM calculator