# NVIDIA H100 PCIe
## Overview
The PCIe variant of NVIDIA's Hopper-generation H100, released in 2022. It draws less power and delivers lower memory bandwidth than the SXM version, and it targets servers rather than workstations.
## Specs
| Spec | Value |
| --- | --- |
| VRAM | 80 GB |
| Power draw | 350 W |
| Released | 2022 |
| MSRP | $25,000 |
| Backends | CUDA |
## Models that fit

Open-weight models small enough to run on the NVIDIA H100 PCIe with usable context.
## Frequently asked
### What models can the NVIDIA H100 PCIe run?
With 80 GB of VRAM, the NVIDIA H100 PCIe runs 70B models at 4-bit quantization, plus everything smaller. See the model list below for tested combinations.
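As a rough sanity check on that claim, here is a minimal sketch of the usual back-of-envelope VRAM math: quantized weights take roughly parameters × bits / 8 bytes, plus some headroom for the KV cache and runtime buffers. The `overhead_factor` of 1.2 is an assumed fudge factor, not a measured value.

```python
def fits_in_vram(params_b: float, bits: int, vram_gb: float = 80.0,
                 overhead_factor: float = 1.2) -> bool:
    """Rough check: do quantized weights, plus ~20% assumed overhead for
    the KV cache and runtime buffers, fit within the card's VRAM?"""
    # params in billions * (bits / 8) bytes per param -> weight size in GB
    weights_gb = params_b * bits / 8
    return weights_gb * overhead_factor <= vram_gb

# 70B at 4-bit: ~35 GB of weights, ~42 GB with overhead -> fits in 80 GB
print(fits_in_vram(70, 4))   # True
# 70B at 16-bit: ~140 GB of weights -> does not fit
print(fits_in_vram(70, 16))  # False
```

Actual usage varies with context length and backend, so treat this as a first-pass filter, not a guarantee.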
### Does the NVIDIA H100 PCIe support CUDA?
Yes. The NVIDIA H100 PCIe is an NVIDIA card with full CUDA support, the most mature local-AI backend: llama.cpp, Ollama, vLLM, and ExLlamaV2 all run natively.
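A quick way to confirm the card is visible to a CUDA backend, sketched here with PyTorch (this assumes `torch` is installed with CUDA support, and that the H100 is device 0):

```python
import torch

# Confirm a CUDA device is visible before pointing a backend at it.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)  # assumes the H100 is device 0
    print(f"Device: {props.name}")
    print(f"VRAM:   {props.total_memory / 1e9:.0f} GB")
    # Hopper-generation cards report compute capability 9.0
    print(f"Compute capability: {props.major}.{props.minor}")
else:
    print("No CUDA device found; check the driver and CUDA toolkit install.")
```

If this prints the H100 with roughly 80 GB, the CUDA-based backends listed above should pick it up without extra configuration.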
Reviewed by RunLocalAI Editorial. See our editorial policy for how we research and verify hardware specifications.