NVIDIA H100 NVL

Dual-card H100 with 188GB combined memory. Built for LLM serving.

Released 2023

Overview

The H100 NVL pairs two PCIe H100 cards over an NVLink bridge, each with 94 GB of HBM3 for 188 GB combined. The extra per-card memory and bandwidth target LLM inference serving, where model weights and KV cache dominate the footprint.

Specs

VRAM: 188 GB
Power draw: 800 W
Released: 2023
MSRP: $60,000
Backends: CUDA

Models that fit

Open-weight models that fit in the NVIDIA H100 NVL's 188 GB of VRAM with room left over for usable context.

Frequently asked

What models can NVIDIA H100 NVL run?

With 188 GB of VRAM, the NVIDIA H100 NVL runs 70B models at full FP16 precision (roughly 140 GB of weights) with headroom for KV cache, and 4-bit quantization extends its reach to substantially larger models, plus everything smaller. See the model list below for tested combinations.
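The sizing arithmetic behind that answer can be sketched in a few lines. This is a rough rule of thumb, not a measurement from any particular backend: weights take parameter count times bytes per parameter, and we assume about 10% extra headroom for KV cache and activations.

```python
# Rough fit check: weights = params (billions) * bits / 8 gives GB of
# weights; multiply by 1.1 as an assumed ~10% allowance for KV cache
# and activations. The 188 GB default matches the H100 NVL's combined VRAM.
def fits_in_vram(params_b: float, bits: int, vram_gb: float = 188.0) -> bool:
    weights_gb = params_b * bits / 8      # e.g. 70B at 16-bit -> 140 GB
    return weights_gb * 1.1 <= vram_gb    # leave ~10% headroom

print(fits_in_vram(70, 16))   # 70B at FP16: 140 GB * 1.1 = 154 GB -> True
print(fits_in_vram(180, 16))  # 180B at FP16: 360 GB -> False
print(fits_in_vram(180, 4))   # 180B at 4-bit: 90 GB * 1.1 = 99 GB -> True
```

Real-world usage varies with context length and batch size, so treat the 10% headroom as a floor rather than a guarantee.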

Does NVIDIA H100 NVL support CUDA?

Yes — NVIDIA H100 NVL is an NVIDIA card with full CUDA support, the most mature local-AI backend. llama.cpp, Ollama, vLLM, and ExLlamaV2 all run natively.
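As a concrete example of putting both cards to work, here is a minimal sketch of serving a 70B model with vLLM's standard CLI. The model name is a placeholder; `--tensor-parallel-size 2` splits the weights across the two H100 NVL cards.

```shell
# Sketch only: model ID and context length are placeholders for your setup.
vllm serve meta-llama/Llama-3.1-70B-Instruct \
  --tensor-parallel-size 2 \
  --max-model-len 8192
```

Tensor parallelism is what lets the pair behave like one 188 GB device for models too large for a single 94 GB card.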

Reviewed by RunLocalAI Editorial. See our editorial policy for how we research and verify hardware specifications.