
GLM-5 Pro

Zhipu's GLM-5 flagship. 144B total / 16B active MoE. Strong on Chinese-language tasks; competitive on English at the workstation-cluster tier.
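The "144B total / 16B active" split is worth a quick back-of-the-envelope look: in a Mixture-of-Experts model, per-token decode compute scales with the *active* parameters (roughly 2 FLOPs per active weight), while VRAM scales with the *total* parameters, all of which must stay resident. A minimal sketch, using the figures from this page:

```python
# Sketch: why "16B active" matters for an MoE model.
# Per-token decode compute scales with ACTIVE parameters
# (~2 FLOPs per active weight); memory scales with TOTAL parameters.
TOTAL_PARAMS = 144e9    # all weights must be resident in VRAM
ACTIVE_PARAMS = 16e9    # weights actually used per token

flops_per_token = 2 * ACTIVE_PARAMS    # approximate decode cost
dense_equivalent = 2 * TOTAL_PARAMS    # cost if every weight were active

# prints: MoE compute saving vs dense of same size: 9x
print(f"MoE compute saving vs dense of same size: "
      f"{dense_equivalent / flops_per_token:.0f}x")
```

This is why the model needs workstation-cluster memory despite generating tokens at a speed closer to a 16B dense model.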

License: GLM License · Released: Feb 18, 2026 · Context: 131,072 tokens


Family & lineage

How this model relates to others in its lineage. Family members share architecture and training-data roots; parent/child edges record direct distillation or fine-tune relationships.

Parent / base model
GLM-4 9B (9B parameters · Consumer tier)

Strengths

  • Strong CJK (Chinese/Japanese/Korean) performance
  • MoE efficiency: 16B active of 144B total parameters

Weaknesses

  • Restricted commercial license
  • Multi-GPU only

Quantization variants

Each quantization trades model quality for a smaller file and lower VRAM use. Q4_K_M is the most popular starting point where available.

Quantization    File size    VRAM required
AWQ-INT4        82.0 GB      96 GB
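The 82.0 GB figure is consistent with a simple rule of thumb: file size ≈ parameter count × effective bits per weight ÷ 8, where AWQ-INT4 lands around 4.25 to 4.5 effective bits once scales and zero-points are counted. A minimal sketch (the 15% overhead factor for KV cache and runtime buffers is an illustrative assumption, not a published figure):

```python
# Rough VRAM estimate for a quantized model: weights plus headroom
# for KV cache, activations, and runtime buffers.

def weight_size_gb(params_b: float, bits_per_weight: float) -> float:
    """On-disk weight size in GB for params_b billion parameters."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

def vram_estimate_gb(params_b: float, bits_per_weight: float,
                     overhead_frac: float = 0.15) -> float:
    """Weights plus an assumed fractional overhead (illustrative)."""
    return weight_size_gb(params_b, bits_per_weight) * (1 + overhead_frac)

if __name__ == "__main__":
    # 144B total parameters at ~4.5 effective bits/weight
    print(f"weights: {weight_size_gb(144, 4.5):.1f} GB")   # ~81 GB
    print(f"VRAM:    {vram_estimate_gb(144, 4.5):.1f} GB") # ~93 GB
```

At ~4.5 bits per weight this reproduces the listed 82 GB file and lands just under the 96 GB VRAM requirement, which is why a single 96 GB card is the floor.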

Get the model

HuggingFace

Original weights

huggingface.co/THUDM/GLM-5-Pro

Source repository with the original weights; quantize from these yourself if you need formats other than the published AWQ-INT4.

Hardware that runs this

Cards with enough VRAM for at least one quantization of GLM-5 Pro.

Compare alternatives

Models worth comparing

Models in the same parameter band, plus one tier above and below, so you can decide what actually fits your hardware.

Step up
More capable — bigger memory footprint
No reviewed models in the next tier up yet.

Frequently asked

What's the minimum VRAM to run GLM-5 Pro?

96 GB of VRAM is enough to run GLM-5 Pro at the AWQ-INT4 quantization (82.0 GB file size). Higher-quality quantizations need more.

Can I use GLM-5 Pro commercially?

GLM-5 Pro is released under the GLM License, which has restrictions for commercial use. Review the license terms before using it in a product.

What's the context length of GLM-5 Pro?

GLM-5 Pro supports a context window of 131,072 tokens (128K).
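In practice the limit covers the prompt *and* the generated tokens together, so it is worth checking the budget before a long request. A minimal sketch:

```python
# Sketch: verify that prompt + planned generation fit the
# 131,072-token (128K) context window before sending a request.
CONTEXT_LIMIT = 131_072

def fits_in_context(prompt_tokens: int, max_new_tokens: int,
                    limit: int = CONTEXT_LIMIT) -> bool:
    """True if the request fits entirely within the context window."""
    return prompt_tokens + max_new_tokens <= limit

print(fits_in_context(120_000, 8_000))   # True:  128,000 <= 131,072
print(fits_in_context(128_000, 8_000))   # False: 136,000 >  131,072
```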

Source: huggingface.co/THUDM/GLM-5-Pro

Reviewed by RunLocalAI Editorial. See our editorial policy for how we research and verify model claims.