MiniCPM 3 4B
OpenBMB's edge-optimized 4B-parameter model. MIT license; designed for phone deployment. Strong reasoning per parameter.
Overview
MiniCPM 3 4B is OpenBMB's edge-optimized 4B-parameter model, released under the MIT license and built for on-device deployment, including phones. It offers strong reasoning performance for its parameter count, though its small size limits open-ended generation depth.
Strengths
- MIT license
- Phone-deployable
- Strong reasoning per parameter
Weaknesses
- 4B ceiling limits open-ended generation depth
Quantization variants
Each quantization trades some model quality for a smaller file size and a lower VRAM footprint. Q4_K_M is the most popular starting point.
| Quantization | File size | VRAM required |
|---|---|---|
| Q4_K_M | 2.4 GB | 4 GB |
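The file sizes in the table follow roughly from bits-per-weight arithmetic. A minimal sketch (the bits-per-weight figures are approximate llama.cpp averages, and the 5% metadata overhead is an assumption, not a value from this page):

```python
# Rough GGUF file-size estimate: params * bits-per-weight / 8 bytes,
# plus a small overhead for metadata and non-quantized tensors.
# Bits-per-weight values are approximate averages (assumption).
BITS_PER_WEIGHT = {
    "Q4_K_M": 4.85,
    "Q5_K_M": 5.69,
    "Q8_0": 8.5,
    "F16": 16.0,
}

def gguf_size_gb(n_params_b: float, quant: str, overhead: float = 1.05) -> float:
    """Estimated file size in GB for a model with n_params_b billion parameters."""
    bits = BITS_PER_WEIGHT[quant]
    return n_params_b * 1e9 * bits / 8 / 1e9 * overhead

print(f"{gguf_size_gb(4.0, 'Q4_K_M'):.1f} GB")  # prints "2.5 GB"
```

The estimate lands near the 2.4 GB in the table; the exact number depends on which tensors the quantizer leaves at higher precision.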
Get the model
HuggingFace
Original weights — the source repository; no prebuilt quantizations, so you quantize the weights yourself.
Hardware that runs this
Cards with enough VRAM to run at least one quantization of MiniCPM 3 4B.
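A common rule of thumb for "enough VRAM" is the quantization's file size plus headroom for the KV cache and runtime. A minimal sketch, where the ~1.5 GB overhead figure is an assumption rather than a value from this page:

```python
# Rule-of-thumb VRAM check (assumption): a quantization fits if its file
# size plus ~1.5 GB for KV cache and runtime overhead is within your VRAM.
QUANT_FILE_GB = {"Q4_K_M": 2.4}  # file size from the table above

def fits(vram_gb: float, file_gb: float, overhead_gb: float = 1.5) -> bool:
    """True if a model file of file_gb should fit in vram_gb of VRAM."""
    return file_gb + overhead_gb <= vram_gb

print([q for q, size in QUANT_FILE_GB.items() if fits(4.0, size)])  # ['Q4_K_M']
```

On a 4 GB card this leaves Q4_K_M as the viable option, matching the table's VRAM column; longer contexts grow the KV cache and raise the overhead.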
Frequently asked
What's the minimum VRAM to run MiniCPM 3 4B?
About 4 GB — enough for the Q4_K_M quantization listed above.
Can I use MiniCPM 3 4B commercially?
Yes; the model is released under the MIT license.
What's the context length of MiniCPM 3 4B?
32k tokens, per the model card.
Source: huggingface.co/openbmb/MiniCPM3-4B
Reviewed by RunLocalAI Editorial. See our editorial policy for how we research and verify model claims.
Verify MiniCPM 3 4B runs on your specific hardware before committing money.