Is GLM-5 cheaper than Kimi v1-128k?
Yes. GLM-5's starting input price of $0.111/1M tokens is about 60% lower than Kimi v1-128k's $0.278/1M tokens.
## Pricing Intelligence
Comparing GLM-5 and Kimi v1-128k API pricing for 2026, GLM-5 is the more cost-effective option for input tokens: at a starting price of $0.111/1M tokens, it is approximately 60% cheaper. However, your choice should also factor in context window limits (128,000 vs 131,072 tokens) and supported modalities.
| Feature | GLM-5 | Kimi v1-128k |
|---|---|---|
| Starting Input Price | $0.111 / 1M tokens | $0.278 / 1M tokens |
| Model Category | GLM | Kimi |
| Max Context Window | 128k tokens | 131k tokens |
| Supported Modalities | text, image | text, image |
| Routed Channels Count | 3 channels | 3 channels |
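The savings figure above follows directly from the two input prices in the table. A minimal sketch of the arithmetic, using the listed starting rates (the helper name and the 10M-token example volume are illustrative, not from the source):

```python
# Starting input prices from the comparison table (USD per 1M tokens).
GLM5_INPUT_PRICE = 0.111   # GLM-5
KIMI_INPUT_PRICE = 0.278   # Kimi v1-128k

def input_cost(tokens: int, price_per_million: float) -> float:
    """Cost in USD for a given number of input tokens."""
    return tokens / 1_000_000 * price_per_million

# Relative savings: 1 - 0.111/0.278 ~= 0.60, i.e. roughly 60% cheaper.
savings = 1 - GLM5_INPUT_PRICE / KIMI_INPUT_PRICE
print(f"GLM-5 input is {savings:.1%} cheaper")

# Illustrative monthly volume of 10M input tokens:
print(f"GLM-5:        ${input_cost(10_000_000, GLM5_INPUT_PRICE):.2f}")
print(f"Kimi v1-128k: ${input_cost(10_000_000, KIMI_INPUT_PRICE):.2f}")
```

At that volume the gap is $1.11 vs $2.78 per month on input alone; output-token rates, if they differ, would change the total picture.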
GLM-5 supports a maximum input context of 128,000 tokens, while Kimi v1-128k supports 131,072 tokens.