
GLM-5

Starting at $0.111/1M tokens
VS

Kimi v1-128k

Starting at $0.278/1M tokens

AI Overview: TL;DR

Comparing GLM-5 and Kimi v1-128k API pricing for 2026, GLM-5 is the more cost-effective option for input-heavy workloads: at a starting price of $0.111/1M tokens, it is roughly 60% cheaper than Kimi v1-128k's $0.278/1M tokens. However, your choice should also factor in context window limits (128k vs. 131k tokens) and supported modalities.
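The "approximately 60% cheaper" figure can be checked with a few lines of arithmetic. A minimal sketch, using only the starting input prices quoted above (the helper function is illustrative, not part of either provider's API):

```python
GLM5_PRICE = 0.111   # USD per 1M input tokens (GLM-5)
KIMI_PRICE = 0.278   # USD per 1M input tokens (Kimi v1-128k)

def input_cost(tokens: int, price_per_million: float) -> float:
    """Cost in USD for a given number of input tokens."""
    return tokens / 1_000_000 * price_per_million

# Relative savings of GLM-5 over Kimi v1-128k on input tokens
savings = 1 - GLM5_PRICE / KIMI_PRICE
print(f"GLM-5 is {savings:.0%} cheaper per input token")  # → 60%
```

The same `input_cost` helper can be used to estimate a monthly bill from your expected token volume.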

Technical Specifications & Pricing Table

| Feature | GLM-5 | Kimi v1-128k |
| --- | --- | --- |
| Starting Input Price | $0.111 / 1M tokens | $0.278 / 1M tokens |
| Model Category | GLM | Kimi |
| Max Context Window | 128k tokens | 131k tokens |
| Supported Modalities | text, image | text, image |
| Routed Channels Count | 3 channels | 3 channels |

Frequently Asked Questions

Is GLM-5 cheaper than Kimi v1-128k?

Yes, GLM-5 is substantially cheaper, with a starting input price of $0.111/1M tokens — about 60% less than Kimi v1-128k's $0.278/1M tokens.

Which model has a larger context window?

GLM-5 supports a maximum input context of 128,000 tokens, while Kimi v1-128k supports 131,072 tokens.
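To put the context window difference in cost terms, the sketch below estimates the input cost of a single maximum-context request under each model's starting price (prices and limits taken from the comparison above; actual bills depend on output tokens and any tiered pricing, which this ignores):

```python
# Starting input prices (USD per 1M tokens) and max context windows
models = {
    "GLM-5":        {"price": 0.111, "ctx": 128_000},
    "Kimi v1-128k": {"price": 0.278, "ctx": 131_072},
}

for name, m in models.items():
    # Cost of completely filling the context window with input tokens
    cost = m["ctx"] / 1_000_000 * m["price"]
    print(f"{name}: {m['ctx']:,} tokens ≈ ${cost:.4f} per request")
```

Even at full context, a single request's input cost stays under a nickel for either model; the larger gap is the per-token rate, not the window size.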