Zhipu GLM-5.1 coding model matches Claude Opus 4.5
Zhipu AI's new 744B-parameter GLM-5.1 model is built specifically for long-horizon agentic workflows and achieves state-of-the-art results on SWE-bench. With a 128K-token maximum output and native MCP support, it targets the complex system-development workflows established by tools like Claude Code.
GLM-5.1 marks a critical shift from simple chat-based code generation to autonomous, agentic engineering. Native integration with MCP and tools like Claude Code makes it a drop-in replacement in complex local workflows. The 128K output-token limit lets the model generate or refactor entire modules in a single pass without truncation. Training on domestic Huawei Ascend chips signals growing Chinese hardware self-reliance in the foundation-model space. The model is currently available via API, and open-source weights under an MIT license are expected soon, which would provide a powerful local alternative to proprietary models.
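As a rough illustration of how an agentic client might exploit the large output window for a whole-module rewrite, here is a minimal sketch that builds an OpenAI-style chat-completion request payload. The model identifier `glm-5.1`, the 131,072-token `max_tokens` value, and the endpoint URL are assumptions for illustration, not confirmed API details; check Zhipu's official documentation before relying on them.

```python
import json

# Assumed endpoint -- Zhipu's actual API path may differ.
API_URL = "https://open.bigmodel.cn/api/paas/v4/chat/completions"

def build_refactor_request(source_code: str, instruction: str) -> dict:
    """Build a request asking the model to rewrite an entire module in one pass."""
    return {
        "model": "glm-5.1",      # assumed model identifier
        "max_tokens": 131072,    # 128K output window: room for a full module rewrite
        "messages": [
            {"role": "system",
             "content": "You are a senior engineer. Return the complete rewritten module."},
            {"role": "user",
             "content": f"{instruction}\n\n```python\n{source_code}\n```"},
        ],
    }

payload = build_refactor_request("def add(a, b): return a + b",
                                 "Add type hints and a docstring.")
print(json.dumps(payload, indent=2))
```

The point of the sketch is the `max_tokens` setting: with a conventional 4K-8K output cap, an agent must stitch a large refactor together across many turns, while a 128K window lets it return the whole module in a single response.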
DISCOVERED
2026-03-28
PUBLISHED
2026-03-28
RELEVANCE
AUTHOR
Income Stream Surfers