OPEN_SOURCE ↗
YT · YOUTUBE // 4h ago // MODEL RELEASE
Tencent launches 295B-parameter agentic Hy3 Preview
Tencent has overhauled its AI strategy with the release of Hy3 Preview, a Mixture-of-Experts (MoE) model reconstructed from the ground up for autonomous agentic capabilities. The model features 295 billion total parameters with 21 billion activated, a 256K context window, and is being deeply integrated across Tencent's massive ecosystem of consumer and developer tools including WeChat, QQ, and Tencent Cloud.
// ANALYSIS
Tencent is pivoting from "LLM as a chatbot" to "LLM as an operating system," leveraging its dominant app footprint to embed autonomous agents at scale.
- Reconstructed architecture abandons previous foundations to optimize for reasoning and agentic workflows rather than simple text generation.
- Massive 295B scale with efficient activated parameters allows for industry-leading performance (74.4% on SWE-bench) without prohibitive inference costs.
- Deep integration into ubiquitous platforms like WeChat and QQ provides a unique distribution advantage for "agentic" services that competitors lack.
- Open-sourcing the preview version is a tactical move to foster a developer ecosystem around Hunyuan before the official commercial rollout.
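The cost argument in the bullets above rests on sparse activation: an MoE layer routes each token to only a few experts, so most of the 295B parameters sit idle on any given forward pass. A minimal sketch of top-k expert routing follows; it is purely illustrative, and the dimensions, router design, and `moe_forward` helper are assumptions for demonstration, not Hy3's actual architecture.

```python
# Illustrative top-k Mixture-of-Experts routing. The layer shapes and
# router are hypothetical; Hy3's real design is not public at this level.
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, experts, gate_w, k=2):
    """Route a token to its top-k experts and mix their outputs.

    x:       (d,) token representation
    experts: list of (d, d) weight matrices, one per expert
    gate_w:  (n_experts, d) router weights
    k:       number of experts activated per token
    """
    logits = gate_w @ x                      # one router score per expert
    top = np.argsort(logits)[-k:]            # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over selected experts
    # Only k expert matmuls execute; the remaining experts cost nothing
    # for this token -- the source of MoE's inference savings.
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

d, n_experts = 16, 8
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
gate_w = rng.standard_normal((n_experts, d))
x = rng.standard_normal(d)
y = moe_forward(x, experts, gate_w, k=2)
print(y.shape)  # same shape as the input, with only 2 of 8 experts computed
```

At Hy3's reported scale the same principle means roughly 21B of 295B parameters participate per token, which is why per-token compute tracks the activated count rather than the total.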
// TAGS
tencent · hy3-preview · moe · agentic-ai · llm · open-source · chinese-ai
DISCOVERED
2026-04-26 (4h ago)
PUBLISHED
2026-04-26 (4h ago)
RELEVANCE
8/10
AUTHOR
AI Search