OPEN_SOURCE
REDDIT // 3h ago · MODEL RELEASE
Mistral launches Medium 3.5 128B flagship
Mistral Medium 3.5 128B is Mistral’s new dense flagship model, released on Hugging Face with a 256k context window and multimodal input support. It unifies instruction following, reasoning, and coding in one set of weights, and Mistral says it replaces Medium 3.1 and Magistral in Le Chat, while also replacing Devstral 2 in its Vibe coding agent. The model card highlights strong agentic performance, including 91.4% on τ³-Telecom and 77.6% on SWE-Bench Verified.
// ANALYSIS
This looks like Mistral’s move to simplify its lineup around one high-capability dense model instead of splitting product surface area across multiple specialist variants.
- Dense 128B plus 256k context makes it attractive for teams that want a single, high-end model with fewer routing decisions.
- The "merged model" framing is the real product story here: one model for chat, coding, and reasoning, not a patchwork of separate endpoints.
- Replacing Devstral 2 in Vibe suggests Mistral thinks the agentic/coding quality is now good enough to consolidate workflows.
- The benchmark numbers are strong, but the more interesting signal is deployment simplicity for users already in the Mistral ecosystem.
// TAGS
mistral · llm · model-release · multimodal · dense-model · context-window · reasoning · coding · agentic-ai
DISCOVERED
2026-04-29 (3h ago)
PUBLISHED
2026-04-29 (4h ago)
RELEVANCE
9/10
AUTHOR
TSrake