EXAONE 4.5 lands as open-weight VLM
OPEN_SOURCE · REDDIT // 3d ago · MODEL RELEASE

LG AI Research released EXAONE 4.5, its first open-weight vision-language model, with 33B total parameters and a 262K-token context window. The Hugging Face release comes in BF16, FP8, and GGUF variants aimed at different deployment stacks.

// ANALYSIS

This is a meaningful multimodal step for EXAONE, but the real story is packaging: long context, multiple runtimes, and a model family tuned for practical deployment rather than just benchmark theater.

  • The model card frames EXAONE 4.5 as the first open-weight VLM from LG AI Research, built on the EXAONE 4.0 stack with a dedicated visual encoder
  • 262K context is a standout deployment feature for document-heavy and agentic workflows, especially if the memory footprint is manageable in FP8 or GGUF
  • Benchmark tables position it competitively on document understanding and Korean reasoning, but not as an obvious across-the-board leader
  • The license is non-commercial, which limits adoption for startups and product teams even if the weights are easy to run locally
  • The Reddit reaction suggests the usual LocalLLaMA split: interest in the model family, skepticism about benchmark claims, and annoyance at restrictive licensing
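The memory-footprint caveat in the long-context bullet can be made concrete with back-of-envelope KV-cache arithmetic. Every architecture number below (layer count, KV heads, head dimension) is an assumption chosen for illustration, not EXAONE 4.5's published config; the point is how quickly a 262K-token context dominates memory, and how much FP8 helps:

```python
# Rough KV-cache sizing for a long-context model.
# NOTE: layer/head/dim values are hypothetical, not EXAONE 4.5's real config.

def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_elem):
    # Factor of 2 covers both keys and values cached per layer.
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

# Hypothetical 33B-class config with grouped-query attention
layers, kv_heads, head_dim = 64, 8, 128
ctx = 262_144  # the advertised 262K-token window

for name, width in [("BF16", 2), ("FP8", 1)]:
    gib = kv_cache_bytes(layers, kv_heads, head_dim, ctx, width) / 2**30
    print(f"{name}: ~{gib:.0f} GiB KV cache at full 262K context")
# BF16: ~64 GiB KV cache at full 262K context
# FP8: ~32 GiB KV cache at full 262K context
```

Under these assumptions the KV cache alone at full context rivals the weights themselves, which is why the FP8 and GGUF variants matter for anyone actually planning to use the long window.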
// TAGS
exaone-4.5-33b · llm · multimodal · open-weights · reasoning

DISCOVERED

3d ago (2026-04-09)

PUBLISHED

3d ago (2026-04-09)

RELEVANCE

9/10

AUTHOR

Secure_Smoke_4280