REDDIT // 4h ago · MODEL RELEASE · OPEN_SOURCE

DeepSeek V4 drops with 1.6T MoE, 1M context

DeepSeek drops V4: a 1.6T-parameter MoE Pro model alongside a 284B-parameter Flash variant, both built for a 1M-token context and agentic coding. The release is live now across DeepSeek's official services and API.
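DeepSeek's existing API is OpenAI-compatible, so if V4 follows the same pattern, a call should look like the minimal sketch below. The model id `deepseek-v4-flash` is an assumption for illustration, not a confirmed name; check the official docs.

```python
# Minimal sketch: hitting the DeepSeek API via its OpenAI-compatible endpoint.
# Assumptions: V4 keeps the existing base_url and chat-completions surface,
# and "deepseek-v4-flash" is a hypothetical model identifier.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # set in your shell
    base_url="https://api.deepseek.com",     # DeepSeek's documented endpoint
)

resp = client.chat.completions.create(
    model="deepseek-v4-flash",  # hypothetical; confirm against official docs
    messages=[
        {"role": "user", "content": "Outline a refactor plan for a large monorepo."},
    ],
)
print(resp.choices[0].message.content)
```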

// ANALYSIS
  • V4-Pro reportedly rivals top closed-source models in reasoning and coding, reinforcing DeepSeek's open-weight strategy.
  • The 1M-token default context is the concrete standout for long-horizon agent workflows and large refactors; a rough budget sketch follows this list.
  • Flash is the cost-efficient option for broad use, while Pro is aimed at heavier workloads and larger deployments.
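For scale: at the common ~4-characters-per-token heuristic, a 1M-token window holds roughly 4 MB of source. Below is a rough sketch of checking whether an entire repo fits in one prompt; the heuristic and the file filter are assumptions for illustration, not anything DeepSeek ships.

```python
# Rough sketch: estimate whether a whole repo fits into a 1M-token context.
# Assumption: ~4 characters per token, a common rule of thumb for code;
# use a real tokenizer for an exact count.
from pathlib import Path

CONTEXT_TOKENS = 1_000_000
CHARS_PER_TOKEN = 4  # heuristic, not a DeepSeek-published figure

def estimate_repo_tokens(root: str, exts=frozenset({".py", ".ts", ".go", ".rs"})) -> int:
    """Sum characters across source files under `root` and convert to tokens."""
    total_chars = 0
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix in exts:
            total_chars += len(path.read_text(errors="ignore"))
    return total_chars // CHARS_PER_TOKEN

tokens = estimate_repo_tokens(".")
print(f"~{tokens:,} tokens; fits in 1M window: {tokens <= CONTEXT_TOKENS}")
```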
// TAGS
deepseek-v4 · llm · reasoning · open-weights · ai-coding · inference

DISCOVERED
4h ago · 2026-04-25

PUBLISHED
6h ago · 2026-04-25

RELEVANCE
10/10

AUTHOR
techlatest_net