Qwen3-Coder 30B goes open, rivals Claude Sonnet
HN · HACKER_NEWS // 2h ago · OPEN-SOURCE RELEASE

Alibaba's Qwen team open-sourced Qwen3-Coder, a family of agentic coding models including a 30B-A3B MoE variant (3B active parameters) under Apache 2.0 — delivering Claude Sonnet-level agentic coding performance at a fraction of the inference cost. The flagship 480B-A35B sets new open-model SOTA on SWE-Bench Verified, agentic browser-use, and tool-use benchmarks.

// ANALYSIS

An open-source MoE model that activates only 3B parameters yet competes with proprietary coding giants is a forcing function that reshapes the self-hosted AI stack overnight.
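The cost claim above comes down to MoE arithmetic: per-token decode compute scales with active parameters, not total. A back-of-envelope sketch, using the standard ~2·N FLOPs-per-token approximation (the 30B/3B figures are from the release; the rule of thumb is an assumption, not a measured number):

```python
# Rough per-token decode cost: roughly 2 FLOPs per ACTIVE parameter.
# This ignores attention/KV-cache costs and is only an order-of-magnitude guide.

def decode_flops_per_token(active_params: float) -> float:
    """Approximate forward-pass FLOPs per generated token (~2 per active param)."""
    return 2.0 * active_params

dense_30b = decode_flops_per_token(30e9)  # a hypothetical dense 30B model
moe_a3b = decode_flops_per_token(3e9)     # Qwen3-Coder 30B-A3B (3B active)

print(f"dense 30B : {dense_30b:.1e} FLOPs/token")
print(f"MoE A3B   : {moe_a3b:.1e} FLOPs/token")
print(f"ratio     : {dense_30b / moe_a3b:.0f}x fewer FLOPs per token")
```

Memory is a separate story: all 30B parameters must still fit in VRAM (or be quantized), which is why the consumer-hardware claim depends on quantized weights.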

  • The 30B-A3B variant fits in consumer VRAM and runs locally — democratizing agentic coding without API costs or data privacy tradeoffs
  • Trained on 7.5T tokens (70% code) with execution-driven RL across 20,000 parallel environments: this is serious infrastructure-grade training discipline, not a finetune
  • Ships with Qwen Code, an open-source CLI agent (adapted from Gemini CLI) — Alibaba is competing at the tooling layer, not just the model layer
  • 256K native context (extendable to 1M) makes it viable for repo-scale tasks where smaller context windows fall apart
  • Apache 2.0 license removes the commercial friction that limits Meta's Llama for enterprise deployment
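Self-hosting in practice usually means serving the weights behind an OpenAI-compatible endpoint. A minimal sketch of querying a locally served 30B-A3B; the port, base URL, and served model name are assumptions (typical vLLM defaults), not details from the release:

```python
# Sketch: query a locally served Qwen3-Coder-30B-A3B via an OpenAI-compatible
# API. Assumes the model is already being served, e.g. with
# `vllm serve Qwen/Qwen3-Coder-30B-A3B-Instruct` (hypothetical setup).
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # assumed default vLLM port

def build_request(prompt: str,
                  model: str = "Qwen/Qwen3-Coder-30B-A3B-Instruct") -> dict:
    """Build an OpenAI-style chat-completion payload for a coding task."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.2,  # low temperature for more deterministic code
    }

def query(prompt: str) -> str:
    """Send the payload to the local server and return the model's reply."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(query("Write a Python function that reverses a linked list."))
```

Because the endpoint speaks the OpenAI wire format, existing agent tooling (including CLI agents like Qwen Code) can be pointed at it by swapping the base URL, with no API fees and no code leaving the machine.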
// TAGS
qwen3-coder · llm · ai-coding · agent · open-weights · open-source · inference · mcp

DISCOVERED  2h ago · 2026-04-16
PUBLISHED   3h ago · 2026-04-16
RELEVANCE   9/10
AUTHOR      cmitsakis