Poolside's Laguna XS.2 lands open weights
OPEN_SOURCE
REDDIT // 3h ago // MODEL RELEASE

Poolside released Laguna XS.2, a 33B total / 3B activated MoE coding model with a 131k context window and Apache 2.0 weights. The company positions it as an open-weight agentic model for long-horizon software work, with local deployment support and a limited-time free window via its API and OpenRouter.
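For the free window mentioned above, OpenRouter exposes an OpenAI-compatible chat-completions endpoint; a minimal request body can be sketched as follows. The model slug is an assumption, not confirmed by the release — check OpenRouter's catalog for the actual ID.

```python
import json

# Hypothetical model slug; confirm against OpenRouter's model catalog.
MODEL = "poolside/laguna-xs.2"

payload = {
    "model": MODEL,
    "messages": [
        {"role": "user", "content": "Write a binary search in Python."}
    ],
    "max_tokens": 512,
}

# POST this JSON to https://openrouter.ai/api/v1/chat/completions
# with an "Authorization: Bearer <OPENROUTER_API_KEY>" header.
body = json.dumps(payload)
print(body)
```

Because the endpoint is OpenAI-compatible, any client that speaks that API (including the official `openai` SDK pointed at OpenRouter's base URL) should work unchanged.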

// ANALYSIS

This is a meaningful open-weights entry in the coding-model race: Poolside is not just shipping a checkpoint, it’s shipping the training story, agent harness, and deployment path around it.

  • Poolside claims XS.2 reaches 44.5% on SWE-bench Pro and 30.1% on Terminal-Bench 2.0, which puts it in the competitive bracket for 30B-class agentic models
  • The 33B A3B MoE design is the headline efficiency story: strong capability without the inference cost of much larger dense models
  • Apache 2.0 licensing matters here: it makes the model materially more usable for commercial teams than many “open” releases with restrictive terms
  • The company’s own benchmark setup is an internal agent harness, so the scores are directionally useful but not apples-to-apples with every public leaderboard
  • Local support across vLLM, Transformers, TRT-LLM, Ollama, and MLX lowers adoption friction for developers who want to test or fine-tune it quickly
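The efficiency claim in the 33B A3B design comes down to simple arithmetic, which is worth making explicit. The dense comparison below is illustrative back-of-envelope math, not a figure from the release.

```python
# Rough FLOPs-per-token comparison: a decoder forward pass costs roughly
# 2 * N FLOPs per token, where N is the number of parameters actually used.
total_params = 33e9    # 33B total parameters (MoE)
active_params = 3e9    # ~3B activated per token
dense_params = 33e9    # a hypothetical dense model of the same total size

moe_flops = 2 * active_params
dense_flops = 2 * dense_params
speedup = dense_flops / moe_flops

print(f"Activated fraction: {active_params / total_params:.1%}")   # ~9.1%
print(f"~{speedup:.0f}x fewer FLOPs per token than an equally sized dense model")
```

Memory is the caveat: all 33B parameters still have to be resident for inference, so the win is in compute per token, not VRAM footprint.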
// TAGS
poolside · laguna-xs.2 · llm · agent · ai-coding · open-weights · mcp

DISCOVERED

3h ago

2026-04-28

PUBLISHED

4h ago

2026-04-28

RELEVANCE

9/10

AUTHOR

Middle_Bullfrog_6173