LM Studio 0.4.7 beta adds presence penalty
OPEN_SOURCE
REDDIT · 32d ago · PRODUCT UPDATE

LM Studio’s 0.4.7 beta adds a `presence_penalty` sampling parameter, alongside API-compatibility and UI fixes. For developers using LM Studio as a local OpenAI-compatible inference endpoint, this is a small but meaningful step toward drop-in parity with cloud-first tooling.
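In practice, the new knob should slot into the same request body cloud clients already send. A minimal sketch of such a request, assuming LM Studio's default local port of 1234 and a placeholder model name (both would depend on your setup):

```python
import json

# Hypothetical request body for LM Studio's OpenAI-compatible
# chat completions endpoint:
#   POST http://localhost:1234/v1/chat/completions
# "local-model" is a placeholder for whatever model you have loaded.
payload = {
    "model": "local-model",
    "messages": [
        {"role": "user", "content": "List three uses for a paperclip."}
    ],
    "temperature": 0.7,
    # New in the 0.4.7 beta: penalizes tokens that have already
    # appeared at least once, discouraging repetition.
    "presence_penalty": 0.6,
}

body = json.dumps(payload)
print(body)
```

The point of parameter parity is that a script like this, written against a hosted OpenAI-style API, needs nothing changed except the base URL.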

// ANALYSIS

This is the kind of incremental update that matters more to builders than to casual users: one missing sampling knob can quietly break portability between local and hosted LLM workflows.

  • The official 0.4.7 beta release notes explicitly list `presence_penalty` as a new sampling parameter, confirming the Reddit post’s claim
  • LM Studio already positions itself as an OpenAI-compatible local server with `/v1/chat/completions` and `/v1/responses`, so parameter parity reduces friction for existing clients and scripts
  • The same beta also adds a `parallel` parameter to `/api/v1/load`, suggesting the team is still tightening developer-focused API ergonomics, not just polishing the desktop app
  • This is still an incremental release, not a headline-grabbing launch, but local inference tools win by steadily removing edge-case incompatibilities like this one
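For readers unfamiliar with why this particular knob matters: a toy sketch of how `presence_penalty` is conventionally defined (following the OpenAI-style formulation, where it is subtracted from the logits of any token that has already appeared; LM Studio's exact implementation is not documented in the post and may differ):

```python
from collections import Counter

def apply_penalties(logits, generated_tokens,
                    presence_penalty=0.0, frequency_penalty=0.0):
    """Adjust per-token logits given the tokens generated so far.

    Conventional OpenAI-style definition:
      logit'[t] = logit[t]
                  - frequency_penalty * count(t)
                  - presence_penalty  * (1 if count(t) > 0 else 0)
    """
    counts = Counter(generated_tokens)
    return {
        tok: logit
             - frequency_penalty * counts[tok]
             - presence_penalty * (1 if counts[tok] else 0)
        for tok, logit in logits.items()
    }

logits = {"the": 2.0, "cat": 1.5, "dog": 1.0}
adjusted = apply_penalties(logits, ["the", "the", "cat"],
                           presence_penalty=0.5, frequency_penalty=0.1)
# "the" appeared twice: 2.0 - 0.1*2 - 0.5
# "cat" appeared once:  1.5 - 0.1*1 - 0.5
# "dog" never appeared: unchanged
print(adjusted)
```

Unlike `frequency_penalty`, which scales with how often a token recurs, `presence_penalty` is a flat one-time cost, which is why a client tuned around one cannot silently substitute the other.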
// TAGS
lm-studio · llm · api · devtool · inference

DISCOVERED

32d ago

2026-03-10

PUBLISHED

36d ago

2026-03-07

RELEVANCE

8/10

AUTHOR

ZootAllures9111