OPEN_SOURCE ↗
REDDIT // 18d ago · OPEN-SOURCE RELEASE
text-generation-webui v4.2 adds Claude Code support
Text Generation Web UI is a Gradio-based local LLM interface, and v4.2 adds a new Anthropic-compatible /v1/messages endpoint so Claude Code and Cursor can talk to local models with the API shape Anthropic clients expect. The release also shrinks portable builds, refreshes the UI theme, and tightens training and API defaults.
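Because the endpoint follows Anthropic's API shape, pointing an Anthropic-native client at it should be a matter of overriding the base URL. A minimal sketch using Claude Code's standard environment variables; the host and port are assumptions about a default local setup, not values from the release notes:

```shell
# Point Claude Code at a locally running text-generation-webui instance.
# 127.0.0.1:5000 is an assumed default; adjust to your server's address.
export ANTHROPIC_BASE_URL="http://127.0.0.1:5000"
export ANTHROPIC_API_KEY="dummy"   # local servers typically ignore the key
claude
```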
// ANALYSIS
This is less a UI polish release than a protocol unlock: by speaking Anthropic's API dialect, Text Generation Web UI becomes a much more practical backend for agentic coding workflows.
- `/v1/messages` supports system messages, content blocks, tool use, tool results, image inputs, and thinking blocks, so the Claude Code compatibility is real rather than a thin shim. [Release notes](https://github.com/oobabooga/text-generation-webui/releases/tag/v4.2)
- That lowers friction for privacy-first or air-gapped teams that want Anthropic-native clients on top of self-hosted inference.
- Smaller portable builds and the no-install path make the stack easier to try on fresh machines or carry between boxes.
- Moving the OpenAI-compatible API into `modules/api`, cleaning up `--extra-flags`, and defaulting `top_p=0.95` all point to a more coherent API surface.
- Training UI cleanup and `gradient_checkpointing` by default keep the project useful for people who fine-tune locally, not just those who chat.
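To make the first point concrete, here is a sketch of the request body an Anthropic-style client sends to `/v1/messages`, exercising the system prompt, content blocks, and tool-definition fields the release claims to support. The model name and the `read_file` tool are illustrative assumptions, not part of the release notes:

```python
import json

# An Anthropic Messages API-shaped payload of the kind the new
# /v1/messages endpoint is expected to accept.
payload = {
    "model": "local-model",   # hypothetical; a local server may ignore or map this
    "max_tokens": 512,
    "system": "You are a coding assistant.",
    "tools": [{
        "name": "read_file",  # hypothetical tool definition for illustration
        "description": "Read a file from the workspace.",
        "input_schema": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    }],
    "messages": [
        # User turns carry typed content blocks rather than bare strings.
        {"role": "user", "content": [
            {"type": "text", "text": "Open src/main.py and summarize it."},
        ]},
    ],
}

body = json.dumps(payload)
```

A client would POST `body` to the server's `/v1/messages` route; tool results and image inputs go back to the model as additional typed content blocks in later turns.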
// TAGS
text-generation-webui · llm · open-source · self-hosted · api · inference · ai-coding
DISCOVERED
18d ago
2026-03-24
PUBLISHED
18d ago
2026-03-24
RELEVANCE
9/10
AUTHOR
oobabooga4