LM Studio MCP stalls in Open WebUI
REDDIT · 16d ago · INFRASTRUCTURE


LM Studio's Context7 MCP works inside its own chat UI, but the same tools disappear when the user routes prompts through Open WebUI in a Windows + Docker setup. The thread shows how local AI stacks still need explicit wiring at every layer: model server, MCP client, and web UI.

// ANALYSIS

My read: this is less a broken model server than a transport-and-registration mismatch. MCP is the right glue for local AI, but these tools still behave like separate systems until each one is configured to speak the same protocol.

  • LM Studio supports MCP in the desktop app and via API, so `/use context7` succeeding is a strong sign the Context7 side is healthy.
  • Open WebUI only gained native MCP support in v0.6.31+, and that support expects MCP over Streamable HTTP, not an OpenAPI-style tool definition.
  • If Context7 is exposed over stdio or SSE, Open WebUI will need a bridge like `mcpo`; otherwise the connection can look enabled without ever surfacing tools.
  • In mixed Windows + Docker deployments, the failure point is often reachability and config scope, not the model itself.
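If the stdio-vs-HTTP mismatch is the culprit, a bridge is the usual fix. A minimal sketch, assuming mcpo's documented CLI and the `@upstash/context7-mcp` package (the port and API key here are illustrative, not from the thread):

```shell
# Wrap the stdio Context7 MCP server in an HTTP/OpenAPI endpoint that
# Open WebUI can register as a tool server. Port and key are examples.
uvx mcpo --port 8000 --api-key "change-me" -- npx -y @upstash/context7-mcp
```

Once it's up, register `http://localhost:8000` (plus the key) under Open WebUI's external tools; the interactive docs page at `/docs` is a quick way to confirm the tools actually made it across the bridge.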
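For the Docker half, remember that `localhost` inside the Open WebUI container is the container itself, not the Windows host. A quick reachability check, assuming a container named `open-webui` and example ports (LM Studio's server defaults to 1234; 8000 for a bridged tool server is a placeholder):

```shell
# Probe host-side services from inside the container. On Docker Desktop
# for Windows, host.docker.internal resolves to the host machine.
docker exec open-webui curl -sf http://host.docker.internal:1234/v1/models  # LM Studio server
docker exec open-webui curl -sf http://host.docker.internal:8000/docs       # bridged MCP tools
```

If these fail while the same URLs work from the host, the problem is container networking and config scope, not LM Studio or Context7.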
// TAGS
lm-studio · open-webui · context7 · mcp · api · self-hosted · devtool · llm

DISCOVERED

2026-03-26 (16d ago)

PUBLISHED

2026-03-26 (16d ago)

RELEVANCE

7/10

AUTHOR

supracode