LM Studio proxy brings Copilot Chat local
REDDIT // 5d ago · OPEN_SOURCE RELEASE


copilot-ollama-proxy is a small proxy server that makes VS Code’s built-in GitHub Copilot Chat work with LM Studio by emulating the Ollama API surface Copilot expects. According to the creator, Copilot only needs Ollama for model discovery; the actual chat requests go to OpenAI-compatible endpoints. The tool bridges that gap using LM Studio’s REST API and ships as a prebuilt JavaScript release for Node or Bun.

// ANALYSIS

Hot take: this is a clever compatibility shim for people who want Copilot Chat plus local models, but it is fundamentally a workaround for an integration that still assumes Ollama-shaped behavior.

  • Strong fit for LM Studio users who do not want to install or manage Ollama.
  • The value is in the thin translation layer, not in any new model capability.
  • Likely to be brittle if Copilot changes its Ollama discovery assumptions or endpoint expectations.
  • Best suited to tinkerers and local-LLM power users, not mainstream users who want a turnkey setup.
// TAGS
lm-studio · copilot-chat · vscode · ollama · proxy · local-llm · open-source · devtool

DISCOVERED

5d ago

2026-04-06

PUBLISHED

5d ago

2026-04-06

RELEVANCE

8 / 10

AUTHOR

x0wl