LM Studio Needs Better Cursor-Style Editor
The thread asks for a Windows-native, Cursor-like code editor that works with local models from LM Studio or llama.cpp, including agents, autocomplete, lint fixing, and image chat. The consensus gap is not inference itself, but a polished editor layer that can hold context reliably and expose the model through a real coding workflow.
The honest answer is that this is still a stack problem, not a single-product problem: LM Studio is the local model server, while the editor experience is coming from plugins or newer AI-first editors. If you want the closest practical path in May 2026, you are comparing Continue, Zed, and the now-paused Void, not finding a true local Cursor clone.
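The "stack" framing is concrete: LM Studio serves models over an OpenAI-compatible HTTP API (by default at `http://localhost:1234/v1`), and every editor integration mentioned below just sends standard chat-completions requests to it. A minimal sketch of that wiring using only the Python standard library; the port is LM Studio's default, and the model name is a hypothetical placeholder for whatever is loaded locally:

```python
import json
import urllib.request

# LM Studio's default local server address (assumption: default settings).
LMSTUDIO_BASE = "http://localhost:1234/v1"

def build_chat_request(model: str, messages: list, temperature: float = 0.2):
    """Build an OpenAI-compatible /chat/completions request for a local server."""
    payload = {
        "model": model,          # whichever model is currently loaded in LM Studio
        "messages": messages,    # standard role/content chat format
        "temperature": temperature,
    }
    return urllib.request.Request(
        f"{LMSTUDIO_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "qwen2.5-coder-7b-instruct",  # hypothetical local model name
    [{"role": "user", "content": "Fix the lint error in this function."}],
)
# urllib.request.urlopen(req) would send it once the LM Studio server is running.
```

Because the endpoint speaks the OpenAI wire format, this same request shape is what Continue, Zed, or any "OpenAI-compatible provider" setting ultimately produces; the editor layer differs only in how it assembles `messages`.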
- Continue is still the most mature open-source bridge for VS Code and JetBrains, and it explicitly supports LM Studio via its OpenAI-compatible endpoint.
- LM Studio itself is strong as the local runtime, but it is not trying to be a full coding IDE, so context handling and agent orchestration depend on the editor integration.
- Void looked closest to a Cursor-style fork, with tab completion, lint detection, agent mode, and direct model connections, but its own site says the project is paused.
- Zed is now a serious contender because it is native, agentic, supports Windows, and can connect to local or OpenAI-compatible providers, but it is a different workflow than VS Code/Cursor.
- The “context gets cut off” complaint usually points to provider limits, prompt packing, or tool-loop overhead, which means swapping editors alone may not fix it.
- For screenshot/image chat, a GUI editor with built-in AI support is the right requirement; that rules out CLI-first tools, but it does not yet point to one perfect local option.
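The “context gets cut off” point deserves a concrete picture: before each request, the editor integration packs as much history as fits the model's context window and silently drops the rest, so a small budget truncates context no matter which editor sends the request. A toy sketch of that packing step; the 4-characters-per-token estimate and the budget number are illustrative assumptions, not any particular editor's actual logic:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text (assumption).
    return max(1, len(text) // 4)

def pack_messages(messages: list, budget_tokens: int) -> list:
    """Keep the newest messages that fit the token budget, dropping the oldest.

    When the budget is smaller than the conversation, early context silently
    disappears -- which is exactly the "context gets cut off" complaint.
    """
    kept = []
    used = 0
    for msg in reversed(messages):            # walk newest-first
        cost = estimate_tokens(msg["content"])
        if used + cost > budget_tokens:
            break                             # everything older is dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))               # restore chronological order

history = [
    {"role": "user", "content": "a" * 400},       # ~100 tokens, oldest
    {"role": "assistant", "content": "b" * 400},  # ~100 tokens
    {"role": "user", "content": "c" * 400},       # ~100 tokens, newest
]
packed = pack_messages(history, budget_tokens=250)
# Only the two newest messages fit; the oldest turn is silently dropped.
```

This is why swapping editors alone may not help: if the provider's reported context length, the integration's packing budget, or agent tool-call overhead is the real constraint, the same truncation reappears in the new editor.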
DISCOVERED: 2026-05-01 (3h ago)
PUBLISHED: 2026-05-01 (5h ago)
AUTHOR: jingtianli