Ollama GGUF models miss Copilot Tools
REDDIT // 3d ago · PRODUCT UPDATE

Custom GGUF models imported into Ollama, including Qwen3 Coder Next and DeepSeek Coder V2, are failing to display the mandatory "Tools" capability in VS Code’s Copilot Chat Manager. This prevents these models from appearing in the model picker for agentic tasks like codebase indexing and terminal interaction, despite working correctly in the standard Ollama API.
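One common cause is that a manually imported GGUF lacks the chat template that official library models ship with, so clients cannot detect function-calling support. A minimal sketch of inspecting an official model's template and declaring one for a custom import (the file path and TEMPLATE contents are placeholders; the real template is model-specific and should be copied from the model's documentation or an official variant):

```shell
# Inspect the Modelfile of an official library model to see
# the chat template VS Code's provider relies on (requires the ollama CLI):
ollama show qwen3:30b --modelfile

# Minimal Modelfile for a custom GGUF import. The TEMPLATE block must
# match the model's own chat/tool-call format; this is a placeholder:
cat > Modelfile <<'EOF'
FROM ./my-model.gguf
TEMPLATE """{{ .Prompt }}"""
EOF
ollama create my-custom-model -f Modelfile
```

Without a correct TEMPLATE (and any tool-call schema the model expects), Ollama can still serve completions, but downstream clients have no reliable signal that the model can emit structured tool calls.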

// ANALYSIS

The "Tools" badge is the gatekeeper for VS Code's Agent Mode, and its absence highlights the metadata configuration gap between official model registries and manual GGUF imports. VS Code strictly filters local models based on the @capability:tools attribute to ensure support for features like @workspace and @terminal. Official library models like qwen3:30b include pre-configured function-calling schemas that VS Code's native Ollama provider detects automatically, while custom imports often lack the necessary Modelfile templates. The 80B Qwen3-Coder-Next APEX is specifically optimized for multi-step agentic workflows, making this detection failure a major bottleneck for high-performance local coding setups. Forcing Ollama through an "OpenAI Compatible" provider bridge remains the most reliable workaround to expose function-calling capabilities to the Copilot UI.
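Before wiring up a provider bridge, it can help to confirm what Ollama itself reports for a model. Recent Ollama versions include a `capabilities` list (e.g. `["completion", "tools"]`) in the response from the native `/api/show` endpoint; the sketch below assumes that field and the default local port, so treat it as a diagnostic rather than an official API contract:

```python
import json
import urllib.request


def model_supports_tools(show_response: dict) -> bool:
    # "capabilities" is assumed to be a list of strings such as
    # ["completion", "tools"] in newer Ollama /api/show responses.
    return "tools" in show_response.get("capabilities", [])


def check_model(name: str, host: str = "http://localhost:11434") -> bool:
    # Query the local Ollama server's native show endpoint for one model.
    req = urllib.request.Request(
        f"{host}/api/show",
        data=json.dumps({"model": name}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return model_supports_tools(json.load(resp))
```

If `model_supports_tools` returns False for a custom import while an official model like `qwen3:30b` returns True, the gap is in the import's metadata/template rather than in VS Code's filtering.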

// TAGS
ollama · vs-code · github-copilot · ai-coding · llm · self-hosted · agent

DISCOVERED

3d ago

2026-04-09

PUBLISHED

3d ago

2026-04-08

RELEVANCE

8/10

AUTHOR

SentenceKindly4594