Ollama GGUF imports break Codex, Claude tools
OPEN_SOURCE
REDDIT // 29d ago // INFRASTRUCTURE


A LocalLLaMA user reports that Ollama models imported via GGUF fail with a “does not support tools” API error when used through Codex or Claude Code, even when the underlying models are known to support tool use. In the same setup, models pulled directly from Ollama’s registry (like gpt-oss and qwen3-coder) work, pointing to a capability-metadata mismatch in the import path rather than a general client configuration issue.
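The mismatch described above can be checked from the client side. Newer Ollama builds report a `capabilities` array in the `/api/show` response; the sketch below (model names and the exact field shape are assumptions, not confirmed from the report) shows how an agent CLI might decide up front whether a model advertises tool support:

```python
import json

def supports_tools(show_response: str) -> bool:
    """Return True if an Ollama /api/show-style response declares tool support.

    Assumes a top-level "capabilities" array as reported by recent Ollama
    builds; the field name may differ across versions.
    """
    data = json.loads(show_response)
    return "tools" in data.get("capabilities", [])

# A registry-pulled model typically declares the capability...
registry = '{"capabilities": ["completion", "tools"]}'
# ...while a GGUF import may come back without it, which would trigger a
# "does not support tools" error in clients that gate on this check.
imported = '{"capabilities": ["completion"]}'

print(supports_tools(registry))   # True
print(supports_tools(imported))   # False
```

A client that fails this check never reaches generation at all, which matches the explicit wording of the reported error.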

// ANALYSIS

This looks less like a model-quality problem and more like an Ollama capability declaration gap for GGUF-imported models, which breaks agent-style CLIs that hard-require tool support.

  • The error is explicit about tool capability, so the integration likely fails before any meaningful generation step.
  • Registry-pulled models working in the same environment is a strong control signal that the client setup is mostly correct.
  • GGUF imports can lose or mis-map chat template/tool metadata, creating “supports tools” false negatives for downstream clients.
  • For AI coding workflows, this is a practical reliability issue because Codex/Claude Code depend on stable tool-calling contracts.
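If the template/metadata loss is the cause, one workaround sometimes suggested is re-creating the import with a Modelfile that supplies a tool-aware chat template rather than relying on whatever the GGUF carries. The fragment below is a hypothetical sketch (file name and template body are placeholders, not from the report); Ollama templates use Go-template syntax, where a `{{ .Tools }}` branch is what lets the server advertise tool support:

```
# Hypothetical Modelfile: base the model on the raw GGUF file but declare
# a chat template that handles tool definitions explicitly.
FROM ./my-model.gguf
TEMPLATE """{{ if .Tools }}...tool-definition block...{{ end }}{{ .Prompt }}"""
```

Comparing this against the template of the registry build of the same model family (via `ollama show --template`) would confirm whether the import path dropped the tool-handling sections.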
// TAGS
ollama · llm · inference · cli · devtool · open-source

DISCOVERED

29d ago

2026-03-14

PUBLISHED

29d ago

2026-03-13

RELEVANCE

7/10

AUTHOR

Mixolydian-Nightmare