Cline Kanban hits provider-routing bug
OPEN_SOURCE
REDDIT · 1d ago · PRODUCT UPDATE


Official Cline docs describe Kanban as CLI-agnostic and compatible with Cline CLI, and Cline itself supports local models through Ollama, LM Studio, and OpenAI-compatible endpoints. Local LLM use is therefore a supported path in principle. If Kanban is ignoring a llama.cpp-compatible base URL and calling OpenAI directly, the most likely explanation is a bug or regression in how Kanban wires up the agent/provider settings, not an intentional cloud-only design.

// ANALYSIS

Hot take: this does not read like an “online models only” product; it reads like a config path that is being bypassed somewhere.

  • Cline Kanban is documented as working with CLI-based agents, including Cline CLI, and Cline CLI already supports local/OpenAI-compatible providers.
  • The official docs explicitly mention local models via Ollama and LM Studio, plus OpenAI-compatible endpoints.
  • If Kanban is forcing OpenAI behavior despite a local base URL, that points to a bug in Kanban’s handoff to the agent runtime or provider selection.
  • Recent community reports suggest similar OpenAI-compatible/local-model routing issues, which strengthens the bug hypothesis.
  • Best interpretation: local LLM support should be possible, but this specific flow may still be broken and is worth filing as an issue with logs attached.
// TAGS
cline · kanban · local-first · llama.cpp · openai-compatible · coding-agent · ai-tools · ai-coding

DISCOVERED

2026-05-01 (1d ago)

PUBLISHED

2026-05-01 (1d ago)

RELEVANCE

8/10

AUTHOR

PairOfRussels