Continue Ollama config hides local models
OPEN_SOURCE
REDDIT // 10d ago // TUTORIAL

This looks more like a Continue YAML-schema mismatch than an Ollama problem. Continue's current docs expect schema v1 and fold autocomplete into the main models list, so a config that looks close can still load as "No models configured."

// ANALYSIS

The likely failure mode is that the extension treats the file as invalid or legacy config and then falls back to an empty model registry.

  • Continue’s YAML docs require `schema: v1`; the posted config shows `version: 1` but no schema header, which is the biggest red flag.
  • In the YAML config, the old `tabAutocompleteModel` key is deprecated; autocomplete models should live under `models` with `roles: [autocomplete]`.
  • `apiBase: http://127.0.0.1:11434` matches the Ollama docs, so the transport layer looks fine compared with the config shape.
  • On Windows, it is worth verifying the file is actually in `%USERPROFILE%\.continue\config.yaml`, not just a path that looks right in another shell.
  • If the user just wants local Ollama detection, Continue’s `model: AUTODETECT` path is the docs-backed fallback that populates from local `ollama list`.
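
A minimal sketch of a config reflecting the points above, assuming a default local Ollama on port 11434 (the model names and tags here are illustrative, not taken from the original post):

```yaml
# %USERPROFILE%\.continue\config.yaml (Windows)
name: Local Assistant          # illustrative assistant name
version: 1.0.0
schema: v1                     # required header; `version: 1` alone is not enough
models:
  - name: Llama 3.1 8B         # example chat model; any tag from `ollama list` works
    provider: ollama
    model: llama3.1:8b
    apiBase: http://127.0.0.1:11434
    roles: [chat, edit]
  - name: Qwen2.5 Coder        # autocomplete lives in the same `models` list now
    provider: ollama
    model: qwen2.5-coder:1.5b
    apiBase: http://127.0.0.1:11434
    roles: [autocomplete]
```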
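
If per-model entries aren't needed, the docs-backed autodetect fallback reduces to a one-entry list (sketch under the same assumption of a default local Ollama):

```yaml
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: Autodetect
    provider: ollama
    model: AUTODETECT    # populates the model picker from local `ollama list`
```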
// TAGS
continue · ollama · ide · ai-coding · self-hosted · llm

DISCOVERED

10d ago

2026-04-01

PUBLISHED

10d ago

2026-04-01

RELEVANCE

7 / 10

AUTHOR

Existing-Monitor-879