OPEN_SOURCE
REDDIT // 18d ago // OPEN_SOURCE RELEASE
llmLibrarian brings cited local search to MCP
llmLibrarian is a local-first RAG engine that indexes chosen folders into ChromaDB silos and exposes them over MCP, so clients like Claude can pull cited chunks or ask Ollama for a synthesized answer. The big idea is that separate silos can be combined, letting journals, codebases, and archives act like one grounded memory layer.
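The silo-combination idea can be pictured with a minimal pure-Python sketch. The names (`Silo`, `retrieve`) and the keyword-overlap scoring are illustrative assumptions, not llmLibrarian's actual API; in the real system each silo would be a ChromaDB collection and scoring would be vector similarity.

```python
# Hypothetical sketch of the silo idea: each silo is an isolated index,
# but a query can span several silos and every hit stays cited.
# `Silo` and `retrieve` are illustrative names, not the project's real API.

class Silo:
    def __init__(self, name):
        self.name = name
        self.chunks = {}                     # chunk_id -> text

    def add(self, chunk_id, text):
        self.chunks[chunk_id] = text

    def score(self, query, text):
        # Naive keyword overlap stands in for ChromaDB's vector similarity.
        q = set(query.lower().split())
        return len(q & set(text.lower().split()))

    def search(self, query, k=2):
        hits = [(self.score(query, t), cid, t) for cid, t in self.chunks.items()]
        hits.sort(reverse=True)
        return [{"silo": self.name, "id": cid, "text": t, "score": s}
                for s, cid, t in hits[:k] if s > 0]


def retrieve(silos, query, k=2):
    """Combine silos into one retrieval space; each hit cites its source silo."""
    hits = [h for s in silos for h in s.search(query, k)]
    return sorted(hits, key=lambda h: h["score"], reverse=True)[:k]
```

Because every hit carries its silo name and chunk id, a cross-silo query over a journal silo and a code silo returns evidence that can be cited back to its folder of origin.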
// ANALYSIS
Hot take: the silo abstraction is the real moat here, not just another local search wrapper. If the metadata layer stays clean, this could feel more like a personal knowledge OS than a demo.
- `retrieve` and `retrieve_bulk` are the right primitives for an MCP-native system because they keep evidence visible instead of hiding it behind a brittle single-answer prompt.
- `ask` is a useful convenience layer, but the architecture is strongest when the same retrieval pipeline can serve both raw chunks and synthesized answers.
- Cross-silo queries are where this gets interesting: once notes, code, and docs share a retrieval space, the system can surface patterns a single folder would miss.
- The hardest part will be operational hygiene, especially multi-silo tagging and reindexing consistency in ChromaDB, which is where local knowledge tools often decay over time.
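The "one pipeline, two tools" point above can be sketched in a few lines. This is an assumption about the architecture, not llmLibrarian's actual code: `ask` simply reuses the hits that `retrieve` would return, so the synthesized answer and the raw evidence never diverge. The substring match stands in for real vector search, and no Ollama call is made here.

```python
# Sketch of why one retrieval pipeline can back both MCP tools:
# `retrieve` returns raw cited chunks; `ask` reuses the same hits
# to build an LLM prompt. Internals are illustrative assumptions.

def retrieve(index, query, k=3):
    # index: list of {"id": ..., "text": ...}; naive substring match
    # stands in for the real vector search.
    return [c for c in index if query.lower() in c["text"].lower()][:k]


def ask(index, query, llm=None):
    hits = retrieve(index, query)            # same evidence path as `retrieve`
    context = "\n".join(f"[{c['id']}] {c['text']}" for c in hits)
    prompt = f"Answer using only these sources:\n{context}\n\nQ: {query}"
    if llm is None:                          # no model wired in: return the
        return prompt                        # grounded prompt for inspection
    return llm(prompt)
```

The design payoff is that a client can always drop from `ask` down to `retrieve` and inspect exactly the chunks the answer was built from.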
// TAGS
llm-librarian · mcp · rag · vector-db · self-hosted · open-source · search
DISCOVERED
2026-03-25
PUBLISHED
2026-03-25
RELEVANCE
8 / 10
AUTHOR
Novel_Somewhere_2171