Context Federation bets on structured local memory
Context Federation is a local-first personal knowledge graph built around an MCP server, aiming to make structured context available across Claude, Cursor, ChatGPT, and other LLM clients. It splits memory into three tiers: stable properties, provenanced facts carrying confidence scores and timestamps, and traversable relationships. The current stack is TypeScript with SQLite adjacency tables behind a storage-agnostic spec, sessions are stored locally, and the team says four v0.1 specs are already drafted.
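The three tiers can be sketched as plain TypeScript types. All names and field shapes below are illustrative assumptions, not taken from the project's actual specs:

```typescript
// Tier 1: stable properties — slow-changing key/value context.
interface StableProperty {
  key: string;
  value: string;
}

// Tier 2: provenanced facts with confidence and time.
interface Fact {
  id: string;
  statement: string;
  confidence: number; // assumed 0..1 scale
  observedAt: string; // ISO-8601 timestamp
  source: string;     // provenance: where/how the fact was captured
}

// Tier 3: traversable relationships between nodes.
interface Relationship {
  from: string;
  to: string;
  kind: string;
}

// Example fact as it might be captured from a manual entry.
const fact: Fact = {
  id: "f1",
  statement: "User prefers TypeScript for tooling",
  confidence: 0.9,
  observedAt: "2026-04-20T12:00:00Z",
  source: "manual-entry",
};

console.log(fact.confidence >= 0.8); // prints true: high-confidence fact
```

The point of the separation is that each tier carries different metadata: properties need none, facts need provenance, and relationships need to be cheap to traverse.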
Hot take: this is strongest when positioned as a trustworthy memory substrate, not just “another AI memory app.” The structured tiers and provenance model are the differentiator.
- SQLite plus adjacency tables is the right default for v0.1: simple, portable, and easy to ship locally without graph-db operational overhead.
- MCP is a good primary protocol for LLM-native workflows; REST is worth adding later if the project wants broader app integration beyond tool-using models.
- Manual graph building should probably be the default, with selective auto-ingest as an opt-in, because confidence and provenance only matter if users trust the capture pipeline.
- The session-resume story across Claude, Cursor, and ChatGPT is compelling, but it needs a concrete UX to avoid feeling like infrastructure rather than a product.
- Main competitive gap versus Mem0, Zep, and Graphiti is not raw memory storage, but personal ownership, structured semantics, and cross-tool portability.
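To make the adjacency-table point concrete, here is a minimal in-memory sketch of how rows in such a table support confidence-filtered traversal. The column names and edge kinds are assumptions for illustration, not the project's schema:

```typescript
// Rows as they might sit in a SQLite adjacency table
// (from_id/to_id/kind/confidence are assumed column names).
type EdgeRow = { from_id: string; to_id: string; kind: string; confidence: number };

const edges: EdgeRow[] = [
  { from_id: "user", to_id: "proj:cf", kind: "works_on", confidence: 0.95 },
  { from_id: "proj:cf", to_id: "tech:ts", kind: "uses", confidence: 0.9 },
  { from_id: "proj:cf", to_id: "tech:neo4j", kind: "considered", confidence: 0.4 },
];

// Breadth-first traversal over the adjacency rows, dropping
// low-confidence edges so queries only surface trusted context.
function reachable(start: string, minConfidence: number): string[] {
  const seen = new Set<string>([start]);
  const queue = [start];
  while (queue.length > 0) {
    const node = queue.shift()!;
    for (const e of edges) {
      if (e.from_id === node && e.confidence >= minConfidence && !seen.has(e.to_id)) {
        seen.add(e.to_id);
        queue.push(e.to_id);
      }
    }
  }
  seen.delete(start);
  return [...seen];
}

console.log(reachable("user", 0.8)); // → [ 'proj:cf', 'tech:ts' ]
```

In real SQLite the same traversal would be a recursive query over the edge table; the design win is that the confidence threshold composes naturally with provenance, which is exactly why the capture pipeline has to be trustworthy in the first place.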
Discovered: 2026-04-20 · Published: 2026-04-20 · Author: SnooMemesjellies5137