MiroFish-Offline goes fully local with Neo4j, Ollama
YT · YOUTUBE // 21d ago · OPEN-SOURCE RELEASE


MiroFish-Offline is a fully local fork of MiroFish, the multi-agent simulation engine that turns uploaded documents into large populations of AI personas and simulates how public opinion, market sentiment, and social dynamics evolve over time. This fork replaces the original cloud-dependent stack with Neo4j Community Edition for graph memory and Ollama-hosted models for both inference and embeddings, while also translating the interface into English. The result is an offline-capable system for crisis testing, policy analysis, and market-scenario exploration that runs entirely on the user’s hardware.
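The local stack described above could be stood up with a Compose file along these lines. This is a minimal sketch of the general Neo4j-plus-Ollama pattern, not taken from the MiroFish-Offline repo; service names, ports, and credentials are illustrative assumptions.

```yaml
services:
  neo4j:
    image: neo4j:5-community          # graph memory: Community Edition, as in the fork
    environment:
      - NEO4J_AUTH=neo4j/localpass    # placeholder credentials
    ports:
      - "7474:7474"                   # Neo4j Browser (HTTP)
      - "7687:7687"                   # Bolt, for the simulation engine's driver
  ollama:
    image: ollama/ollama              # hosts both chat and embedding models locally
    ports:
      - "11434:11434"                 # Ollama REST API
```

With both services up, the engine's graph driver points at `bolt://localhost:7687` and its inference and embedding calls at `http://localhost:11434`, so no simulation traffic ever leaves the machine.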

// ANALYSIS

This is the rare “offline” rewrite that actually changes the product’s shape, not just its deployment story.

  • It swaps Zep Cloud and DashScope/OpenAI dependencies for a local graph + local model stack, which is the real headline.
  • The engine’s appeal is less about prediction accuracy and more about generating plausible, adversarial, multi-agent narratives around an event.
  • The tradeoff is obvious: you get privacy and control, but you also inherit the hardware cost and latency of running a big local simulation.
  • The English fork and Neo4j/Ollama setup make it much more accessible to non-Chinese users who want to experiment with agent swarms.
// TAGS
multi-agent · simulation · prediction · offline · ollama · neo4j · local-llm · graph-database · swarm-intelligence

DISCOVERED

2026-03-21 (21d ago)

PUBLISHED

2026-03-21 (21d ago)

RELEVANCE

8/10

AUTHOR

Github Awesome