OPEN_SOURCE
REDDIT // 24d ago · PRODUCT UPDATE
AI People adds local LLM support
GoodAI’s AI People update 0.3.0a adds local LLM support, letting NPC conversations run on the player’s PC instead of through cloud inference. That makes the game cheaper to run and more private, but it also raises the hardware bar, with a minimum of 12GB of VRAM.
// ANALYSIS
This is the rare AI-game update that actually changes the economics, not just the vibe: moving inference local turns NPC chat from a recurring cloud bill into a device capability. It also nudges AI People closer to a real testbed for on-device agents, even if the hardware requirements still keep it out of reach for a lot of players.
- GoodAI is positioning AI People as an offline-first AI sandbox, which is a much stronger story than “cloud chatbot game.”
- Local inference can eliminate token burn for scenarios, especially if players also disable TTS and speech recognition.
- The tradeoff is real: 8GB for the LLM alone plus audio tooling means the feature is still aimed at fairly beefy gaming PCs.
- The update is interesting for developers because it shows how agentic NPCs, voice, and local inference can be packaged into a consumer game.
- If GoodAI can cut the VRAM floor over time, this could become a template for more self-hosted AI gameplay loops.
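The VRAM tradeoff above can be sketched as simple budget arithmetic. This is a hypothetical illustration, not GoodAI's actual allocation: the only figures taken from the update notes are the ~8GB for the LLM alone and the 12GB stated minimum; the 4GB audio split and the function name are assumptions.

```python
# Hypothetical VRAM-budget check for the local-LLM mode.
# Figures from the update notes: ~8 GB for the LLM alone, 12 GB stated minimum.
# The audio split is an assumed breakdown, not an official one.

LLM_VRAM_GB = 8       # LLM weights + KV cache (per the update notes)
AUDIO_VRAM_GB = 4     # TTS + speech recognition headroom (assumed)

def fits_local_mode(gpu_vram_gb: float, use_audio: bool = True) -> bool:
    """Return True if a GPU with gpu_vram_gb GB can host the local stack."""
    budget = LLM_VRAM_GB + (AUDIO_VRAM_GB if use_audio else 0)
    return gpu_vram_gb >= budget
```

Under these assumptions, a 12GB card clears the full stack, while an 8GB card only fits the LLM with the audio features disabled — which matches the digest's point that disabling TTS and speech recognition lowers the effective floor.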
// TAGS
ai-people · llm · agent · inference · self-hosted · cloud
DISCOVERED
2026-03-18
PUBLISHED
2026-03-18
RELEVANCE
7/10
AUTHOR
neoexanimo