OPEN_SOURCE
REDDIT // 3h ago // MODEL RELEASE
Local Gemma 4 installations hallucinate Google hosting
Gemma 4 users report a persistent identity hallucination where local installations claim to be running on Google's cloud infrastructure. This behavioral artifact highlights the rigid pre-training biases that remain in high-density open-weights models.
// ANALYSIS
The "Google infra" hallucination is a classic training-data ghost haunting Gemma 4's local deployments. It underscores that even advanced reasoning models still rely on rigid identity markers baked into their fine-tuning data.
- LLMs lack inherent awareness of their physical hosting environment; they predict an identity based on pre-training bias and internal system instructions.
- Gemma 4 was likely fine-tuned with data in which it identified as a Google-hosted model, leading to this "unshakeable" conviction in local contexts.
- The hallucination persists across major local runners like Ollama, LM Studio, and llama.cpp, regardless of the underlying OS or hardware.
- Dense, high-parameter variants such as the 31B Dense model are particularly prone to these identity artifacts due to their strict adherence to training patterns.
- Developers should treat environmental claims from local models as purely stylistic artifacts rather than reliable system diagnostics (see the sketch below).
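// EXAMPLE
A quick way to observe the artifact, and to partially suppress it, is to query the local runner directly. This is a minimal sketch against Ollama's /api/chat endpoint; the model tag gemma4:31b is a placeholder for whatever tag your installation actually exposes for the Gemma 4 weights, and the prompts are illustrative only.

```python
import json
import urllib.request

# Placeholder tag: substitute the tag your local runner uses for Gemma 4.
MODEL = "gemma4:31b"
OLLAMA_URL = "http://localhost:11434/api/chat"

def ask(messages):
    """Send a non-streaming chat request to a local Ollama instance and return the reply text."""
    payload = json.dumps({
        "model": MODEL,
        "messages": messages,
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# 1. Without a corrective system prompt, the model may claim to run on Google infrastructure.
print(ask([
    {"role": "user", "content": "What infrastructure are you running on?"},
]))

# 2. A system message stating the actual environment usually suppresses the claim,
#    but the answer is still generated text, not a real diagnostic of the host.
print(ask([
    {"role": "system", "content": "You are an open-weights model running locally via Ollama on the user's own hardware, with no connection to Google infrastructure."},
    {"role": "user", "content": "What infrastructure are you running on?"},
]))
```

The second call typically changes the model's answer, which illustrates the point above: the claim reflects prompt and training priors, not any awareness of where the weights are actually hosted.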
// TAGS
gemma-4 · llm · self-hosted · open-weights
DISCOVERED
3h ago (2026-04-20)
PUBLISHED
7h ago (2026-04-20)
RELEVANCE
9/10
AUTHOR
Caffdy