OPEN_SOURCE
REDDIT // 36d ago · PRODUCT LAUNCH
EdgeDox brings offline PDF Q&A to Android
EdgeDox is an early Android app that runs Qwen3.5-0.8B locally with the MNN inference engine so users can ask questions about PDFs, summarize documents, and extract key points without sending files to the cloud. It is a practical privacy-first demo of on-device document AI on ordinary phones, not just flagship hardware.
// ANALYSIS
EdgeDox matters less as a polished app today than as proof that useful document Q&A is starting to fit inside a phone. If the team can keep latency, memory use, and model loading under control, this kind of offline assistant could become one of the clearest wins for edge AI.
- Running Qwen3.5-0.8B on Android with quantized weights shows mobile LLM inference is moving from novelty to usable product territory
- Document Q&A is a strong first use case because PDFs are privacy-sensitive and users have a clear reason to avoid cloud uploads
- MNN is a key part of the story here since efficient mobile inference frameworks matter as much as the model itself
- The main limitation is still the classic small-model tradeoff: tighter memory budgets mean weaker long-context handling and lower answer quality on complex documents
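The pipeline implied above — extract PDF text, chunk it, retrieve the passages most relevant to the question, then prompt the on-device model — can be sketched in a few lines. Everything below is an illustrative sketch, not EdgeDox's actual code: the function names and parameters are invented, retrieval is naive keyword overlap standing in for embedding search, and the final call to the MNN-hosted model is omitted.

```python
# Hypothetical sketch of the retrieval step in an offline PDF Q&A pipeline.
# Names and parameters are illustrative, not EdgeDox's actual API.

def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split extracted PDF text into overlapping word-window chunks."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

def retrieve(chunks: list[str], question: str, k: int = 2) -> list[str]:
    """Rank chunks by naive keyword overlap with the question
    (a real app would use embeddings; this keeps the sketch dependency-free)."""
    q = set(question.lower().split())
    scored = sorted(chunks, key=lambda c: -len(q & set(c.lower().split())))
    return scored[:k]

def build_prompt(question: str, context_chunks: list[str]) -> str:
    """Assemble the grounded prompt that would be fed to the local LLM."""
    context = "\n---\n".join(context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

On a phone, the chunk size and the number of retrieved passages are the main levers against the memory and context-length limits the last bullet describes: smaller chunks and a lower `k` keep the prompt inside a small model's window at the cost of dropped context.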
// TAGS
edgedox · llm · rag · edge-ai · data-tools
DISCOVERED
2026-03-07 (36d ago)
PUBLISHED
2026-03-07 (36d ago)
RELEVANCE
7/10
AUTHOR
abuvanth