PocketBot runs LLM agents locally on iPhone
OPEN_SOURCE ↗
REDDIT · 27d ago · PRODUCT LAUNCH


PocketBot is an iOS app in TestFlight beta that runs a quantized 3B model (Qwen3) via llama.cpp on Metal to convert plain English into iPhone automations — entirely on-device with no cloud. Two indie developers are sharing early progress and seeking community input on model selection, quantization, and sampling strategies.

// ANALYSIS

On-device LLM automation on iPhone is a compelling niche, but PocketBot is squarely in "interesting experiment" territory rather than a polished product launch.

  • Fully local inference at 3B scale on iPhone 15 Pro is technically impressive — Q4_K_M within the ~3-4GB iOS memory budget is the right call for now
  • The JSON tool call reliability problem (hallucinated params, malformed output) is a universal pain point at sub-4B scale, not a PocketBot-specific flaw
  • Separating sampling strategies by task type (low temp for structured output, higher for chat) is a well-established pattern the community will likely validate
  • The real constraint is iOS memory headroom — Q5_K_S may not be worth the tradeoff until 16GB iPhone hardware becomes common
  • No official website, Product Hunt listing, or GitHub repo — this is pre-launch community engagement, not a formal launch
// TAGS
pocketbot · edge-ai · llm · agent · automation · ios

DISCOVERED

2026-03-16 (27d ago)

PUBLISHED

2026-03-16 (27d ago)

RELEVANCE

5/10

AUTHOR

Least-Orange8487