OPEN_SOURCE
REDDIT · 11d ago · MODEL RELEASE
Liquid AI drops LFM2.5-350M for edge agents
Liquid AI has released LFM2.5-350M, a compact foundation model designed for reliable data extraction and autonomous agent loops on edge hardware. Despite its tiny 350M-parameter footprint, the model was trained on 28T tokens and uses a hybrid LIV architecture to outperform models twice its size on instruction-following and tool-calling benchmarks, while running in under 500MB of memory.
// ANALYSIS
LFM2.5-350M signals a shift toward specialized "utility" models that prioritize latency and reliability over broad general knowledge for on-device applications.
- Hybrid LIV architecture combines convolutions and attention for linear context scaling and throughput exceeding 500 t/s on modern silicon
- Optimized for extreme low-end compatibility, running efficiently on devices ranging from the Raspberry Pi 5 to the latest mobile NPUs
- Outperforms Qwen3.5-0.8B and Gemma 3 1B on BFCLv3 tool use and IFEval despite the significant parameter disadvantage
- Day-0 ecosystem support for llama.cpp, MLX, and vLLM enables immediate developer adoption across diverse hardware stacks
- Positioned as a category-leading choice for high-frequency, low-cost tasks like log filtering, PII masking, or local routing
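A model this size slots naturally into a local-routing loop: every request is first classified on-device, and only requests the small model can't handle escalate to a larger remote model. A minimal sketch of that pattern, with the on-device call stubbed by a keyword heuristic for illustration (`local_classify` and the route labels are hypothetical, not part of the release; in practice this would be a constrained-generation call into LFM2.5-350M via llama.cpp, MLX, or vLLM):

```python
# Hypothetical local-routing loop for a small on-device utility model.
# The model call is stubbed with a keyword heuristic so the sketch is
# self-contained; swap in a real inference call for production use.

ROUTES = ("faq", "pii_masking", "escalate")

def local_classify(text: str) -> str:
    """Stand-in for the on-device model: map a request to a route label."""
    lowered = text.lower()
    if any(k in lowered for k in ("ssn", "credit card", "email address")):
        return "pii_masking"
    if any(k in lowered for k in ("hours", "price", "return policy")):
        return "faq"
    return "escalate"  # unknown intent: hand off to the larger model

def route(text: str) -> str:
    label = local_classify(text)
    assert label in ROUTES
    if label == "escalate":
        return "sent to remote model"
    return f"handled locally: {label}"

print(route("What is your return policy?"))  # handled locally: faq
print(route("Mask the SSN in this record"))  # handled locally: pii_masking
print(route("Write me a sonnet"))            # sent to remote model
```

The economics follow directly: the cheap local pass absorbs the high-frequency traffic, and only the long tail pays for a round trip to a bigger model.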
// TAGS
liquid-ai · liquid-ai-lfm2-5-350m · llm · edge-ai · agent · open-weights
DISCOVERED
2026-03-31
PUBLISHED
2026-03-31
RELEVANCE
9 / 10
AUTHOR
PauLabartaBajo