OPEN_SOURCE
REDDIT // 17d ago // INFRASTRUCTURE
Apple M5 Max, dual M4 minis spark debate
A Reddit user shows off a local-AI desk built around an M5 Max 128GB MacBook Pro and two M4 Mac minis, asking what others would run on similar hardware. The thread reads less like a launch and more like a snapshot of how far Apple silicon has pushed hobbyist inference.
// ANALYSIS
This is a status check on the local-AI arms race: Apple silicon buys quiet, efficient, memory-rich boxes, while the usual counterargument is still "just get the biggest Nvidia card you can afford."
- 128GB of unified memory is the real enabler here: it makes larger quantized models and longer-context agents practical on a single machine.
- The dual M4 Mac minis hint at a home-lab clustering approach, but stacking Apple machines buys flexibility and low power draw more than linear speed scaling.
- Apple is clearly marketing the M5 Max MacBook Pro around on-device AI and local LLMs, so this setup fits the direction the platform is heading.
- The top reply captures the tradeoff perfectly: portability and silence versus raw token throughput and better price/performance from discrete GPUs.
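To see why 128GB of unified memory matters, a back-of-envelope estimate helps. The sketch below uses illustrative Llama-70B-style architecture numbers (80 layers, grouped-query attention with 8 KV heads, head dimension 128) — these are assumptions for illustration, not figures from the thread:

```python
# Back-of-envelope memory estimate for local LLM inference.
# Architecture numbers are illustrative (Llama-70B-like); adjust for your model.

def model_memory_gb(params_b: float, bits: int) -> float:
    """Weight memory in GB for params_b billion parameters at a given quantization width."""
    return params_b * 1e9 * bits / 8 / 1e9

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                context_len: int, bytes_per_elem: int = 2) -> float:
    """KV cache size in GB for one sequence, fp16 cache (2 bytes per element).
    Two tensors (K and V) per layer, each of shape [kv_heads, context_len, head_dim]."""
    return 2 * layers * kv_heads * head_dim * context_len * bytes_per_elem / 1e9

weights = model_memory_gb(70, 4)        # 70B parameters at 4-bit quantization -> ~35 GB
kv = kv_cache_gb(80, 8, 128, 32_768)    # 32k-token context -> ~10.7 GB
print(f"weights ~{weights:.0f} GB, KV cache ~{kv:.1f} GB, total ~{weights + kv:.0f} GB")
```

Roughly 46 GB total: comfortably inside 128GB of unified memory with room for the OS and other apps, but past what a single consumer GPU's VRAM holds — which is the tradeoff the thread keeps circling back to.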
// TAGS
llm · inference · self-hosted · edge-ai · agent · macbook-pro · mac-mini
DISCOVERED
2026-03-25
PUBLISHED
2026-03-25
RELEVANCE
5 / 10
AUTHOR
Excellent_Koala769