Autoresearch fork lands on Modal H100s
OPEN_SOURCE
REDDIT // 21d ago // INFRASTRUCTURE


A fork of Karpathy's autoresearch ports the autonomous 5-minute training loop to Modal's serverless H100s, so experiments can run without a local GPU or CUDA setup. The author says each run costs about $0.32, cold starts are around 2 seconds, and data persists in Modal volumes.

// ANALYSIS

This is less about a new training idea than about making autonomous research loops cheap, repeatable, and easy to run overnight. Once GPU provisioning disappears, the bottleneck shifts to prompt quality, the experiment loop, and the evaluation metric. The tiny mutable surface, the fixed 5-minute budget, persistent Modal volumes, and the reported $0.32 runs with ~2-second cold starts together make the workflow practical.
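The reported figures imply a rough budget envelope. A back-of-envelope sketch (the $0.32 per run and 5-minute budget are from the post; the implied hourly rate and overnight-window math are inferences, not quoted prices):

```python
# Back-of-envelope check on the reported run cost.
run_cost_usd = 0.32     # reported cost per 5-minute run
budget_minutes = 5      # fixed training budget per run

# Implied hourly GPU rate if the run cost is dominated by GPU time.
implied_hourly = run_cost_usd * (60 / budget_minutes)
print(f"implied rate: ${implied_hourly:.2f}/hr")  # → $3.84/hr

# Runs that fit in an 8-hour overnight window, run back to back.
overnight_runs = (8 * 60) // budget_minutes
overnight_cost = round(overnight_runs * run_cost_usd, 2)
print(f"{overnight_runs} runs ≈ ${overnight_cost}")  # → 96 runs ≈ $30.72
```

At roughly $30 for ~100 experiments per night, the cost of iteration drops well below the cost of the researcher's attention, which is the point of the analysis above.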

// TAGS
autoresearch · agent · gpu · cloud · automation · mlops · open-source

DISCOVERED

21d ago

2026-03-22

PUBLISHED

21d ago

2026-03-22

RELEVANCE

8 / 10

AUTHOR

Ready-Interest-1024