AutoResearch tops Optuna in code-space search
Experiments on the NanoChat testbed show that AutoResearch outperforms Optuna by searching directly in code space rather than tuning parameters. AutoResearch autonomously modifies training logic and architecture to find superior solutions, bypassing the local optima that constrain traditional Bayesian optimization.
Agentic code-space search leverages an LLM's structural understanding of code to outperform traditional Bayesian hyperparameter optimization. The agent uses embedded knowledge of machine learning best practices to prune the search space and can perform macro-optimizations like changing loss functions or layer types. This approach significantly reduces GPU training time, making it a more economical choice for large-scale research and development despite higher token costs.
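The contrast between the two search regimes can be illustrated with a toy sketch. The objective below is hypothetical (not from the article): a fake validation loss over a learning rate and a depth, plus a structural penalty that depends on which loss function the training code uses. A parameter-space searcher (the Optuna-style baseline) can only tune the numeric knobs, so it is stuck with whatever loss function is hard-coded; a code-space searcher may also swap the loss function, the kind of macro-edit the summary attributes to AutoResearch. All names and values here are illustrative assumptions.

```python
import random

random.seed(0)

# Toy validation loss: quadratic in learning rate, linear in depth error,
# plus a structural term fixed by the code's choice of loss function.
def val_loss(lr, depth, loss_fn):
    base = (lr - 0.01) ** 2 * 1e4 + abs(depth - 6) * 0.1
    # A hypothetical penalty: "mse" is a structurally worse choice here,
    # and no amount of numeric tuning can remove it.
    structural = {"mse": 0.5, "cross_entropy": 0.0}[loss_fn]
    return base + structural

def sample_params():
    lr = 10 ** random.uniform(-4, -1)     # log-uniform learning rate
    depth = random.randint(2, 12)          # integer network depth
    return lr, depth

# Parameter-space search: the loss function is frozen in the "code".
def parameter_search(trials=50):
    return min(val_loss(*sample_params(), "mse") for _ in range(trials))

# Code-space search: the "agent" may also rewrite the loss function.
def code_space_search(trials=50):
    return min(
        val_loss(*sample_params(), random.choice(["mse", "cross_entropy"]))
        for _ in range(trials)
    )

print("parameter-space best:", parameter_search())
print("code-space best:     ", code_space_search())
```

The point of the sketch is that `parameter_search` can never score below the structural penalty of 0.5, while `code_space_search` can, because one dimension of its search space is a change to the program itself rather than to a numeric argument.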
Discovered: 2026-04-03
Published: 2026-04-02
Author: Educational_Strain_3