OPEN_SOURCE ↗
REDDIT · 22d ago · PRODUCT LAUNCH
neuropt tunes hyperparameters from full training curves
neuropt is an open-source hyperparameter optimization package that sends per-epoch training and validation curves to an LLM after each trial, then uses that reasoning to propose the next configuration. It supports PyTorch, XGBoost, and scikit-learn, auto-detects tunable PyTorch parameters and layers, and claims small-budget benchmark wins over Optuna TPE and random search on FashionMNIST and Covertype.
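To make the loop concrete: the core idea is that after each trial the full per-epoch curve (not just the final score) is serialized into a prompt, and the model's reasoning over curve shape drives the next configuration. A minimal sketch of such a curve-aware loop, with a toy objective and a hypothetical `ask_llm` stand-in (neuropt's actual API and prompts will differ):

```python
import json
import random

def train_trial(config, epochs=5):
    """Stand-in for a real training run: returns a per-epoch val-loss curve.
    (Toy multiplicative decay; neuropt would run actual PyTorch/XGBoost training.)"""
    lr = config["lr"]
    curve, loss = [], 1.0
    for _ in range(epochs):
        # Higher lr decays faster in this toy, capped at 50% per epoch, plus noise.
        loss *= 1.0 - min(0.5, lr * 2) + random.uniform(0.0, 0.05)
        curve.append(round(loss, 4))
    return curve

def ask_llm(history):
    """Hypothetical stand-in for the LLM call. In the real tool the prompt would
    contain every trial's full curve and the model would reason about stability,
    plateaus, and wasted epochs. Here: a crude rule on the best trial's tail."""
    best = min(history, key=lambda t: t["curve"][-1])
    prev, last = best["curve"][-2], best["curve"][-1]
    lr = best["config"]["lr"]
    # Still improving in the final epochs -> push lr up; plateaued -> back off.
    new_lr = lr * 2 if (prev - last) / prev > 0.05 else lr / 2
    return {"lr": new_lr}

def tune(n_trials=4, seed=0):
    random.seed(seed)
    history, config = [], {"lr": 0.01}
    for _ in range(n_trials):
        curve = train_trial(config)
        history.append({"config": config, "curve": curve})
        config = ask_llm(history)  # next config proposed from the full curves
    return min(history, key=lambda t: t["curve"][-1])

best = tune()
print(json.dumps(best))
```

The point of the sketch is the data flow: the whole curve reaches the proposer, so decisions can depend on trajectory (slope, plateau, instability) rather than a single scalar, which is what distinguishes this from standard TPE or random search.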
// ANALYSIS
This is a smart and timely idea, especially for expensive training runs where the learning curve tells you far more than the last metric ever will.
- The curve-aware loop is the real differentiator here; it should be most useful when early-stopping signals, instability, or wasted epochs matter.
- Auto-detecting tunables in PyTorch lowers adoption friction a lot, which is often what decides whether a tool gets tried at all.
- The benchmark claim is interesting, but I’d want to see how much of the lift comes from the LLM vs. from simply having richer signals and a better trial-selection workflow.
- Main risk: prompt variance and reproducibility. If the suggestions are sensitive to wording or model choice, it may be harder to trust in serious tuning workflows.
// TAGS
hyperparameter optimization · llm · pytorch · xgboost · scikit-learn · machine learning · open source
DISCOVERED
2026-03-21
PUBLISHED
2026-03-20
RELEVANCE
8/10
AUTHOR
dloevlie