ibu-boost applies absolute-threshold rejection to GBDT split selection
REDDIT · 1d ago · OPEN SOURCE RELEASE

ibu-boost is an early open-source gradient-boosted tree library that reworks split selection around the screening transform from “Screening Is Enough,” rejecting weak candidates via an absolute threshold instead of always taking the best-ranked split. It supports both oblivious and non-oblivious trees, MSE and binary log-loss boosting, learned missing-value default directions, Triton GPU kernels, and built-in screening diagnostics plus parameter search. The launch post includes a California Housing benchmark showing it trails LightGBM on RMSE but runs faster on GPU than the CPU reference, which makes the project feel more like a research-oriented systems experiment than a finished replacement for mainstream GBDT libraries.
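To make the core idea concrete, here is a minimal NumPy sketch of threshold-based split rejection versus the usual "take the best gain" rule. The function names, the `threshold` parameter, and the plain gain comparison are illustrative assumptions; ibu-boost applies its screening transform to normalize gains before the comparison, which this sketch omits.

```python
import numpy as np

def best_split_with_rejection(gains, threshold):
    """Pick the best candidate split, but only if its gain clears an
    absolute threshold; otherwise reject the node (it becomes a leaf).

    Hypothetical sketch: a conventional GBDT would simply return
    argmax(gains) whenever any positive-gain split exists.
    """
    gains = np.asarray(gains, dtype=float)
    best = int(np.argmax(gains))
    if gains[best] < threshold:
        return None  # no candidate clears the bar -> stop splitting
    return best

def accept_rate(gain_lists, threshold):
    """Fraction of nodes where some split was accepted -- analogous in
    spirit to the accept_rate diagnostic the post describes."""
    accepted = sum(best_split_with_rejection(g, threshold) is not None
                   for g in gain_lists)
    return accepted / len(gain_lists)
```

The key behavioral difference: with a fixed absolute threshold, entire nodes can refuse to split even when a "best" candidate exists, whereas a relative rule always splits as long as any gain is positive.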

// ANALYSIS

Clever idea, but the real question is not whether the transform is elegant; it is whether it improves out-of-sample behavior enough to justify replacing a mature, well-understood split heuristic.

  • The strongest part is the framing: absolute rejection is a cleaner abstraction than `min_gain_to_split`, and the `accept_rate` diagnostic is a useful sanity check.
  • The current benchmark suggests the method is not a drop-in winner on clean tabular data; the interesting claim will be whether it helps more on noisy, high-dimensional, or sparse problems.
  • Fixed `s_w` and `s_r` are still hyperparameters, so the tuning burden is not eliminated, just relocated.
  • The Triton work is the most practically compelling piece: a fused histogram/screening path and on-device normalization are the kinds of optimizations that can make an alpha project usable.
  • The non-deterministic atomic histogram scatter is a real tradeoff; if reproducibility matters, the next step is likely a two-pass or block-local accumulation design.
  • Overall, this reads as a solid research prototype with an interesting hypothesis, not as a proven alternative to LightGBM or XGBoost yet.
// TAGS
gbdt · gradient-boosting · decision-trees · triton · gpu · tabular-ml · research

DISCOVERED

2026-04-10

PUBLISHED

2026-04-10

RELEVANCE

8/10

AUTHOR

Pleasant_Yard_8879