BDH fast-weight write-back lands open source
OPEN_SOURCE
REDDIT // 14d ago · OPEN-SOURCE RELEASE


An independent researcher released bdh-fast-weights, an Apache-licensed implementation of BDH's missing Hebbian fast-weight write-back path. On synthetic n-back recall, the write-back lifts recall from chance to near-perfect accuracy, and a selective fast-to-slow consolidation step preserves most of that gain.

// ANALYSIS

This is less a flashy launch than proof that BDH's inference-time plasticity can actually persist, and the selective consolidation result is the part worth watching. The catch is that it is still a synthetic-memory demo, so the natural-language question remains open.
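To make the mechanism concrete, here is a minimal sketch of a Hebbian fast-weight write-back of the kind the repo reportedly adds: co-activations are not just computed, they are folded back into a fast-weight matrix via an outer-product update, so a sparse key can retrieve its value later. All names (`sparse_code`, `hebbian_write_back`) are hypothetical illustrations, not the repo's actual API, and the toy dimensions are arbitrary.

```python
import numpy as np

def sparse_code(x, k):
    """Keep the k largest-magnitude activations, zero the rest
    (stand-in for a BDH-style sparse activation code)."""
    idx = np.argsort(np.abs(x))[-k:]
    out = np.zeros_like(x)
    out[idx] = x[idx]
    return out

def hebbian_write_back(F, pre, post, lr=1.0):
    """Outer-product Hebbian update, actually written back into
    the fast-weight matrix F (the step the public BDH code skipped)."""
    return F + lr * np.outer(post, pre)

# Toy n-back-style recall: store a key -> value association at step t,
# then retrieve the value from the fast weights at a later step.
rng = np.random.default_rng(0)
d = 64
F = np.zeros((d, d))
key = sparse_code(rng.standard_normal(d), k=8)
value = rng.standard_normal(d)

F = hebbian_write_back(F, pre=key, post=value)
recalled = F @ key  # ∝ value, because the sparse key addresses its slot
```

Without the `hebbian_write_back` call, `F` stays zero and `recalled` carries no information about `value`, which mirrors the chance-level baseline described above.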

  • The repo closes a real gap in the public BDH code: the Hebbian co-activation was computed before, but it was not written back into weights.
  • Sparse activation codes act as stable addresses, so the same token maps to the same fast-weight slot regardless of position.
  • The best confirmation run reaches 99.0% / 98.0% / 97.5% on n2 / n4 / n8, while the no-write-back baseline sits at 1.0% on n8.
  • Dense consolidation drops to 75.4% / 68.1% / 89.8%, but `rowtop10` preserves most of the control signal at 97.5% / 97.1% / 96.2%.
  • Independent H100 verification and counter-benchmarks in the 91-95% range make the result look reproducible rather than seed-chasing.
  • It is still a 25M-parameter model on a synthetic benchmark, so the real proof point is whether the same trick survives natural-language data.
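The `rowtop10` result above suggests that consolidation works best when only the strongest fast-weight entries per row are folded into the slow weights. A minimal sketch of that idea, assuming a row-wise top-k selection (the function name and `alpha` blend factor are illustrative, not the repo's actual interface):

```python
import numpy as np

def row_topk_consolidate(slow, fast_delta, k=10, alpha=0.5):
    """Fold only the k largest-magnitude entries per row of the
    fast-weight delta into the slow weights; discard the rest.
    A dense consolidation would instead add the full fast_delta."""
    mask = np.zeros_like(fast_delta, dtype=bool)
    topk = np.argsort(np.abs(fast_delta), axis=1)[:, -k:]
    np.put_along_axis(mask, topk, True, axis=1)
    return slow + alpha * np.where(mask, fast_delta, 0.0)

# Toy usage: consolidate a random fast-weight delta into zeroed slow weights.
rng = np.random.default_rng(1)
slow = np.zeros((4, 32))
fast_delta = rng.standard_normal((4, 32))
consolidated = row_topk_consolidate(slow, fast_delta, k=10, alpha=0.5)
```

The design intuition, consistent with the numbers above, is that sparse selection avoids overwriting unrelated slow-weight structure, which is plausibly why dense consolidation degrades accuracy while `rowtop10` retains it.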
// TAGS
bdh-fast-weights · research · open-source · llm · inference · benchmark

DISCOVERED

2026-03-29 (14d ago)

PUBLISHED

2026-03-29 (14d ago)

RELEVANCE

8 / 10

AUTHOR

fleebrun83