textrm-28M-bizmail stretches tiny email generation
REDDIT // 23d ago · MODEL RELEASE


Kamisori-daijin released textrm-28M-bizmail, a 28.19M-parameter TRM-based model trained on synthetic business-email data. It can sometimes produce coherent email-like text, but it still struggles with instruction following and consistent tone.

// ANALYSIS

This is a neat proof that small models can learn the shape of business writing faster than they learn the intent behind it. The interesting part is not that it writes perfect emails, but that it gets close enough to expose where controllability starts to break down.

  • Synthetic data seems to buy the model decent surface fluency, but not reliable refusal, follow-up, or role adherence.
  • The release is more compelling as a research artifact than a production tool, since the author already flags weak instruction following and mixed outputs.
  • For small-model work, the next gains likely come from tighter prompt formatting, intent-labeled curricula, and more explicit contrastive examples for email actions.
  • The project is a useful reminder that coherence and usefulness are different milestones, especially under tight parameter budgets.
  • If the goal is structured generation, this looks like a promising base for specialization, not a general-purpose writer.
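To make the curriculum point concrete, here is a minimal sketch of what intent-labeled, contrastive training examples for email actions could look like. Everything here is illustrative: the field names, the `<intent=…>` prompt tag, and the sample emails are assumptions, not details from the release.

```python
# Hedged sketch: intent-labeled curriculum examples for a small email model.
# The schema and intent vocabulary are hypothetical, not from textrm-28M-bizmail.

def make_example(intent: str, instruction: str, email_body: str) -> dict:
    """Pair an explicit intent tag with an instruction and a target email,
    so a tiny model can learn the action, not just the surface style."""
    return {
        "intent": intent,  # e.g. "schedule", "decline", "follow_up"
        "prompt": f"<intent={intent}> {instruction}",
        "target": email_body,
    }

# A contrastive pair for the same meeting request: one acceptance, one
# refusal, aimed at teaching role adherence rather than generic fluency.
curriculum = [
    make_example(
        "decline",
        "Politely decline the Friday meeting request.",
        "Hi Sam, thanks for the invite, but I can't make Friday. "
        "Could we find another slot?",
    ),
    make_example(
        "schedule",
        "Accept the Friday meeting request and propose 2pm.",
        "Hi Sam, Friday works for me. Does 2pm suit you?",
    ),
]
```

The idea is that pairing near-identical instructions with opposite intents gives a parameter-starved model a cleaner signal for controllability than raw synthetic emails alone.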
// TAGS
llm · research · open-source · textrm-28m-bizmail

DISCOVERED

23d ago

2026-03-20

PUBLISHED

23d ago

2026-03-20

RELEVANCE

7 / 10

AUTHOR

AdhesivenessSea9511