MacMind trains transformer in HyperCard
HN · HACKER_NEWS // 3h ago · OPEN-SOURCE RELEASE


MacMind is a 1,216-parameter, single-layer transformer written entirely in HyperTalk for HyperCard on classic Macintosh hardware. It learns the bit-reversal permutation with embeddings, attention, backpropagation, and gradient descent, and the repo includes a trained stack, a blank stack, and a Python reference implementation.
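The bit-reversal permutation maps each sequence position to the position whose binary address is the mirror image of its own. A minimal sketch of the task in Python (the bit width of 3 and the token alphabet are illustrative assumptions; the repo's Python reference implementation is the authoritative version):

```python
def bit_reverse(x: int, bits: int) -> int:
    """Reverse the low `bits` bits of x, e.g. 0b001 -> 0b100 for bits=3."""
    out = 0
    for _ in range(bits):
        out = (out << 1) | (x & 1)  # shift the lowest bit of x onto out
        x >>= 1
    return out

# Training pairs map each position's token to the token at the
# bit-reversed position (8 tokens -> 3 address bits, assumed here).
seq = list("abcdefgh")
target = [seq[bit_reverse(i, 3)] for i in range(8)]
```

Because the mapping depends only on a position's binary address, the model cannot solve it with token identity alone; it has to use its positional embeddings, which is what makes the task diagnostic.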

// ANALYSIS

Retro demo, real math. The platform is the hook, but the important part is that it makes transformer mechanics legible in an environment never meant for neural nets.

  • Everything is inspectable inside HyperCard, which turns a normally opaque model into something you can read and modify line by line
  • The bit-reversal task is a smart choice because it forces the model to learn positional structure instead of memorizing a shortcut
  • Saving the trained stack makes the model portable and persistent, so the weights behave like a real artifact rather than a throwaway demo
  • The project is a strong reminder that attention and backprop are hardware-agnostic math; only the scale changes
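The hardware-agnostic point in the last bullet can be made concrete with a minimal single-layer attention forward pass in NumPy. The dimensions, weight layout, and random initialization below are illustrative assumptions, not MacMind's actual 1,216-parameter configuration:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # stable softmax
    return e / e.sum(axis=axis, keepdims=True)

def single_layer_forward(tokens, E, P, Wq, Wk, Wv, Wo):
    """Embed, attend, project. Same math whether the host is a GPU
    cluster or a HyperCard stack; only the scale differs."""
    x = E[tokens] + P[: len(tokens)]                 # token + positional embeddings
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))   # scaled dot-product attention
    return (attn @ v) @ Wo                           # logits over the vocabulary

# Assumed toy sizes: 4-dim embeddings, 2-token vocab, length-8 sequence.
rng = np.random.default_rng(0)
d, vocab, seq_len = 4, 2, 8
E = rng.normal(size=(vocab, d))
P = rng.normal(size=(seq_len, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Wo = rng.normal(size=(d, vocab))
logits = single_layer_forward([0, 1, 1, 0, 1, 0, 0, 1], E, P, Wq, Wk, Wv, Wo)
```

Every line above is plain arithmetic over small arrays, which is exactly why the same forward pass is expressible in HyperTalk loops, just far more slowly.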

// TAGS

macmind · open-source · embedding · research · llm

DISCOVERED

3h ago

2026-04-16

PUBLISHED

9h ago

2026-04-16

RELEVANCE

9 / 10

AUTHOR

hammer32