Real-time EEG demo turns brainwaves into audiovisual performance
OPEN_SOURCE · REDDIT · 38d ago · VIDEO


A Reddit demo shows a custom pipeline that maps live EEG signals into generative music and reactive 3D visuals using TouchDesigner, Ableton Live, and OpenBCI. The creator highlights signal features like Hjorth parameters, Shannon entropy, focus/relaxation scoring, and valence estimation.
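The creator's pipeline is not public, but the features named above have standard textbook definitions. As an illustration (not the demo's actual code), a minimal NumPy sketch of Hjorth parameters and Shannon amplitude entropy on one EEG window might look like this:

```python
import numpy as np

def hjorth_parameters(x):
    """Hjorth activity, mobility, and complexity for one EEG window."""
    dx = np.diff(x)        # first difference approximates the derivative
    ddx = np.diff(dx)      # second difference
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def shannon_entropy(x, bins=32):
    """Shannon entropy of the amplitude histogram, in bits."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]           # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# Example: one second of synthetic 250 Hz "EEG" (10 Hz alpha + noise).
rng = np.random.default_rng(0)
t = np.arange(250) / 250.0
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(250)
print(hjorth_parameters(eeg))
print(shannon_entropy(eeg))
```

In a live setup these scalars would be recomputed per window and streamed (e.g. via OSC) into TouchDesigner or Ableton as control signals; the window length and bin count here are arbitrary choices for the sketch.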

// ANALYSIS

This is an interesting developer-grade neurotech prototype, but it is still an early experiment rather than a product launch.

  • Combines biosignal processing with real-time creative tooling in a practical live setup.
  • Uses interpretable EEG features (entropy, Hjorth, valence) instead of opaque black-box outputs.
  • Strong fit for creative coding and BCI experimentation, but no clear API, release, or commercialization yet.
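To make the "interpretable features" point concrete: a focus/relaxation score is often derived from the ratio of beta to alpha band power, which is transparent enough to debug by ear and eye. A hedged sketch of that heuristic (the band edges and mapping are common conventions, not the creator's confirmed method):

```python
import numpy as np

def band_power(x, fs, lo, hi):
    """Mean spectral power of x in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def focus_score(x, fs=250):
    """Map the beta/alpha balance to a 0..1 'focus' value.
    Heuristic: more beta (13-30 Hz) relative to alpha (8-12 Hz)
    is commonly read as higher engagement."""
    alpha = band_power(x, fs, 8, 12)
    beta = band_power(x, fs, 13, 30)
    return float(beta / (alpha + beta + 1e-12))

# Example: an alpha-dominant window scores low, a beta-dominant one high.
t = np.arange(500) / 250.0
relaxed = np.sin(2 * np.pi * 10 * t)   # 10 Hz alpha-band tone
busy = np.sin(2 * np.pi * 20 * t)      # 20 Hz beta-band tone
print(focus_score(relaxed), focus_score(busy))
```

Because the score is a simple power ratio, a performer can verify it behaves sensibly (eyes-closed relaxation drives it down) before wiring it to a musical parameter, which is the practical advantage over a black-box classifier.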
// TAGS
real-time-eeg-to-audiovisual-system · multimodal · audio-gen · devtool · research

DISCOVERED

2026-03-05 (38d ago)

PUBLISHED

2026-03-04 (38d ago)

RELEVANCE

6 / 10

AUTHOR

uisato