Microsoft PhotoDNA flags user's face as illegal material
A user has reported that Microsoft's PhotoDNA perceptual hashing system is incorrectly identifying his facial profile picture as Child Sexual Abuse Material (CSAM), leading to the immediate and permanent termination of over a dozen Microsoft accounts. The incident highlights the "cursed hash" problem, where a benign image becomes a permanent blacklist signature across the internet with no clear path to human-led appeal or account restoration.
This case illustrates the failure of automated safety systems that prioritize enforcement over user recourse, effectively exiling individuals from essential digital ecosystems. Because perceptual hashing matches an image even after minor edits, the ban cannot be sidestepped by resubmitting an altered copy, and the absence of human oversight leaves affected users with no path to appeal. Furthermore, the integration of these scans into local hardware raises significant concerns about client-side surveillance and the intersection of automated reporting with law enforcement.
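The edit-resistance property described above can be sketched with a toy "average hash" (aHash). This is an illustrative assumption, not PhotoDNA itself: Microsoft's actual algorithm is proprietary and far more robust. The sketch shows why a small pixel-level edit leaves the hash essentially unchanged, unlike a cryptographic hash, where any edit would produce a completely different digest.

```python
def average_hash(pixels, size=8):
    """Toy perceptual hash: downscale by block-averaging to size x size,
    then emit one bit per block (brighter than the mean or not)."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // size, w // size
    blocks = []
    for by in range(size):
        for bx in range(size):
            total = sum(
                pixels[y][x]
                for y in range(by * bh, (by + 1) * bh)
                for x in range(bx * bw, (bx + 1) * bw)
            )
            blocks.append(total / (bh * bw))
    mean = sum(blocks) / len(blocks)
    return tuple(1 if b > mean else 0 for b in blocks)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Synthetic 64x64 grayscale "image": a bright square on a dark background.
img = [[200 if 16 <= y < 48 and 16 <= x < 48 else 30 for x in range(64)]
       for y in range(64)]

# Minor edit: slightly brighten a 5x5 patch in the corner.
edited = [row[:] for row in img]
for y in range(5):
    for x in range(5):
        edited[y][x] += 10

h1, h2 = average_hash(img), average_hash(edited)
print(hamming(h1, h2))  # near-zero distance: the edit does not evade a match
```

A matcher comparing hashes with a small Hamming-distance threshold would still flag the edited copy, which is exactly why a false-positive hash follows a benign image across re-uploads and minor retouches.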
DISCOVERED
2026-04-10
PUBLISHED
2026-04-10
AUTHOR
darkzek