Adam-v2 medical model weights hit GitHub
Adam-v2 is a self-supervised medical foundation model that explicitly encodes human anatomical hierarchies into its feature space. The release includes pretrained ConvNeXt-B weights for anatomy-aware dense embeddings, and the model reports state-of-the-art results on chest X-ray benchmarks such as ChestX-ray14.
Adam-v2 shifts medical AI from generic feature extraction toward hierarchical anatomical understanding, moving beyond plain image recognition to structural awareness. Its three-branch architecture learns how anatomical parts compose into a whole, and the pretrained ConvNeXt-B weights support high-performance feature extraction for downstream tasks such as anomaly detection. Strong zero-shot transfer makes it a versatile foundation for niche medical imaging domains, and the public weights in the Eden repository lower the barrier to building anatomy-aware systems.
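One common way to use such pretrained embeddings for anomaly detection is nearest-neighbor scoring against a bank of normal-anatomy embeddings. The sketch below is illustrative only (random vectors stand in for real Adam-v2 embeddings, and the 1024-dim size matches ConvNeXt-B's final stage); the scoring rule itself is a generic technique, not something specified by the release.

```python
# Sketch: embedding-space anomaly scoring against normal references.
# Random vectors stand in for real model embeddings.
import numpy as np

def cosine_anomaly_scores(embeddings, reference, eps=1e-8):
    """Score each row of `embeddings` by 1 - (max cosine similarity to any
    row of `reference`); higher scores mean more anomalous."""
    def normalize(m):
        return m / (np.linalg.norm(m, axis=1, keepdims=True) + eps)
    sims = normalize(embeddings) @ normalize(reference).T  # pairwise cosines
    return 1.0 - sims.max(axis=1)

rng = np.random.default_rng(0)
normal = rng.normal(size=(50, 1024))  # stand-in "normal anatomy" bank
queries = np.vstack([
    normal[0] + 0.01 * rng.normal(size=1024),  # near-duplicate of a normal case
    5.0 * rng.normal(size=1024),               # unrelated direction: anomalous
])
scores = cosine_anomaly_scores(queries, normal)
print(scores)  # near-normal query scores near 0; the outlier scores much higher
```

In practice the reference bank would hold embeddings of verified-normal scans, and a threshold on the score would flag studies for review.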
DISCOVERED: 2026-03-28
PUBLISHED: 2026-03-26
AUTHOR: Typical-Owl1014