OPEN_SOURCE
REDDIT · 37d ago · RESEARCH PAPER
PHP brings parkour to humanoids
Amazon FAR and UC Berkeley researchers introduced Perceptive Humanoid Parkour (PHP), a preprint framework that lets a Unitree G1 chain human-inspired vaulting, climbing, rolling, and obstacle-traversal skills using onboard depth sensing. The standout result is not just flashy stunts but closed-loop skill selection across longer parkour sequences, including climbs up to 1.25 meters.
// ANALYSIS
This is the kind of robotics paper that feels like a real capability jump, not just a better demo reel. PHP matters because it turns parkour from isolated scripted motions into perception-driven skill composition on a real humanoid.
- The core trick is combining motion matching from human parkour data with RL-trained controllers, then distilling them into one depth-based multi-skill policy
- The robot decides when to step over, vault, climb, or roll off obstacles based on onboard sensing instead of replaying a fixed sequence
- Real-world tests on the Unitree G1 include roughly 3 m/s vaults, 1.25 m wall climbs, and minute-long traversals with obstacle perturbations
- For robotics engineers, the bigger takeaway is long-horizon chaining and smooth transitions between skills, which is a harder problem than any single athletic maneuver
- It is still a preprint evaluated on structured obstacle courses, so this is progress toward robust physical AI rather than proof that humanoids are ready for messy real-world jobs
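The closed-loop skill selection described above can be sketched as a dispatcher from perceived obstacle geometry to a skill. This is a purely illustrative assumption, not the paper's method: the thresholds, skill names, and the `Obstacle` type below are hypothetical, standing in for whatever the distilled depth-based policy learns implicitly.

```python
# Hypothetical sketch of perception-driven skill selection.
# All thresholds and skill names are illustrative assumptions,
# not taken from the PHP paper.
from dataclasses import dataclass


@dataclass
class Obstacle:
    height_m: float      # obstacle height estimated from onboard depth
    depth_m: float       # how far the obstacle extends along the path
    drop_after_m: float  # drop on the far side of the obstacle


def select_skill(obs: Obstacle) -> str:
    """Pick a parkour skill from perceived obstacle geometry."""
    if obs.height_m < 0.2:
        skill = "step_over"
    elif obs.height_m < 0.6 and obs.depth_m < 0.5:
        skill = "vault"
    else:
        skill = "climb"          # the paper reports wall climbs up to 1.25 m
    if obs.drop_after_m > 0.8:   # large drop on the far side: roll the landing
        skill += "+roll"
    return skill


print(select_skill(Obstacle(height_m=1.2, depth_m=0.3, drop_after_m=1.2)))
# climb+roll
```

In the actual system the mapping is not hand-written rules like this: the paper's contribution is that a single distilled policy makes these transitions directly from depth input, which is what enables smooth long-horizon chaining.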
// TAGS
perceptive-humanoid-parkour · robotics · research
DISCOVERED
2026-03-06 (37d ago)
PUBLISHED
2026-03-06 (37d ago)
RELEVANCE
7/10
AUTHOR
callmeteji