Eka Robotics Unveils Superhuman Gripper, VFA Model
REDDIT · 3h ago · PRODUCT LAUNCH


Wired profiles Eka Robotics, a Cambridge startup led by MIT's Pulkit Agrawal and former DeepMind researcher Tuomas Haarnoja, after it demonstrated a gripper that can sort objects, recover from fumbles, and screw in a lightbulb. The company says its Vision-Force-Action model and touch-aware grippers could push dexterous robots past the usual sim-to-real ceiling.

// ANALYSIS

This is one of the more credible physical-intelligence pitches in robotics right now, but the gap between a striking demo and a deployable product remains enormous. The VFA framing is interesting because it treats touch and force as first-class signals rather than vision plus imitation data; the lightbulb and chicken-nugget demos matter because they show recovery, contact handling, and object variability, which are the hard parts. If Eka can generalize across tasks and environments, it could become a platform story for embodied AI, not just a single robot arm. The main risk is the classic one in robotics: lab excellence that turns brittle once the hardware leaves controlled conditions. This sits closer to frontier robotics research than to mainstream developer tooling, but it is a real signal that embodied AI is maturing.

// TAGS
robotics · multimodal · automation · research · eka-robotics

DISCOVERED

3h ago

2026-04-29

PUBLISHED

4h ago

2026-04-29

RELEVANCE

7/10

AUTHOR

Recoil42