Simulation Results
Our method demonstrates dexterous grasping by adopting varied postures suited to specific objects. Given only sparse guidance, the policy adapts to the object geometry and the task intent.
Dexterous manipulation requires planning a grasp configuration suited to the object and task, which is then executed through coordinated multi-finger control. However, specifying grasp plans with dense pose or contact targets for every object and task is impractical. Meanwhile, end-to-end reinforcement learning from task rewards alone lacks controllability, making it difficult for users to intervene when failures occur. To this end, we present GRIT, a two-stage framework that learns dexterous control from sparse taxonomy guidance. GRIT first predicts a taxonomy-based grasp specification from the scene and task context. Conditioned on this sparse command, a policy generates continuous finger motions that accomplish the task while preserving the intended grasp structure. Our results show that certain grasp taxonomies are more effective for specific object geometries. By leveraging this relationship, GRIT improves generalization to novel objects over baselines and achieves an overall success rate of 87.9%. Moreover, real-world experiments demonstrate controllability: grasp strategies can be adjusted through high-level taxonomy selection based on object geometry and task intent.
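The two-stage structure described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the authors' implementation: the taxonomy classes, the `predict_taxonomy` heuristic, and the `policy_step` update rule are all stand-ins for the learned predictor and the taxonomy-conditioned RL policy.

```python
# Hypothetical sketch of a two-stage, taxonomy-conditioned grasping loop.
# All names and rules here are illustrative assumptions, not GRIT's API.
from dataclasses import dataclass

GRASP_TAXONOMY = ["power", "precision"]  # example classes only

@dataclass
class GraspSpec:
    """Sparse high-level command passed to the low-level policy."""
    taxonomy: str

def predict_taxonomy(object_width_cm: float, task: str) -> GraspSpec:
    """Stage 1 (stand-in): map scene/task context to a taxonomy class.
    A learned model would replace this hand-written rule."""
    if task == "handover" or object_width_cm < 3.0:
        return GraspSpec("precision")
    return GraspSpec("power")

def policy_step(spec: GraspSpec, finger_pos: list[float]) -> list[float]:
    """Stage 2 (stand-in): a policy conditioned on the sparse spec emits
    continuous finger motions. Here we simply move each joint toward a
    closure target implied by the taxonomy."""
    target = 1.0 if spec.taxonomy == "power" else 0.4
    return [p + 0.1 * (target - p) for p in finger_pos]
```

The key point the sketch captures is the interface: the user (or the stage-1 predictor) only picks a taxonomy label, and the continuous finger trajectory is generated downstream, which is what makes high-level intervention cheap.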
We demonstrate that naive RL suffers from hand pose bias and fails to achieve diverse grasp postures.
We deployed our method on the humanoid robot Allex.
We demonstrate that our method can achieve diverse grasp postures.
@misc{park2026grit,
      title={Learning Dexterous Grasping from Sparse Taxonomy Guidance},
      author={Juhan Park and Taerim Yoon and Seungmin Kim and Joonggil Kim and Wontae Ye and Jeongeun Park and Yoonbyung Chai and Geonwoo Cho and Geunwoo Cho and Dohyeong Kim and Kyungjae Lee and Yongjae Kim and Sungjoon Choi},
      year={2026},
      eprint={2604.04138},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2604.04138},
}