Research Paper · Researchia:202601.29095 · [Robotics]

DexTac: Learning Contact-aware Visuotactile Policies via Hand-by-hand Teaching

Xingyu Zhang

Abstract

For contact-intensive tasks, the ability to generate policies that produce rich, tactile-aware motions is essential. However, existing data collection and skill learning systems for dexterous manipulation often rely on low-dimensional tactile information. To address this limitation, we propose DexTac, a visuo-tactile manipulation learning framework based on kinesthetic teaching. DexTac captures multi-dimensional tactile data, including contact force distributions and spatial contact regions, directly from human demonstrations. By integrating these rich tactile modalities into a policy network, the resulting contact-aware agent enables a dexterous hand to autonomously select and maintain optimal contact regions during complex interactions. We evaluate our framework on a challenging unimanual injection task, where DexTac achieves a 91.67% success rate. Notably, in high-precision scenarios involving small-scale syringes, our approach outperforms force-only baselines by 31.67%. These results underscore that learning multi-dimensional tactile priors from human demonstrations is critical for achieving robust, human-like dexterous manipulation in contact-rich environments.
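The abstract describes fusing multi-dimensional tactile signals (force distributions and contact-region maps) with vision in a single policy network. The paper's actual architecture is not given here, so the following is only a minimal illustrative sketch under assumed shapes and a simple concatenation-based fusion; all dimensions, encoders, and the 22-DoF action head are hypothetical placeholders.

```python
import numpy as np

# Hypothetical sketch of a contact-aware visuotactile policy forward pass.
# The fusion scheme and all shapes are assumptions, not DexTac's architecture.

rng = np.random.default_rng(0)

def encode(x, w):
    """Linear encoder followed by a ReLU nonlinearity."""
    return np.maximum(x @ w, 0.0)

# Toy inputs: pooled camera features, a per-taxel contact force
# distribution, and a binary spatial contact-region mask.
visual  = rng.standard_normal(64)
forces  = rng.standard_normal(16)
regions = rng.integers(0, 2, 16).astype(float)

# Independent per-modality encoders (random placeholder weights).
w_v, w_f, w_r = (rng.standard_normal((d, 32)) for d in (64, 16, 16))
fused = np.concatenate([encode(visual, w_v),
                        encode(forces, w_f),
                        encode(regions, w_r)])

# Policy head maps fused features to a bounded hand command
# (22 DoF chosen arbitrarily for illustration).
w_pi = rng.standard_normal((96, 22))
action = np.tanh(fused @ w_pi)
print(action.shape)
```

In a real system each encoder would be a learned network and the head would be trained by imitation on the kinesthetic demonstrations; the sketch only shows how the three modalities could be combined into one action-producing pass.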


Source: arXiv:2601.21474v1 (http://arxiv.org/abs/2601.21474v1)
PDF: https://arxiv.org/pdf/2601.21474v1

Submission: 1/29/2026
Subjects: Robotics

