Humanoid dexterous hands from structure to gesture semantics for enhanced human-robot interaction: A review
Xin Li, Wenfu Xu, Zaiqiao Ye, Han Yuan
Biomimetic Intelligence and Robotics, 2025, Vol. 5, Issue 4: 100258
As human–robot interaction (HRI) technology advances, dexterous robotic hands are taking on a dual role: they serve both as tools for manipulation and as channels for non-verbal communication. While most existing research emphasizes improving grasping performance and structural dexterity, the semantic dimension of gestures, and its impact on user experience, has been relatively overlooked. Studies in HRI and cognitive psychology consistently show that the naturalness and cognitive empathy of gestures significantly influence user trust, satisfaction, and engagement. This shift reflects a broader transition from mechanically driven design toward cognitively empathic interaction, that is, a robot's ability to infer human affect, intent, and social context and to generate appropriate non-verbal responses. In this paper, we argue that large language models (LLMs) enable a paradigm shift in gesture control, from rule-based execution to semantics-driven, context-aware generation. By leveraging LLMs and vision-language models, robots can interpret environmental and social cues, dynamically map emotions, and generate gestures aligned with human communication norms. We present a comprehensive review of research on dexterous hand mechanics, gesture semantics, and user experience evaluation, integrating insights from linguistics and cognitive science. Furthermore, we propose a closed-loop "perception–cognition–generation–assessment" framework that guides gesture design through iterative, multimodal feedback. This framework lays a conceptual foundation for building universal, adaptive, and emotionally intelligent gesture systems for future human–robot interaction.
Human–robot interaction (HRI) / Dexterous hand / Large language models / Gesture / Communication
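To make the proposed closed loop concrete, the sketch below shows one way a "perception–cognition–generation–assessment" cycle could be wired together. This is a minimal illustrative Python sketch under our own assumptions, not an implementation from the paper: every name (Percept, Gesture, perceive, cognize, generate, assess, interaction_step) is hypothetical, the rule-based cognize function merely stands in for an LLM/VLM call, and the joint targets are toy values.

```python
"""Hypothetical sketch of a perception-cognition-generation-assessment
loop for gesture control. Names and values are illustrative only."""
from dataclasses import dataclass, field


@dataclass
class Percept:
    """Structured multimodal observation of the user and scene."""
    speech: str = ""
    affect: str = "neutral"   # e.g., inferred from facial expression or prosody
    context: str = ""         # e.g., task or social setting


@dataclass
class Gesture:
    """A symbolic gesture label plus joint targets for a dexterous hand."""
    label: str
    joint_targets: list[float] = field(default_factory=list)


def perceive(sensors: dict) -> Percept:
    """Perception: fuse vision/audio cues into a percept (stub)."""
    return Percept(speech=sensors.get("speech", ""),
                   affect=sensors.get("affect", "neutral"),
                   context=sensors.get("context", ""))


def cognize(percept: Percept) -> str:
    """Cognition: in the envisioned system an LLM/VLM infers intent and
    selects a gesture semantic; a trivial rule stands in for that call."""
    if "thanks" in percept.speech.lower() or percept.affect == "happy":
        return "thumbs_up"
    return "open_palm"


def generate(semantic: str) -> Gesture:
    """Generation: map the semantic label to hand joint targets (toy values)."""
    library = {
        "thumbs_up": [0.0, 1.4, 1.4, 1.4, 1.4],  # curl four fingers, extend thumb
        "open_palm": [0.0, 0.0, 0.0, 0.0, 0.0],
    }
    return Gesture(semantic, library[semantic])


def assess(gesture: Gesture, user_feedback: float) -> bool:
    """Assessment: accept the gesture when the scalar rating of the
    user's reaction is positive; otherwise signal a regeneration."""
    return user_feedback >= 0.5


def interaction_step(sensors: dict, user_feedback: float) -> Gesture:
    """One pass of the closed loop, with a neutral fallback on rejection."""
    percept = perceive(sensors)
    gesture = generate(cognize(percept))
    if not assess(gesture, user_feedback):
        gesture = generate("open_palm")  # fall back to a neutral gesture
    return gesture


if __name__ == "__main__":
    g = interaction_step({"speech": "Thanks for the help!", "affect": "happy"}, 0.9)
    print(g.label, g.joint_targets)
```

In a full system, the assessment stage would feed multimodal user feedback (gaze, expression, explicit ratings) back into the cognition stage across turns; the scalar rating here is a stand-in for that richer signal.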