Multimodal Perception Technology, Fusion, and Application of Robot Dexterous Hands for Complex Tasks in Intelligent Manufacturing
DOI: https://doi.org/10.54097/b88jb554

Keywords: Robotic dexterous hand, Multimodal perception, Information fusion, Smart manufacturing, Pose estimation

Abstract
To meet the flexible production-line requirements of smart manufacturing, characterized by high variety, small batches, and sub-millimeter precision, traditional single-modality vision solutions suffer from large pose errors and slow changeovers in scenarios involving occlusion, reflection, weak textures, and flexible objects. This paper systematically reviews progress in multimodal perception (visual, tactile, and force), its fusion, and its application to robotic dexterous hands. First, it outlines the principles of visual sensing, high-resolution tactile sensing, and six-dimensional force/torque sensing. It then proposes a task-oriented fusion framework, comparing the advantages and disadvantages of serial vision-guided, tactile-verified strategies against end-to-end joint modeling for 6D pose estimation, and summarizes the effectiveness of millisecond-level closed loops combining slip detection with grip-force regulation, as well as near-range tactile collaboration, in stable grasping and dexterous manipulation. Through case studies of two typical production lines, high-precision assembly and flexible-object manipulation, the paper identifies future breakthrough directions: low-cost tactile sensors, few-shot cross-modal alignment, production-line-scale datasets, and interpretable fusion frameworks. The aim is to provide a methodology and a roadmap for the transition of dexterous hands from laboratory settings to standard batch production.
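As a conceptual illustration of the slip-detection and grip-force-regulation closed loop summarized above, the following minimal sketch detects slip from sudden changes in a tactile shear signal and raises the grip-force command in response. All thresholds, gains, signal names, and the `GripController` class itself are hypothetical illustrations, not the method of any cited work; real systems run such loops at millisecond rates on embedded hardware.

```python
from dataclasses import dataclass

@dataclass
class GripController:
    """Toy slip-detection / grip-force regulation loop (illustrative only)."""
    force: float = 1.0            # current grip-force command (N)
    slip_threshold: float = 0.05  # shear-signal change treated as slip onset
    gain: float = 0.2             # force increment per detected slip event (N)
    max_force: float = 10.0       # safety ceiling on commanded force (N)
    _last_shear: float = 0.0      # previous cycle's shear reading

    def step(self, shear_signal: float) -> float:
        """One control cycle: flag slip when the tactile shear signal
        changes faster than the threshold, then bump the force command."""
        slipping = abs(shear_signal - self._last_shear) > self.slip_threshold
        self._last_shear = shear_signal
        if slipping:
            self.force = min(self.force + self.gain, self.max_force)
        return self.force

# Example: a small shear change leaves the force unchanged; a sudden
# jump is interpreted as slip and triggers a force increase.
ctrl = GripController()
ctrl.step(0.01)       # |0.01 - 0.00| < threshold: no slip, force stays 1.0
f = ctrl.step(0.20)   # |0.20 - 0.01| > threshold: slip, force rises to 1.2
```

In practice the slip detector would operate on high-frequency tactile array data rather than a single scalar, but the closed-loop structure, sense, detect, regulate, is the same.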
Copyright (c) 2026 Academic Journal of Science and Technology

This work is licensed under a Creative Commons Attribution 4.0 International License.
