The Rehabilitation Exoskeletons Based on Intention Recognition

Authors

  • Tongtian Ying

DOI:

https://doi.org/10.54097/z9t1v421

Keywords:

Rehabilitation exoskeletons, multimodal intent recognition, unimodal intent recognition.

Abstract

With the global aging population and the growing number of stroke and gout patients, the number of individuals with mobility impairments is rising. Rehabilitation exoskeletons, an emerging class of human-machine fusion assistive technology, have been widely used in recovery training for patients with neurological injuries. Motion intention recognition is the core component of human-machine interaction, and its accuracy directly determines the safety and effectiveness of rehabilitation training. Driven by advances in artificial intelligence and multimodal sensing technologies, motion intention recognition is evolving from single-modality approaches toward multimodal fusion. This paper systematically reviews the development of intent recognition for rehabilitation exoskeletons, highlighting the limitations of single-modality approaches such as electromyography and electroencephalography, which have prompted researchers to shift toward multimodal intent recognition. The current state of multimodal intent recognition is presented, covering sensing modalities, feature extraction and fusion methods, and representative application cases. The paper also summarizes current difficulties and challenges, along with potential future trends, aiming to outline developmental trajectories and provide reference and insight for subsequent research.
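To make the idea of feature-level (early) multimodal fusion concrete, the sketch below is a minimal illustrative example, not the method of this paper or of any work it surveys. It assumes synthetic sEMG and EEG windows, simple baseline features (RMS per sEMG channel, per-channel variance for EEG), three hypothetical intent classes, and an off-the-shelf SVM classifier; all of these choices are assumptions made purely for illustration.

    # Minimal sketch of feature-level (early) multimodal fusion for intent
    # classification. Illustrative only: the signals are synthetic and the
    # feature/classifier choices are common baseline assumptions.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_windows, n_emg_ch, n_eeg_ch, n_samples = 600, 4, 8, 200

    # Synthetic raw windows standing in for recorded sEMG and EEG signals.
    emg = rng.normal(size=(n_windows, n_emg_ch, n_samples))
    eeg = rng.normal(size=(n_windows, n_eeg_ch, n_samples))
    labels = rng.integers(0, 3, size=n_windows)  # e.g. rest / flex / extend

    # Unimodal feature extraction: RMS per sEMG channel, variance per EEG channel.
    emg_feat = np.sqrt((emg ** 2).mean(axis=2))
    eeg_feat = eeg.var(axis=2)

    # Early fusion: concatenate per-window feature vectors from both modalities.
    fused = np.concatenate([emg_feat, eeg_feat], axis=1)

    X_train, X_test, y_train, y_test = train_test_split(
        fused, labels, test_size=0.25, random_state=0)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

A decision-level alternative would instead train a separate classifier per modality and combine their predictions, trading tighter cross-modal coupling for robustness when one modality degrades.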

References

[1] Cross M, Ong K L, Culbreth G T, Steinmetz J D, Cousin E, Lenox H, ... Woolf A D. Global, regional, and national burden of gout, 1990–2020, and projections to 2050: a systematic analysis of the Global Burden of Disease Study 2021. The Lancet Rheumatology, 2024, 6: e507-e517.

[2] Bogue R. Robotic exoskeletons: a review of recent progress. Industrial Robot: An International Journal, 2015, 42: 5-10.

[3] Pei S, Wang J, Tian C, Li X, Guo B, Guo J, Yao Y. Assist-as-Needed Controller of a Rehabilitation Exoskeleton for Upper-Limb Natural Movements. Applied Sciences, 2025.

[4] Huang X, Ma T, Jia L, Zhang Y, Rong H, Alnabhan N. An effective multimodal representation and fusion method for multimodal intent recognition. Neurocomputing, 2023, 548: 126373.

[5] Zhang X, Qu Y, Zhang G, Wang Z, Chen C, Xu X. Review of sEMG for Exoskeleton Robots: Motion Intention Recognition Techniques and Applications. Sensors, 2025.

[6] Zhang W, Li J, Gao F, Yang D, Guo Y. Research progress of wearable technology in rehabilitation medicine. Chinese Journal of Rehabilitation Theory and Practice, 2017.

[7] Al-Quraishi M S, Elamvazuthi I, Daud S A, Parasuraman S, Borboni A. EEG-based control for upper and lower limb exoskeletons and prostheses: a systematic review. Sensors, 2018, 18: 3342.

[8] Jiang Z, Liu Y. A review of human action recognition based on deep learning. 9th International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS), 2024.

[9] Chen W, Wang S, Zhang X, Yao L, Yue L, Qian B, Li X. EEG-based motion intention recognition via multi-task RNNs. In: Proceedings of the 2018 SIAM International Conference on Data Mining. SIAM, 2018: 279–287.

[10] Zhao J, Wen Y, Li Q, Hu M, Zhou Y, Xue J, ... Li Y. Deep Learning Approaches for Multimodal Intent Recognition: A Survey. arXiv preprint arXiv:2507.22934, 2025.

[11] Wang Y, Fang Y, Huang W, Liu Z, Zhao C, Du M, Zhu L. Development of lower limb exoskeleton rehabilitation robot framework based on multi-modal motion intent detection. IEEE Access, 2025.

[12] Langlois K, Geeroms J, Van De Velde G, Rodriguez-Guerrero C, Verstraten T, Vanderborght B, Lefeber D. Improved motion classification with an integrated multimodal exoskeleton interface. Frontiers in Neurorobotics, 2021, 15: 693110.

Published

30-03-2026

Issue

Vol. 20 No. 2 (2026)

Section

Articles

How to Cite

Ying, T. (2026). The Rehabilitation Exoskeletons Based on Intention Recognition. Academic Journal of Science and Technology, 20(2), 491-495. https://doi.org/10.54097/z9t1v421