Motion intention recognition using surface electromyography and arrayed flexible thin-film pressure sensors

Lingyu BU , Xiangguo YIN , Mingxing LIN , Jiahe LIU

Journal of Measurement Science and Instrumentation ›› 2025, Vol. 16 ›› Issue (4) : 486 -497.

Special topic on smart sensing technologies for human physiology recognition

Abstract

Motion intention recognition is a key technology for enhancing the training effectiveness of upper limb rehabilitation robots for stroke patients, but traditional recognition systems struggle to balance real-time performance and reliability. To achieve real-time and accurate upper limb motion intention recognition, a multi-modal fusion method based on surface electromyography (sEMG) signals and arrayed flexible thin-film pressure (AFTFP) sensors was proposed. In experimental tests on 10 healthy subjects (5 males and 5 females, aged 23±2 years), sEMG and human-machine interaction force (HMIF) signals were collected during elbow flexion, elbow extension, and shoulder internal and external rotation. The AFTFP signals, compensated by dynamic calibration, and the sEMG signals underwent feature extraction and fusion, and the recognition performance of single and fused signals was compared using a support vector machine (SVM). The experimental results showed that the sEMG signals consistently appeared 175±25 ms earlier than the HMIF signals (p<0.01, paired t-test). Under offline conditions, the recognition accuracy of the fused signals exceeded 99.77% across different time windows. Under a 0.1 s time window, the real-time recognition accuracy of the fused signals was 14.1% higher than that of the single sEMG signal, and the system's end-to-end delay was reduced to less than 100 ms. The AFTFP sensor is applied to motion intention recognition for the first time, and its low-cost, high-density array design provides an innovative solution for rehabilitation robots. The findings demonstrate that the AFTFP sensor adopted in this study effectively enhances intention recognition performance: fusing its HMIF output with sEMG signals combines the advantages of both modalities, enabling real-time and accurate motion intention recognition.
This provides efficient command output for human-machine interaction in scenarios such as stroke rehabilitation.
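The feature-level fusion and SVM classification described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical reconstruction, not the authors' pipeline: it uses random stand-in data, assumed dimensions (4 sEMG channels, a 4×4 pressure array, 0.1 s windows of 100 samples), and standard time-domain sEMG features (RMS and mean absolute value); the actual feature set, channel count, and array geometry are specified in the full paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 200 windows, 4 sEMG channels x 100 samples,
# one 4x4 AFTFP pressure frame per window, 4 motion classes.
n_win, n_ch, n_samp = 200, 4, 100
emg = rng.normal(size=(n_win, n_ch, n_samp))
pressure = rng.uniform(size=(n_win, 4, 4))
labels = rng.integers(0, 4, size=n_win)

def emg_features(x):
    """Common time-domain sEMG features per channel: RMS and mean absolute value."""
    rms = np.sqrt(np.mean(x ** 2, axis=-1))
    mav = np.mean(np.abs(x), axis=-1)
    return np.concatenate([rms, mav], axis=-1)          # (n_win, 2 * n_ch)

def pressure_features(p):
    """Flatten each arrayed pressure frame into a feature vector."""
    return p.reshape(p.shape[0], -1)                    # (n_win, 16)

# Feature-level fusion: concatenate the two modalities' feature vectors.
fused = np.concatenate([emg_features(emg), pressure_features(pressure)], axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(fused, labels,
                                          test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
clf = SVC(kernel="rbf").fit(scaler.transform(X_tr), y_tr)
acc = clf.score(scaler.transform(X_te), y_te)
print(f"fused-feature SVM accuracy: {acc:.2f}")
```

On random data the accuracy is near chance; the point is only the structure: per-modality feature extraction, concatenation into one fused vector, then a single SVM over the fused representation.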

Keywords

upper limb rehabilitation robot / motion intention recognition / sEMG signal / arrayed flexible thin-film pressure sensor / human-machine interaction force

Cite this article

Lingyu BU, Xiangguo YIN, Mingxing LIN, Jiahe LIU. Motion intention recognition using surface electromyography and arrayed flexible thin-film pressure sensors. Journal of Measurement Science and Instrumentation, 2025, 16(4): 486-497 DOI:10.62756/jmsi.1674-8042.2025047


