Multimodal hand/finger movement sensing and fuzzy encoding for data-efficient universal sign language recognition

Caise Wei, Shiqiang Liu, Jinfeng Yuan, Rong Zhu

InfoMat ›› 2025, Vol. 7 ›› Issue (4) : e12642

RESEARCH ARTICLE


Abstract

Wearable sign language recognition helps hearing- and speech-impaired people communicate with non-signers. However, current technologies still fall short of practical use owing to limited sensing and decoding capabilities. Here, a continuous sign language recognition system is proposed with multimodal hand/finger movement sensing and fuzzy encoding; trained with a small set of word-level samples from one user, it generalizes to sentence-level recognition for new, untrained users, achieving data-efficient universal recognition. A stretchable fabric strain sensor is developed by printing conductive poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS) ink on a pre-stretched fabric wrapping a rubber band, endowing the sensor with a wide sensing range, high sensitivity, good linearity, fast dynamic response, low hysteresis, and good long-term reliability. A flexible e-skin with a homemade micro-flow sensor array is further developed to accurately capture three-dimensional hand movements. Benefiting from the fabric strain sensors for finger movement sensing, the micro-flow sensor array for 3D hand movement sensing, and human-inspired fuzzy encoding for semantic comprehension, sign language is captured accurately without interference from individual differences in motion. Experimental results show that semantic comprehension accuracy reaches 99.7% and 95% in recognizing 100 isolated words and 50 sentences, respectively, for a trained user, and 80% in recognizing 50 sentences for new, untrained users.
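The abstract does not spell out the fuzzy encoding itself, but the general idea of human-inspired fuzzy encoding can be sketched with standard triangular membership functions: a normalized strain reading is mapped to soft membership degrees over a few bend categories rather than a hard label, which is what makes the representation tolerant of individual action differences. The category names, breakpoints, and the function `fuzzy_encode` below are illustrative assumptions, not the authors' implementation.

```python
def tri(x, a, b, c):
    # Triangular membership function: 0 at a, peaks at 1 when x == b, 0 at c.
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_encode(strain):
    # Map a normalized strain reading (0 = finger straight, 1 = fully bent)
    # to soft membership degrees over three hypothetical bend categories.
    return {
        "straight":   tri(strain, -0.5, 0.0, 0.5),
        "half_bent":  tri(strain,  0.0, 0.5, 1.0),
        "fully_bent": tri(strain,  0.5, 1.0, 1.5),
    }

enc = fuzzy_encode(0.6)  # mostly "half_bent", with a small "fully_bent" degree
```

Two signers bending a finger to 0.55 versus 0.65 receive nearly identical fuzzy codes, illustrating the robustness to individual motion differences that the abstract attributes to fuzzy encoding.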

Keywords

fuzzy encoding / micro-flow sensors / sign language recognition / stretchable strain sensors / valley segmentation
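Among the keywords, "valley segmentation" names a common strategy for continuous recognition: split a motion-energy stream at low-energy valleys between signs so that each segment can be decoded in isolation. A minimal sketch under that assumption (the toy signal and threshold below are hypothetical, not from the paper):

```python
def valley_segment(energy, threshold):
    # Candidate sign boundaries: local minima of the motion-energy signal
    # that dip below the threshold (i.e., pauses between signs).
    valleys = [i for i in range(1, len(energy) - 1)
               if energy[i] < energy[i - 1]
               and energy[i] < energy[i + 1]
               and energy[i] < threshold]
    # Turn boundary indices into (start, end) index pairs covering the stream.
    bounds = [0] + valleys + [len(energy) - 1]
    return [(bounds[k], bounds[k + 1]) for k in range(len(bounds) - 1)]

# Toy motion-energy trace: two high-energy bursts separated by quiet valleys.
energy = [0.1, 0.8, 0.9, 0.2, 0.85, 0.7, 0.05, 0.6, 0.3]
segments = valley_segment(energy, 0.3)  # [(0, 3), (3, 6), (6, 8)]
```

Each returned (start, end) pair would then be passed to the word-level recognizer, which is how a model trained only on isolated words can be applied to continuous sentences.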

Cite this article

Caise Wei, Shiqiang Liu, Jinfeng Yuan, Rong Zhu. Multimodal hand/finger movement sensing and fuzzy encoding for data-efficient universal sign language recognition. InfoMat, 2025, 7(4): e12642. DOI: 10.1002/inf2.12642



RIGHTS & PERMISSIONS

2024 The Author(s). InfoMat published by UESTC and John Wiley & Sons Australia, Ltd.
