Deep Learning-Assisted Electronic Skin System Capable of Capturing Spatiotemporal and Mechanical Features of Social Touch to Enhance Human–Robot Emotion Recognition

Jinrong Huang , Yuqiong Sun , Yongchang Jiang , Jie-an Li , Xidi Sun , Xun Cao , Youdou Zheng , Lijia Pan , Yi Shi

SmartMat ›› 2025, Vol. 6 ›› Issue (1) : e1325 DOI: 10.1002/smm2.1325
RESEARCH ARTICLE

Abstract

In human interactions, social touch is widely used to convey emotion. Enabling robots to understand and respond to emotions communicated through touch is therefore critical to advancing human–robot interaction and significantly enhances robots' service capabilities. A persistent challenge, however, is capturing social touch dynamically with sufficient spatiotemporal and mechanical resolution for deep haptic data analysis. This study presents a robotic system combining flexible electronic skin with a high-frequency signal circuit and deep neural networks to recognize the emotions conveyed by social touch. The electronic skin, made from double cross-linked ionogels and microstructured arrays, has a low pressure-detection threshold (8 Pa) and a wide perception range (0–150 kPa), enhancing the mechanical resolution of touch signals. A high-speed readout circuit captures the spatiotemporal features of social touch gestures at 30 Hz, enabling precise analysis of touch interactions. A 3D convolutional neural network with a Squeeze-and-Excitation attention module achieves 87.12% accuracy in recognizing social touch gestures, improving the understanding of emotions conveyed through touch. The system's effectiveness is validated through interactive demonstrations with robotic dogs and humanoid robots, demonstrating its potential to enhance the emotional intelligence of robots.
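The Squeeze-and-Excitation (SE) attention module mentioned above reweights feature channels according to their global context: each channel of a spatiotemporal feature map is pooled to a single statistic, passed through a small bottleneck MLP, and the resulting sigmoid gate rescales that channel. The following is a minimal NumPy sketch of this operation on a C × T × H × W tactile feature map; the function name, shapes (e.g., a 30-frame window from the 30 Hz readout over an 8 × 8 taxel grid), and weights are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_attention_3d(feat, w1, w2):
    """Squeeze-and-Excitation channel attention over a 3D feature map.

    feat: (C, T, H, W) spatiotemporal feature map
    w1:   (C // r, C) weights of the squeeze (bottleneck) FC layer
    w2:   (C, C // r) weights of the excitation FC layer
    Returns the channel-reweighted feature map, same shape as feat.
    """
    # Squeeze: global average pool over time and space -> (C,)
    z = feat.mean(axis=(1, 2, 3))
    # Excitation: bottleneck MLP (ReLU) followed by sigmoid gating -> (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))
    # Scale: multiply each channel of the feature map by its gate
    return feat * s[:, None, None, None]

# Toy usage: 8 channels, 30 frames (1 s at 30 Hz), 8x8 taxel grid
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 30, 8, 8))
w1 = rng.standard_normal((2, 8)) * 0.1   # reduction ratio r = 4
w2 = rng.standard_normal((8, 2)) * 0.1
out = se_attention_3d(feat, w1, w2)
assert out.shape == feat.shape
```

In a full 3D CNN, this gating sits after a convolutional block, letting the network emphasize channels whose temporal response patterns discriminate between gestures (e.g., a sustained press versus a rapid pat).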

Keywords

deep learning / electronic skin / human–robot interaction / ionogels / piezocapacitance

Cite this article

Jinrong Huang, Yuqiong Sun, Yongchang Jiang, Jie-an Li, Xidi Sun, Xun Cao, Youdou Zheng, Lijia Pan, Yi Shi. Deep Learning-Assisted Electronic Skin System Capable of Capturing Spatiotemporal and Mechanical Features of Social Touch to Enhance Human–Robot Emotion Recognition. SmartMat, 2025, 6(1): e1325. DOI: 10.1002/smm2.1325



RIGHTS & PERMISSIONS

© 2025 The Authors. SmartMat published by Tianjin University and John Wiley & Sons Australia, Ltd.
