Attention emotion recognition via ECG signals

Aihua Mao, Zihui Du, Dayu Lu, Jie Luo

Quant. Biol., 2022, 10(3): 276–286. DOI: 10.15302/J-QB-021-0267
RESEARCH ARTICLE

Abstract

Background: Physiological signal-based research has been an active topic in affective computing. Previous work mainly focuses on strong, short-lived emotions (e.g., joy, anger), whereas attention, a weak and long-lasting affective state, has received far less consideration. In this paper, we present a study of attention recognition based on electrocardiogram (ECG) signals, which carry a wealth of emotion-related information.

Methods: The ECG dataset is collected from 10 subjects and tailored to attention detection. To mitigate baseline wander and power-line interference, we apply wavelet threshold denoising as preprocessing and extract rich features with the Pan-Tompkins and wavelet decomposition algorithms. To improve generalization, we evaluate a variety of combinations of feature selection algorithms and classifiers.
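
A minimal Python sketch of this preprocessing and feature-extraction stage is given below. It assumes a raw single-lead ECG array `ecg` sampled at `fs` Hz and uses PyWavelets and SciPy; the wavelet family, decomposition level, thresholds, and the simplified Pan-Tompkins-style R-peak detector are illustrative assumptions, not the authors' exact settings.

```python
# Sketch of the preprocessing and feature extraction described in Methods.
# Inputs `ecg` (1-D numpy array) and `fs` (sampling rate in Hz) are assumed;
# the wavelet, level, and thresholds below are illustrative, not the paper's.
import numpy as np
import pywt
from scipy.signal import butter, filtfilt, find_peaks

def wavelet_denoise(ecg, wavelet="db6"):
    """Soft-threshold wavelet denoising; zeroing the coarsest approximation
    also suppresses baseline wander."""
    level = pywt.dwt_max_level(len(ecg), pywt.Wavelet(wavelet).dec_len)
    coeffs = pywt.wavedec(ecg, wavelet, level=min(level, 8))
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(ecg)))          # universal threshold
    coeffs[0] = np.zeros_like(coeffs[0])                   # drop baseline drift
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]

def rr_features(ecg, fs):
    """Simplified Pan-Tompkins-style R-peak detection plus basic HRV features."""
    b, a = butter(3, [5.0 / (fs / 2), 15.0 / (fs / 2)], btype="band")
    qrs = filtfilt(b, a, ecg)
    energy = np.convolve(np.gradient(qrs) ** 2,            # derivative + squaring
                         np.ones(int(0.15 * fs)) / (0.15 * fs), mode="same")
    peaks, _ = find_peaks(energy, distance=int(0.25 * fs),
                          height=0.4 * energy.max())
    rr = np.diff(peaks) / fs                                # RR intervals (s)
    return {"mean_hr": 60.0 / rr.mean(),
            "sdnn": rr.std(),
            "rmssd": np.sqrt(np.mean(np.diff(rr) ** 2))}
```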

Results: Experiments show that the combination of a genetic algorithm and a random forest achieves the highest correct classification rate (CCR) of 86.3%.
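
The feature-selection and classification stage can be sketched as a wrapper-style genetic algorithm around a random forest, in the spirit of the GA + random forest combination reported above. The population size, GA operators, and cross-validation setup below are assumptions for illustration, and `X` (samples × features) and `y` (attention labels) are placeholders for the extracted ECG features.

```python
# Illustrative wrapper-style genetic-algorithm feature selection around a
# random forest. GA settings and CV folds are assumptions; `X` and `y` are
# placeholders for the extracted ECG feature matrix and attention labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Cross-validated accuracy of a random forest on the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.05):
    """Evolve binary feature masks; return the best mask found."""
    pop = rng.integers(0, 2, size=(pop_size, X.shape[1]))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        # Tournament selection: keep the better of two random individuals.
        winners = [max(rng.integers(0, pop_size, 2), key=lambda i: scores[i])
                   for _ in range(pop_size)]
        parents = pop[winners]
        # Uniform crossover between each parent and its neighbour.
        take = rng.integers(0, 2, size=parents.shape).astype(bool)
        children = np.where(take, parents, np.roll(parents, 1, axis=0))
        # Bit-flip mutation.
        flip = rng.random(children.shape) < p_mut
        pop = np.where(flip, 1 - children, children)
    return pop[np.argmax([fitness(ind, X, y) for ind in pop])].astype(bool)

# Usage (hypothetical data): mask = ga_select(X, y); score = fitness(mask, X, y)
```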

Conclusion: This study demonstrates the feasibility and promise of ECG-based attention research.

Author summary

Our work aims to uncover the connection between ECG signals and the attentive emotional state, and demonstrates the feasibility of applying ECG signals to attention recognition.

Keywords

affective computing / attention recognition / ECG signals

Cite this article

Aihua Mao, Zihui Du, Dayu Lu, Jie Luo. Attention emotion recognition via ECG signals. Quant. Biol., 2022, 10(3): 276‒286. https://doi.org/10.15302/J-QB-021-0267

ACKNOWLEDGEMENTS

This work was financially supported by the Natural Science Foundation of Guangdong Province (No. 2019A1515010833), the Fundamental Research Funds for the Central Universities (No. 2020ZYGXZR089), and the Social Science Research Base of Guangdong Province (Research Center of Network Civilization in New Era, SCUT).

COMPLIANCE WITH ETHICS GUIDELINES

The authors Aihua Mao, Zihui Du, Dayu Lu and Jie Luo declare that they have no conflicts of interest or financial conflicts to disclose. All procedures performed in the studies were in accordance with the ethical standards of the institution or practice at which the studies were conducted, and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

OPEN ACCESS

This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

RIGHTS & PERMISSIONS

© 2021 The Author(s). Published by Higher Education Press.