ArtEEGAttention: an advanced deep learning approach for art brain decoding

Shuming HU , Shu ZHANG , Ying ZHANG , Zhu WANG , Bin GUO , Zhiwen YU

Front. Comput. Sci. ›› 2026, Vol. 20 ›› Issue (8) : 2008346 DOI: 10.1007/s11704-025-50344-w
Artificial Intelligence
RESEARCH ARTICLE


Abstract

The ability to interpret how the brain processes visual art via brain imaging techniques offers significant insight into the cognitive mechanisms behind aesthetic appreciation. This study investigates these mechanisms by analyzing electroencephalography (EEG) data from participants performing two different tasks: gazing at a blank wall and viewing artworks. We propose ArtEEGAttention, a novel deep learning architecture that employs sliding window convolution and multi-head self-attention mechanisms to distinguish these two viewing scenarios. Evaluated on a dataset of 16 individuals, with EEG signals segmented into 3-second epochs and labeled by viewing condition, the model achieved a cross-subject accuracy of 77.96%. Its strong performance, particularly on certain subjects, highlights the model's robustness and its generalization across varied brain responses to art.
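The abstract mentions that the EEG recordings are segmented into 3-second epochs before classification. A minimal sketch of that preprocessing step, assuming a hypothetical sampling rate of 250 Hz (the actual rate is not stated on this page):

```python
def epoch_eeg(signal, srate=250, epoch_sec=3):
    """Split a 1-D EEG channel into non-overlapping fixed-length epochs.

    srate and epoch_sec are illustrative; trailing samples that do not
    fill a complete epoch are discarded.
    """
    win = srate * epoch_sec  # samples per epoch (750 at 250 Hz, 3 s)
    n_epochs = len(signal) // win
    return [signal[i * win:(i + 1) * win] for i in range(n_epochs)]

# Example: 10 s of dummy data at 250 Hz yields three full 3-s epochs
dummy = list(range(2500))
epochs = epoch_eeg(dummy)
print(len(epochs), len(epochs[0]))  # 3 750
```

Each resulting epoch would then be labeled by viewing condition (blank wall vs. artwork) and fed to the sliding-window convolution and attention layers described in the paper.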

Keywords

electroencephalography / deep learning / neuroaesthetics / visual art

Cite this article

Download citation ▾
Shuming HU, Shu ZHANG, Ying ZHANG, Zhu WANG, Bin GUO, Zhiwen YU. ArtEEGAttention: an advanced deep learning approach for art brain decoding. Front. Comput. Sci., 2026, 20(8): 2008346 DOI: 10.1007/s11704-025-50344-w


RIGHTS & PERMISSIONS

Higher Education Press
