Different neural mechanisms for nonsalient trained stimuli and physically salient stimuli in visual processing

Zile Wang, Qi Zhang, Yuxiang Hao, Shuangxing Xu

PsyCh Journal, 2024, Vol. 13, Issue 2: 227–241. DOI: 10.1002/pchj.718

ORIGINAL ARTICLE

Abstract

Previous studies have shown that nonsalient trained stimuli can capture attention and are actively suppressed when serving as distractors. However, it was unclear whether nonsalient trained stimuli and physically salient stimuli operate through the same attentional neural mechanism. In the current study, we investigated this question by recording event-related potentials (ERPs) while participants searched for the two types of stimuli separately, after matching task difficulty. The present results provide additional evidence that suppression may function to terminate a shift of attention. For the N1 component, the nonsalient trained stimuli elicited a shorter latency and larger amplitude than the physically salient stimuli, whether presented as targets or as distractors, indicating that the nonsalient trained stimuli underwent earlier sensory processing and evoked stronger orienting of visual attention. The N2 posterior-contralateral (N2pc) amplitude elicited by the physically salient target was larger than that elicited by the nonsalient trained target, suggesting that physically salient stimuli have a stronger ability to capture attention. However, when presented as distractors, only the nonsalient trained stimuli elicited the distractor positivity (PD) component; therefore, actively suppressing the physically salient stimuli was more difficult than suppressing the nonsalient trained stimuli of matched difficulty. For the P3 component, the amplitude elicited by the physically salient stimuli was larger than that elicited by the nonsalient trained stimuli, both as targets and as distractors, indicating that the top-down controlled process of outcome evaluation was stronger for the salient triangle. Overall, these results suggest that the two types of stimuli are processed via different neural mechanisms in early sensory processing, attentional selection, active suppression, and outcome evaluation.
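
The lateralized components named above (N2pc and PD) are conventionally quantified as contralateral-minus-ipsilateral difference waves at posterior electrode pairs. The Python sketch below illustrates that computation on simulated data; it is an illustration only, and the array layout, the electrode pair (PO7/PO8), and the function names are assumptions for demonstration, not details taken from this article.

# Minimal sketch: contralateral-minus-ipsilateral difference wave, the
# quantity underlying the N2pc (negative-going) and PD (positive-going)
# components. Assumptions (not from the article): baseline-corrected epochs
# as a NumPy array of shape (n_trials, n_channels, n_times), a per-trial
# record of the lateral stimulus side, and PO7/PO8 as the posterior pair.
import numpy as np

def contra_minus_ipsi(epochs, stim_side, po7_idx, po8_idx):
    """Return the contra-minus-ipsi waveform averaged across trials."""
    contra, ipsi = [], []
    for trial, side in zip(epochs, stim_side):
        if side == "left":   # stimulus left -> right hemisphere (PO8) is contralateral
            contra.append(trial[po8_idx])
            ipsi.append(trial[po7_idx])
        else:                # stimulus right -> left hemisphere (PO7) is contralateral
            contra.append(trial[po7_idx])
            ipsi.append(trial[po8_idx])
    return np.mean(contra, axis=0) - np.mean(ipsi, axis=0)

# Usage with simulated data: 100 trials, 2 channels (PO7, PO8), 500 samples.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(100, 2, 500))
sides = rng.choice(["left", "right"], size=100)
wave = contra_minus_ipsi(epochs, sides, po7_idx=0, po8_idx=1)
print(wave.shape)  # (500,): negative deflections ~ N2pc, positive ~ PD

In practice, the mean amplitude of such a difference wave within a component-specific time window (e.g., roughly 200–300 ms post-stimulus for N2pc) would then be submitted to statistical analysis.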

Keywords

EEG / neural mechanism / nonsalient trained stimuli / physically salient stimuli

Cite this article

Zile Wang, Qi Zhang, Yuxiang Hao, & Shuangxing Xu (2024). Different neural mechanisms for nonsalient trained stimuli and physically salient stimuli in visual processing. PsyCh Journal, 13(2), 227–241. DOI: 10.1002/pchj.718



RIGHTS & PERMISSIONS

© 2024 The Authors. PsyCh Journal published by Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.
