
Gender Word Semantic Satiation Inhibits Facial Gender Information Processing

Evidence From Behavior and Event-Related Potentials

Published Online: https://doi.org/10.1027/0269-8803/a000274

Abstract. To explore the time course of the influence of gender word semantic satiation on facial gender information processing, the semantic satiation paradigm was used: the Chinese gender words “男, 女 (Male, Female)” were presented for a long duration (25 s) to induce semantic satiation, with the conjunction words “及 (And), 且 (Moreover)” serving as the baseline (the Chinese words and their English translations are not fully equivalent in pronunciation, form, or sense). Participants judged whether two simultaneously presented faces (Experiment 1) or two successively presented faces (Experiment 2) were of the same gender. In Experiment 1, response times were significantly longer in the semantic satiation condition than in the baseline condition. The event-related potential (ERP) results of Experiment 2 showed that, in the early stage of face processing, the peak amplitude of the P1 component was significantly smaller in the semantic satiation condition than in the baseline condition; the N170, a face-specific component, was significantly larger in the semantic satiation condition than in the baseline condition; and the mean amplitude of the late positive component (LPC) was significantly smaller in the semantic satiation condition than in the baseline condition. These findings show that facial gender information processing is affected by semantic contextual information: the inhibitory effect of gender word semantic satiation begins at the attention-orienting stage, continues through the face structural encoding stage, and ends at the higher-level cognitive response stage.
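The ERP comparison summarized above (peak amplitudes of P1 and N170, mean amplitude of the LPC, contrasted between the satiation and baseline conditions across subjects) can be illustrated with a short sketch. This is not the authors' analysis pipeline: the sampling rate, epoch window, component time windows, trial counts, and the paired t-test below are illustrative assumptions, not details taken from the article.

```python
# Minimal sketch of a per-subject amplitude comparison on already-epoched,
# baseline-corrected EEG (averaged over a chosen electrode cluster).
# All numeric settings here are assumptions chosen only for illustration.
import numpy as np
from scipy.stats import ttest_rel

FS = 500            # assumed sampling rate (Hz)
EPOCH_START = -0.2  # assumed epoch onset relative to face onset (s)

def window_indices(tmin, tmax):
    """Convert a latency window (s) into sample indices within the epoch."""
    i0 = int(round((tmin - EPOCH_START) * FS))
    i1 = int(round((tmax - EPOCH_START) * FS))
    return i0, i1

def per_subject_amplitude(epochs, tmin, tmax, measure="peak", polarity=+1):
    """epochs: array (n_subjects, n_trials, n_samples).
    Returns one amplitude per subject: signed peak or mean in the window."""
    i0, i1 = window_indices(tmin, tmax)
    subject_erps = epochs.mean(axis=1)        # average trials -> subject ERP
    win = subject_erps[:, i0:i1]
    if measure == "peak":                     # signed peak within the window
        return polarity * np.max(polarity * win, axis=1)
    return win.mean(axis=1)                   # mean amplitude (e.g., LPC)

# Hypothetical data: 20 subjects x 80 trials x 600 samples per condition.
rng = np.random.default_rng(0)
satiation = rng.normal(size=(20, 80, 600))
baseline = rng.normal(size=(20, 80, 600))

# Assumed windows: P1 ~ 80-130 ms (positive peak), N170 ~ 130-200 ms
# (negative peak), LPC ~ 400-700 ms (mean amplitude).
for name, (t0, t1, measure, pol) in {
    "P1":   (0.08, 0.13, "peak", +1),
    "N170": (0.13, 0.20, "peak", -1),
    "LPC":  (0.40, 0.70, "mean", +1),
}.items():
    a = per_subject_amplitude(satiation, t0, t1, measure, pol)
    b = per_subject_amplitude(baseline, t0, t1, measure, pol)
    t, p = ttest_rel(a, b)                    # paired satiation vs. baseline
    print(f"{name}: t({len(a) - 1}) = {t:.2f}, p = {p:.3f}")
```

With real data, the condition arrays would come from the recorded epochs rather than random numbers, and the reported effects correspond to the paired contrasts printed in the loop.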
