Elsevier

Brain Research

Volume 1383, 6 April 2011, Pages 230-241
Research Report
Effects of inverting contour and features on processing for static and dynamic face perception: An MEG study

https://doi.org/10.1016/j.brainres.2011.01.091

Abstract

We investigated the effects of inverting facial contour (hair and chin) and features (eyes, nose and mouth) on processing for static and dynamic face perception using magnetoencephalography (MEG). We used apparent motion, in which the first stimulus (S1) was replaced by a second stimulus (S2) with no interstimulus interval and subjects perceived visual motion, and presented three conditions as follows: (1) U&U: Upright contour and Upright features, (2) U&I: Upright contour and Inverted features, and (3) I&I: Inverted contour and Inverted features. In static face perception (S1 onset), the peak latency of the fusiform area's activity, which was related to static face perception, was significantly longer for U&I and I&I than for U&U in the right hemisphere and for U&I than for U&U and I&I in the left. In dynamic face perception (S2 onset), the strength (moment) of the occipitotemporal area's activity, which was related to dynamic face perception, was significantly larger for I&I than for U&U and U&I in the right hemisphere, but not the left. These results can be summarized as follows: (1) in static face perception, the activity of the right fusiform area was more affected by the inversion of features while that of the left fusiform area was more affected by the disruption of the spatial relation between the contour and features, and (2) in dynamic face perception, the activity of the right occipitotemporal area was affected by the inversion of the facial contour.

Research highlights

• There were different effects of inverting the contour and the features in face perception.
• Right fusiform activity was affected by inverting the features in static face perception.
• Left fusiform activity was affected by disrupting the spatial relation in static face perception.
• Right occipitotemporal activity was affected by inverting the contour in dynamic face perception.

Introduction

In our daily lives, we obtain much information from faces, for example, sex, familiarity, age, and race, so face perception plays an important role in social communication. There have been many reports on face perception in psychology; for example, Bruce and Young (1986) proposed the “face recognition model.” In addition, many recent human studies have examined face perception using neuroimaging methods.

In studies with electroencephalography (EEG) (e.g., Bentin et al., 1996, George et al., 1996), a negative component evoked by static faces, peaking at approximately 170 ms, is termed the N170. The N170 component is significantly earlier and larger for faces than for other objects (for example, cars) and may be specific to static face perception (e.g., Rossion and Jacques, 2008). Studies using magnetoencephalography (MEG) have also been reported, identifying the M170, which corresponds to the N170 in EEG studies (e.g., Halgren et al., 2000, Liu et al., 2000, Taylor et al., 2001). For example, Halgren et al. (2000) concluded that the fusiform gyrus may selectively encode faces at 165 ms. In functional magnetic resonance imaging (fMRI) studies, the fusiform face area (FFA) was selectively activated by static faces (e.g., Kanwisher et al., 1997, Kanwisher et al., 1999).

In EEG studies, the N170 is affected by inversion, i.e., the face inversion effect. In previous studies (Bentin et al., 1996, Honda et al., 2007, Itier and Taylor, 2004, Latinus and Taylor, 2006, Sagiv and Bentin, 2001, Watanabe et al., 2003, Watanabe et al., 2005), the N170 was longer in latency and larger in amplitude for inverted faces than for upright faces. MEG studies have also examined face inversion effects (e.g., Watanabe et al., 2003); Watanabe et al. (2003) reported that the latency of the evoked component was longer for inverted than for upright faces in the right hemisphere and shorter in the left. In addition, the N170 was longer for scrambled features (for example, eyes, nose and mouth), in which the spatial relation between the facial contour (for example, hair, chin and ears) and the features was disrupted, than for upright faces (George et al., 1996, Latinus and Taylor, 2006). Therefore, many authors have speculated that the N170 reflects holistic processing for static face perception, but this is still being debated (see Harris and Nakayama, 2007, Jacques and Rossion, 2009, Zion-Golumbic and Bentin, 2007).

Facial movements also play an important role in our daily lives, for example, in social and non-verbal communication. There have been many studies of facial movements, i.e., dynamic face perception. In an fMRI study, Puce et al. (1998) found that the superior temporal sulcus (STS), which is considered to process social information from the actions of others (Allison et al., 2000), responded to facial movements in addition to MT/V5.

Gaze direction has several roles in social communication and, moreover, is an important indicator of another individual's focus of attention (Kleinke, 1986). In monkeys, some neurons in the STS are sensitive to eye direction (Jellema et al., 2000, Perrett et al., 1992). In humans, there have also been many studies of gaze direction using EEG and MEG. Conty et al. (2007) studied evoked potentials (N170) in human adults in response to the apparent motion of gaze and showed that the perception of direct relative to averted gaze evoked a larger, later, and longer-lasting N170. In an ERP study comparing children with and without autism, Senju et al. (2005) found that an occipitotemporal negativity (N2) reflected the processing of gaze direction, which appears to be deviant in children with autism. Itier et al. (2007) investigated gaze processing and its interaction with head orientation when attention was explicitly directed to the direction of the gaze (Gaze task) or to the orientation of the head (Head task) of face stimuli, and showed that the N170 component was larger for averted than for direct gaze only when heads were viewed front-on.

In face perception, whether the ERP components evoked by static and dynamic face stimuli depend on facial context has also been investigated. For static face perception, the component peaking at approximately 180 ms was significantly later for eyes presented alone (without context) than for a whole face (with context) (e.g., Watanabe et al., 1999). For dynamic face perception, an ERP study of viewing eye movements showed that the inferior temporal region responded with larger N170s to averted eyes either in isolation or within the context of a face; that is, the brain activity evoked by eye movements was independent of facial context (Puce et al., 2000). On the other hand, our previous study showed that the occipitotemporal area, corresponding to MT/V5, was more activated by dot movements mimicking eye movements within a facial contour (circle) and features (dots and a line), which subjects perceived as a face, than by the same dot movements presented without the facial contour and features (Miki et al., 2007); we concluded that the MT/V5 activity was affected by the facial contour and features.

Considering these previous studies, we hypothesized that the perception of eye movements is mainly affected by information on the facial contour and other features. In this study, therefore, unlike previous studies, we used faces with an inverted contour (hair and chin) and/or inverted features (eyes, nose and mouth), which disrupted the spatial relation between the contour and features, and investigated (1) how the brain activity related to static and dynamic face perception was modulated by inverting the contour and/or features of the human face and (2) what information within the face is important to the processing for static and dynamic face perception.

In this study, we used MEG, which has a higher temporal resolution than fMRI, positron emission tomography (PET) and near infrared spectroscopy (NIRS) and a higher spatial resolution than EEG, because we investigated the temporal aspect of the brain activity specific to static and dynamic face perception in detail. We compared cortical activities evoked by viewing an upright face (Upright contour and Upright features: U&U), an inverted face (Inverted contour and Inverted features: I&I), and a face in which the spatial relation between the contour and features was disrupted (Upright contour and Inverted features: U&I) (see Fig. 1). We focused on activity in the fusiform gyrus and MT/V5 for static and dynamic face perception, respectively, using apparent motion stimuli (e.g., Kaneoke et al., 1997, Miki et al., 2004, Miki et al., 2007, Watanabe et al., 2001, Watanabe et al., 2006).

Section snippets

MEG following S1 (static face stimuli)

In both hemispheres, we adopted the results from all 10 subjects, whose dipoles were estimated in all conditions and fulfilled the criteria for further analysis. Fig. 2 shows the waveforms recorded from 204 gradiometers of one subject (Subject 1) following S1 onset (static face stimuli) in the U&U (Upright contour and Upright features) condition, and the waveforms in all conditions at representative sensors, where the largest component was identified for the U&U condition in each occipital or…

Effects on processing for static face perception

In our study, latency was significantly longer for U&I (Upright contour and Inverted features) and I&I (Inverted contour and Inverted features) than for U&U (Upright contour and Upright features) in the right fusiform area. In the U&U condition, the spatial relation among the facial features (eyes, nose and mouth) was intact and the features were upright; however, in the U&I and I&I conditions, the features were inverted but retained the spatial relation among themselves, though the contour (hair…

Subjects

We studied 10 right-handed volunteers (three females and seven males) ranging in age from 24 to 47 (mean, 30.6) years with normal or corrected visual acuity. All subjects gave informed consent to participate in the experiment, which was approved by the Ethics Committee of the National Institute for Physiological Sciences, Okazaki, Japan.

Visual stimuli

We used the image of a female face to avoid any influence of sex, emotion, familiarity, attractiveness, etc., to determine the effect of inversion, though…

Acknowledgments

This study was supported by a Grant-in-Aid for Scientific Research on Innovative Areas, “Face perception and recognition”, from the Ministry of Education, Culture, Sports, Science and Technology, Japan to K.M. In addition, this work was supported by “Development of biomarker candidates for social behavior”, carried out under the Strategic Research Program for Brain Sciences by the Ministry of Education, Culture, Sports, Science and Technology, Japan.

References (46)

  • K. Miki et al., Magnetoencephalographic study of occipitotemporal activity elicited by viewing mouth movements, Clin. Neurophysiol. (2004)
  • B. Rossion et al., Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170, Neuroimage (2008)
  • A. Senju et al., Deviant gaze processing in children with autism: an ERP study, Neuropsychologia (2005)
  • M.J. Taylor et al., Magnetoencephalographic evidence of early processing of direction of gaze in humans, Neurosci. Lett. (2001)
  • T. Wasaka et al., Gating of somatosensory evoked magnetic fields during the preparatory period of self-initiated finger movement, Neuroimage (2003)
  • S. Watanabe et al., Human MT/V5 activity on viewing eye gaze changes in others: a magnetoencephalographic study, Brain Res. (2006)
  • S. Watanabe et al., The spatiotemporal dynamics of the face inversion effect: a magneto- and electro-encephalographic study, Neuroscience (2003)
  • S. Watanabe et al., Occipitotemporal activity elicited by viewing eye movements: a magnetoencephalographic study, Neuroimage (2001)
  • V. Bruce et al., Understanding face recognition, Br. J. Psychol. (1986)
  • S. Bentin et al., Electrophysiological studies of face perception in humans, J. Cogn. Neurosci. (1996)
  • R. Campbell et al., Face recognition and lipreading. A neurological dissociation, Brain (1986)
  • S.O. Dumoulin et al., A new anatomical landmark for reliable identification of human area V5/MT: a quantitative analysis of sulcal patterning, Cereb. Cortex (2000)
  • E. Halgren et al., Cognitive response profile of the human fusiform face area as determined by MEG, Cereb. Cortex (2000)