Research Report
Effects of inverting contour and features on processing for static and dynamic face perception: An MEG study
Research highlights
- There were different effects of inverting contour and features in face perception.
- Right fusiform activity was affected by inverting features in static faces.
- Left fusiform activity was affected by disrupting spatial relations in static faces.
- Right occipitotemporal activity was affected by inverting contour in dynamic faces.
Introduction
In our daily lives, much information, such as sex, familiarity, age, and race, is obtained from faces, so face perception plays an important role in social communication. There have been many psychological reports on face perception; for example, Bruce and Young (1986) proposed the "face recognition model." More recently, many human studies have used neuroimaging methods to examine face perception.
In studies with electroencephalography (EEG) (e.g., Bentin et al., 1996, George et al., 1996), a negative component evoked by static faces, peaking at approximately 170 ms, is termed the N170. The N170 component is significantly earlier and larger for faces than for other objects (for example, cars) and may be specific to static face perception (e.g., Rossion and Jacques, 2008). Studies using magnetoencephalography (MEG) have also been reported, and the M170, corresponding to the N170 in EEG studies, was identified (e.g., Halgren et al., 2000, Liu et al., 2000, Taylor et al., 2001). For example, Halgren et al. (2000) concluded that the fusiform gyrus may selectively encode faces at 165 ms. Using functional magnetic resonance imaging (fMRI), the fusiform face area (FFA) was found to be selectively activated by static faces (e.g., Kanwisher et al., 1997, Kanwisher et al., 1999).
In EEG studies, the N170 is affected by inversion, i.e., the face inversion effect. In previous studies (Bentin et al., 1996, Honda et al., 2007, Itier and Taylor, 2004, Latinus and Taylor, 2006, Sagiv and Bentin, 2001, Watanabe et al., 2003, Watanabe et al., 2005), latency was longer and amplitude larger for inverted faces than for upright faces. There have also been MEG studies of face inversion effects (e.g., Watanabe et al., 2003). Watanabe et al. (2003) reported that the latency of the evoked component to inverted faces relative to upright faces was longer in the right hemisphere and shorter in the left. In addition, the N170 was longer for scrambled features (for example, eyes, nose and mouth), in which the spatial relation between the facial contour (for example, hair, chin and ears) and features was disrupted (George et al., 1996, Latinus and Taylor, 2006), than for upright faces. Therefore, many authors have speculated that the N170 reflects holistic processing for static face perception, but this is still debated (see Harris and Nakayama, 2007, Jacques and Rossion, 2009, Zion-Golumbic and Bentin, 2007).
Facial movements also play an important role in our daily lives, for example, in social and non-verbal communication. There have been many studies of facial movements, i.e., dynamic face perception. In an fMRI study, Puce et al. (1998) found that the superior temporal sulcus (STS), which is considered to be related to social information from the actions of others (Allison et al., 2000), was responsible for visual motion perception in addition to MT/V5.
Gaze direction has several roles in social communication and, moreover, is an important indicator of another individual's focus of attention (Kleinke, 1986). In monkeys, some neurons in the STS are sensitive to eye direction (Jellema et al., 2000, Perrett et al., 1992). In humans, there have also been many studies of gaze direction using EEG and MEG. Conty et al. (2007) studied evoked potentials (N170) in human adults in response to the apparent motion of gaze and showed that the perception of direct relative to averted gaze evoked a greater, later and longer-lasting N170. In an ERP study investigating differences between children with and without autism, Senju et al. (2005) found that an occipitotemporal negativity (N2) correlated with neural activity for processing gaze direction, which seems to be deviant in children with autism. Itier et al. (2007) investigated gaze processing and its interaction with head orientation when attention was directed explicitly to the direction of the gaze (Gaze task) or to the orientation of the head (Head task) of face stimuli, and showed that the N170 component was larger for averted than direct gaze only when heads were viewed front-on.
In face perception, it has been investigated whether the ERP components evoked by static and dynamic face stimuli depend on context. For static face perception, the component peaking at approximately 180 ms evoked by eyes alone (without context) was significantly longer in latency than that evoked by a whole face (with context) (e.g., Watanabe et al., 1999). For dynamic face perception, an ERP study of viewing eye movement showed that the inferior temporal region responded with larger N170s to averted eyes either in isolation or within the context of a face; that is, the brain activity evoked by eye movements was independent of facial context (Puce et al., 2000). On the other hand, our previous study showed that the occipitotemporal area, corresponding to MT/V5, was more activated by dot movement mimicking eye movement within a facial contour (circle) and features (dots and a line), which subjects perceived as a face, than by a simple dot movement without facial contour and features, though the movement itself was the same (Miki et al., 2007); we concluded that MT/V5 activity was affected by the facial contour and features.
Considering these previous studies, we hypothesized that the perception of eye movements is mainly affected by information on the facial contour and other features. In this study, therefore, unlike previous studies, we used faces with an inverted contour (hair and chin) and/or inverted features (eyes, nose and mouth), which disrupted the spatial location of the contour and features, and investigated (1) how the brain activity related to static and dynamic face perception is modulated by inverting the contour and/or features of the human face and (2) what information within the face is important to the processing for static and dynamic face perception.
In this study, we used MEG, which has a higher temporal resolution than fMRI, positron emission tomography (PET) and near infrared spectroscopy (NIRS) and a higher spatial resolution than EEG, because we investigated the temporal aspect of the brain activity specific to static and dynamic face perception in detail. We compared cortical activities evoked by viewing an upright face (Upright contour and Upright features: U&U), an inverted face (Inverted contour and Inverted features: I&I), and a face in which the spatial relation between the contour and features was disrupted (Upright contour and Inverted features: U&I) (see Fig. 1). We focused on activity in the fusiform gyrus and MT/V5 for static and dynamic face perception, respectively, using apparent motion stimuli (e.g., Kaneoke et al., 1997, Miki et al., 2004, Miki et al., 2007, Watanabe et al., 2001, Watanabe et al., 2006).
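The temporal measure at the heart of these comparisons is the peak latency of an evoked component within a post-stimulus window. The following is a minimal, self-contained sketch of that idea; the function name, window bounds, and simulated waveform are illustrative assumptions, not the study's actual analysis pipeline:

```python
# Hypothetical sketch: locate the largest deflection of an evoked response
# within a post-stimulus window, analogous to identifying the peak of an
# occipitotemporal component after stimulus onset. All names and the
# simulated waveform are illustrative, not taken from the study.
import math

def peak_in_window(times_ms, amplitudes, t_min=130.0, t_max=220.0):
    """Return (latency_ms, amplitude) of the largest absolute deflection
    between t_min and t_max (assumed window; chosen for illustration)."""
    window = [(t, a) for t, a in zip(times_ms, amplitudes) if t_min <= t <= t_max]
    return max(window, key=lambda ta: abs(ta[1]))

# Simulated evoked response: a Gaussian deflection peaking at 170 ms.
times = list(range(0, 401))                  # 0-400 ms, 1 ms steps
wave = [math.exp(-((t - 170.0) ** 2) / (2 * 15.0 ** 2)) for t in times]

latency, amplitude = peak_in_window(times, wave)
# latency is 170 (ms); amplitude is the value at the peak
```

Condition effects such as those reported here (e.g., longer latencies for inverted than upright faces) would then amount to comparing such per-condition peak latencies across subjects.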
MEG following S1 (static face stimuli)
In both hemispheres, we adopted the results from all 10 subjects whose dipoles were estimated in all conditions and fulfilled the criteria for further analysis. Fig. 2 shows the waveforms recorded from 204 gradiometers of one subject (Subject 1) following S1 onset (static face stimuli) in the U&U (Upright contour and Upright features) condition and the waveforms in all conditions at representative sensors, where the largest component was identified for the U&U condition in each occipital or
Effects on processing for static face perception
In our study, latency was significantly longer for U&I (Upright contour and Inverted features) and I&I (Inverted contour and Inverted features) than for U&U (Upright contour and Upright features) in the right fusiform area. In the U&U condition, the spatial relation among the facial features (eyes, nose and mouth) was intact and the features were upright; however, in the U&I and I&I conditions, the features were inverted but retained the spatial relation among them, though the contour (hair
Subjects
We studied 10 right-handed volunteers (three females and seven males) ranging in age from 24 to 47 (mean, 30.6) years with normal or corrected visual acuity. All subjects gave informed consent to participate in the experiment, which was approved by the Ethics Committee of the National Institute for Physiological Sciences, Okazaki, Japan.
Visual stimuli
We used the image of a female face to avoid any influence of sex, emotion, familiarity, attractiveness, etc., to determine the effect of inversion, though
Acknowledgments
This study was supported by a Grant-in-Aid for Scientific Research on Innovative Areas, "Face perception and recognition", from the Ministry of Education, Culture, Sports, Science and Technology, Japan, to K.M. In addition, this work was carried out under "Development of biomarker candidates for social behavior" in the Strategic Research Program for Brain Sciences of the Ministry of Education, Culture, Sports, Science and Technology, Japan.
References (46)
- et al. Social perception from visual cues: role of the STS region. Trends Cogn. Sci. (2000)
- et al. Bugs and faces in the two visual fields: the analytic/holistic processing dichotomy and task sequencing. Cortex (1982)
- et al. When eye creates the contact! ERP evidence for early dissociation between direct and averted gaze motion processing. Neuropsychologia (2007)
- et al. MEG/EEG sources of the 170-ms response to faces are co-localized in the fusiform gyrus. Neuroimage (2007)
- et al. Brain events related to normal and moderately scrambled faces. Brain Res. Cogn. Brain Res. (1996)
- et al. Magnetoencephalographic characterization of dynamic brain activation: basic principles and methods of data collection and source analysis
- et al. Explicit versus implicit gaze processing assessed by ERPs. Brain Res. (2007)
- et al. Neural representation for the perception of the intentionality of actions. Brain Cogn. (2000)
- et al. Face processing stage: impact of difficulty and the separation of effect. Brain Res. (2006)
- et al. Effects of face configuration and features on early occipitotemporal activity when viewing eye movement. Neuroimage (2007)
- Magnetoencephalographic study of occipitotemporal activity elicited by viewing mouth movements. Clin. Neurophysiol.
- Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170. Neuroimage
- Deviant gaze processing in children with autism: an ERP study. Neuropsychologia
- Magnetoencephalographic evidence of early processing of direction of gaze in humans. Neurosci. Lett.
- Gating of somatosensory evoked magnetic fields during the preparatory period of self-initiated finger movement. Neuroimage
- Human MT/V5 activity on viewing eye gaze changes in others: a magnetoencephalographic study. Brain Res.
- The spatiotemporal dynamics of the face inversion effect: a magneto- and electro-encephalographic study. Neuroscience
- Occipitotemporal activity elicited by viewing eye movements: a magnetoencephalographic study. Neuroimage
- Understanding face recognition. Br. J. Psychol.
- Electrophysiological studies of face perception in humans. J. Cogn. Neurosci.
- Face recognition and lipreading. A neurological dissociation. Brain
- A new anatomical landmark for reliable identification of human area V5/MT: a quantitative analysis of sulcal patterning. Cereb. Cortex
- Cognitive response profile of the human fusiform face area as determined by MEG. Cereb. Cortex