Face in profile view reduces perceived facial expression intensity: An eye-tracking study
Introduction
Facial expressions of emotion provide crucial visual cues for understanding other people's emotional states and intentions. The ability to recognize an individual's expression accurately and quickly, while at the same time assessing its intensity, plays a crucial role in our social communication and even survival (e.g., McFarland et al., 2013). Classical studies by Ekman and colleagues have suggested that each of six common facial expressions (happiness, sadness, fear, anger, disgust, and surprise) represents one of our typical emotional states, is associated with a distinctive pattern of facial muscle movements, and is culturally similar (universal) among humans (Ekman & Friesen, 1976; Ekman & Rosenberg, 2005; see also Jack, Blais, Scheepers, Schyns, & Caldara, 2009). Our perception of facial expressions of emotion is therefore likely to be categorical: we have a finite set of predefined expression classes, and this category knowledge influences perception (Ekman & Rosenberg, 2005). Similar to categorical colour and object perception, expression perception demonstrates a between-category advantage (Goldstone, 1994), with enhanced performance (e.g., accuracy, reaction time, discriminability) for discriminating expressive faces that span a categorical boundary compared with faces that do not cross the boundary (Fugate, 2013). An alternative to the categorical model is the continuous or dimensional model (Russell, 2003), in which each emotion is represented as a feature vector (e.g., pleasure–displeasure) in a multidimensional space defined by characteristics common to all emotions (e.g., arousal and valence). This model more readily accommodates the perception of less common expressions that share affective information, such as shame and confusion.
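To make the contrast between the two models concrete, the sketch below (a hypothetical Python illustration, not part of the original study; the coordinates and category prototypes are invented for demonstration) shows how a categorical reading maps an affective state onto the nearest discrete label, whereas a dimensional reading keeps continuous valence–arousal coordinates from which an intensity can also be derived.

```python
import math

# Toy valence-arousal coordinates for a few emotion categories, loosely following
# the dimensional/circumplex idea (Russell, 2003). The numbers are invented for
# illustration only and are not empirically derived.
PROTOTYPES = {
    "happy":   (0.8, 0.5),
    "sad":     (-0.6, -0.4),
    "angry":   (-0.7, 0.7),
    "fearful": (-0.6, 0.8),
}

def categorical_read(valence, arousal):
    """Categorical view: assign the affective state to the nearest discrete label."""
    return min(PROTOTYPES, key=lambda label: math.dist((valence, arousal), PROTOTYPES[label]))

def dimensional_read(valence, arousal):
    """Dimensional view: keep continuous coordinates; intensity can be read off
    as the distance from the neutral origin."""
    return valence, arousal, math.hypot(valence, arousal)

# A mildly negative, moderately aroused face:
print(categorical_read(-0.5, 0.6))  # -> 'angry' (nearest prototype under these toy values)
print(dimensional_read(-0.5, 0.6))  # -> (-0.5, 0.6, 0.781...)
```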
Considering that faces often appear under very different viewing conditions (e.g., brightness, viewing angle, or viewing distance), an invariant face representation in our visual system (within given limits) would be useful for efficient face perception and advantageous to our social interactions. Indeed, several recent psychological studies have demonstrated that reducing the resolution of face images (down to 10 × 15 pixels, at which almost no useful local facial information is left for visual analysis; Du & Martinez, 2011), manipulating presented face size to mimic viewing distance in typical social interactions (ranging from arm's length to 5 m; Guo, 2013), or changing face viewpoint from frontal to profile view (Kleck & Mendolia, 1990; Matsumoto & Hwang, 2011; see also Hess et al., 2007; Skowronski et al., 2014) had little impact on categorization accuracy for the six common facial expressions, suggesting an invariant facial expression judgement under different viewing conditions.
These studies, however, only measured expression identification accuracy. It is unclear to what extent perceived expression intensity is affected by viewing perspective, such as different viewing angles or viewpoints. Previous studies have clearly shown that different facial musculature patterns are associated with different facial expressions (Ekman & Friesen, 1976), and that different local facial features transmit diagnostic information for recognizing different expressions (e.g., the eyes and mouth contain crucial cues for detecting angry and happy expressions, respectively) (Calvo & Nummenmaa, 2008; Smith et al., 2005). As the visible area of these facial musculature patterns and local facial features varies with horizontal viewing angle (e.g., a frontal face may provide a greater opportunity than a profile face for detecting the lowering of the inner eyebrows associated with an angry expression), perceived expression intensity could be sensitive to viewpoint.
Additionally, the expresser's head position can indicate the focus of attention and consequently influence the interpretation of expressions in a social context (Hess et al., 2007). For example, in comparison with profile view, we might be more sensitive to an angry face in frontal view because the associated aggression is more direct and consequently carries a greater and more imminent threat value to the viewer. Indeed, brain imaging studies have observed greater amygdala responses to angry faces directed towards the observer than to angry faces directed away (N'Diaye et al., 2009; Sato et al., 2004). Furthermore, given that different expressions could be associated with different functional values (e.g., angry and happy faces could signal threat and approachability, respectively) (Becker et al., 2011; Hess et al., 2007), it is plausible that they may show different degrees of vulnerability to viewpoint.
Taken together, the reported invariant facial expression perception could be due to categorical judgement of emotions, as the perceived facial structural differences across viewpoints may only affect expression intensity judgement rather than expression categorization. Alternatively, if facial expression perception can be accounted for by the continuous model (Russell, 2003), in which expression intensity is intrinsically defined within the representation of emotion type, then both expression recognition accuracy and perceived expression intensity might be influenced by viewpoint, as expressive cues from local facial features could become ambiguous at some viewing angles (e.g., profile view) for both expression categorization and intensity judgement. This possibility is systematically examined in the present study.
At a close social distance, faces falling in our visual field are large enough to elicit saccadic eye movements, which provide a sensitive and real-time measure of visual processing. Coinciding with our behavioural sensitivity and expertise in processing facial expressions, our gaze pattern could reflect expert cognitive strategies for extracting diagnostic facial information in expression perception. Using full frontal expressive faces as stimuli, recent eye-tracking studies have observed that when categorizing facial expressions, participants tended to adopt a ‘holistic’ viewing strategy, integrating featural information from key internal facial features into a single representation of the whole face (Guo, 2012). Specifically, smaller faces would attract a stronger fixation bias to the central face region (e.g., the nose area) to efficiently gather facial cues from surrounding features (e.g., eyes and mouth). For larger faces, in which individual facial features were large enough to attract directed gaze, people would scan all key internal facial features (i.e., eyes, nose, and mouth) to extract and integrate expressive featural cues in order to reliably decode facial affect (Guo, 2012, 2013), but would look more often at the local facial regions that are most characteristic of each expression, such as the mouth in happy faces and the eyes in angry faces (Eisenbarth & Alpers, 2011; Guo, 2012; Jack et al., 2009). It is unclear, however, whether this holistic but expression-sensitive gaze pattern is also viewpoint-invariant. If the holistic viewing behaviour is part of a generic scanning strategy for general face processing (Guo, 2013), then viewpoint should not qualitatively affect our gaze distribution across key facial features. On the other hand, if gaze allocation is mainly determined by the currently available local information (Eisenbarth & Alpers, 2011), then viewpoint-induced changes in the visible area of internal facial features should systematically influence the amount of fixation directed at these features.
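Gaze distribution of this kind is typically quantified by assigning each fixation to an area of interest (AOI) such as the eyes, nose, or mouth, and comparing the proportion of fixations each AOI receives across conditions. The following is a minimal sketch of such an analysis in Python; the data frame, its column names, and the example values are hypothetical and are shown only to illustrate the computation.

```python
import pandas as pd

# Hypothetical fixation records: one row per fixation, labelled with the facial
# area of interest (AOI) it landed in. Column names and values are illustrative.
fixations = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2, 2, 2],
    "viewpoint":   ["frontal", "frontal", "profile", "profile",
                    "frontal", "frontal", "profile", "profile"],
    "aoi":         ["eyes", "mouth", "eyes", "nose",
                    "nose", "eyes", "mouth", "eyes"],
})

# Count fixations per participant x viewpoint x AOI, then convert the counts into
# proportions within each participant x viewpoint cell. A roughly even spread over
# eyes, nose and mouth would be consistent with 'holistic' scanning; a systematic
# shift with viewpoint would suggest gaze follows the currently visible local cues.
counts = fixations.groupby(["participant", "viewpoint", "aoi"]).size()
proportions = counts / counts.groupby(level=["participant", "viewpoint"]).transform("sum")
print(proportions)
```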
In this exploratory eye-tracking study we presented six common facial expressions of emotion (happy, sad, fearful, angry, disgusted, and surprised) at three different viewing angles (frontal, mid-profile, and profile view), and examined to what extent observers' expression categorization accuracy, perceived expression intensity, and associated gaze behaviour were affected by viewpoint. Building on previous observations of emotion categorization at different viewing angles and of gaze behaviour in processing expressive faces, we hypothesised that there would be (1) an evident impact of viewpoint on expression judgement, at least on perceived expression intensity; and (2) a holistic face-viewing gaze distribution across all internal facial features, with viewpoint quantitatively modifying the amount of fixation directed at each feature.
Section snippets
Materials and methods
Thirty-two undergraduate students (11 male, 21 female), aged 18 to 43 years (mean ± SD: 19.94 ± 4.34), volunteered to participate in this study. All participants had normal or corrected-to-normal visual acuity. The Ethical Committee in the School of Psychology, University of Lincoln, approved this study. Written informed consent was obtained from each participant, and all procedures complied with the British Psychological Society Code of Ethics and Conduct and with the
Analysis of behavioural responses in expression perception
Given that humans often show different perceptual sensitivities in recognizing different facial expressions (e.g., Guo, 2012, 2013; Kirouac & Doré, 1985; Palermo & Coltheart, 2004), it is plausible that viewpoint may have different degrees of impact on perceiving different expressions. This possibility was examined by conducting 3 (viewpoint) × 6 (expression type) repeated-measures analyses of variance (ANOVAs) with expression categorization accuracy, perceived expression intensity and
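As a hedged illustration of the analysis described above (a sketch only, not the authors' actual code), a 3 (viewpoint) × 6 (expression type) repeated-measures ANOVA could be run in Python with statsmodels, assuming a hypothetical long-format table with one mean rating per participant, viewpoint, and expression cell:

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: columns 'participant', 'viewpoint' (frontal,
# mid-profile, profile), 'expression' (six categories) and 'intensity' (mean
# rating per cell). File name and column names are assumptions for illustration.
data = pd.read_csv("intensity_ratings.csv")

# 3 (viewpoint) x 6 (expression type) repeated-measures ANOVA on intensity ratings.
# The same model can be refitted with categorization accuracy or a fixation
# measure as the dependent variable.
anova = AnovaRM(data, depvar="intensity", subject="participant",
                within=["viewpoint", "expression"]).fit()
print(anova.summary())
```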
Discussion
In this study we systematically compared observers' behavioural performance and associated gaze patterns in perceiving common facial expressions across different viewing angles. Our analysis revealed a significant effect of viewpoint on perceived facial expression intensity. Individual facial expressions in profile view were rated as significantly less intense than in frontal view, even for those expressions with indistinguishable recognition rates between frontal and profile views (e.g.,
References (46)
- et al. Looking at faces from different angles: Europeans fixate different features in Asian and Caucasian faces. Vision Research (2014).
- et al. Humans and macaques employ similar face-processing strategies. Current Biology (2009).
- et al. Recognition of unfamiliar faces. Trends in Cognitive Sciences (2000).
- et al. Cultural confusions show that facial expressions are not universal. Current Biology (2009).
- et al. Size invariant but viewpoint dependent representation of faces. Vision Research (2006).
- et al. Reassessing the 3/4 view effect in face recognition. Cognition (2002).
- et al. Anchoring gaze when categorizing faces' sex: Evidence from eye-tracking data. Vision Research (2009).
- et al. The amygdala processes the emotion significance of facial expressions: An fMRI investigation using the interaction between expression and face direction. NeuroImage (2004).
- et al. Pushing the boundaries of human expertise in face perception: Emotion expression identification and error as a function of presentation angle, presentation time, and emotion. Journal of Experimental Social Psychology (2014).
- et al. Eye-movement based memory effect: A reprocessing effect in face perception. Journal of Experimental Psychology: Learning, Memory, and Cognition (1999).
- The face in the crowd effect unconfounded: Happy faces, not angry faces, are more efficiently detected in single and multiple-target visual search tasks. Journal of Experimental Psychology: General.
- Viewpoint and center of gravity affect eye movements to human faces. Journal of Vision.
- Face perception.
- Detection of emotional faces: Salient physical features guide effective visual search. Journal of Experimental Psychology: General.
- Three-dimensional information in face recognition: An eye-tracking study. Journal of Vision.
- The resolution of facial expressions of emotion. Journal of Vision.
- Happy mouth and sad eyes: Scanning emotional facial expressions. Emotion.
- Pictures of facial affect.
- What the face reveals: Basic and applied studies of spontaneous expression using the facial action coding system (FACS).
- Categorical perception for emotional faces. Emotion Review.
- Influences of categorization on perceptual discrimination. Journal of Experimental Psychology: General.
- Initial fixation placement in face images is driven by top-down guidance. Experimental Brain Research.
- Holistic gaze strategy to categorize facial expression of varying intensities. PLoS One.