Review
Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence

https://doi.org/10.1016/j.tics.2017.01.001

Trends

Facial expression and its perception have long been the primary emphasis in emotion research. However, interest is growing in other channels, such as the voice and touch.

Facial, vocal, and tactile emotion processing have been explored with a range of techniques including behavioral judgments, electroencephalography/event-related potentials, fMRI contrast studies, and multivoxel pattern analyses.

Results point to similarities (e.g., increased responses to social and emotional signals) as well as differences (e.g., differentiation of individual emotions versus encoding of affect) between communication channels.

Channel similarities and differences enable holistic emotion recognition, a process that depends on multisensory integration during early perceptual and later conceptual stages.

Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly nonoverlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments.

Section snippets

Nonverbal Emotions: Moving from a Unimodal to a Multimodal Perspective

Emotion (see Glossary) perception plays a ubiquitous role in human interactions and is hence of interest to a range of disciplines including psychology, psychiatry, and social neuroscience. However, its study has been dominated by facial emotions, with other modalities explored less frequently and often anchored within a framework derived from what we know about vision. Here we take a step back and put facial emotions on a par with vocal and tactile emotions. Like a frightened face, a shaking …

Sensory Modalities for Emotion Expression

There is vigorous debate about what exactly individuals can express nonverbally. There are open questions about whether people are in an objective sense ‘expressing emotions’ as opposed to engaging in more strategic social communication, whether they express discrete emotions and which ones, and whether their expressions are culturally universal. For practical reasons we ignore these debates here and simply assume that people do express and perceive what we usually call emotions (Box 1).


Neural Systems for Perceiving Emotions

The neural systems underpinning the perception of emotions have been studied with a range of tasks and techniques. The following review emphasizes approaches that were used frequently and consistently across modalities and hence support a systematic comparison between face, voice, and touch. With respect to tasks, those contrasting nonverbal expressions with control stimuli (e.g., face–house) as well as emotional with neutral expressions were selected as most relevant in our review. With respect …

Modality Similarities and Differences

To date, research on emotion perception has emphasized the face. Moreover, insights from the face have served as a general guide in the search for and establishment of analogies with other modalities 72, 73 – and indeed, such analogies exist. Vision, audition, and touch each have slow and fast processing pathways that project to specialized regions of the sensory cortex, with upstream regions generally responding more robustly to social over nonsocial signals, especially if they are emotional …

Concluding Remarks

Social interactions elicit spontaneous and strategic expressions that we commonly classify as emotions and that profoundly influence a perceiver’s mental state and ongoing behavior 13, 57, 71. Neuroscience investigations have been uneven, with a historical overemphasis on facial signals. Although many questions remain (see Outstanding Questions), enough evidence has been accumulated to characterize mechanisms that are both unique and shared across modalities. Furthermore, although each channel …

Glossary

Affect
often assumed to be a precursor to or prerequisite for a more differentiated emotion, affect is typically described as two dimensional, varying in valence (pleasant to unpleasant) and arousal (relaxed to excited). While applied primarily to the structure of feelings, it can also be applied to behavioral consequences of emotion states (e.g., approach–withdrawal).
Amygdala
an almond-shaped nucleus situated in the anterior aspect of the medial temporal lobe. It comprises several subnuclei and …

References (111)

  • N. Kraus et al.

    Unraveling the biology of auditory learning: a cognitive–sensorimotor–reward framework

    Trends Cogn. Sci.

    (2015)
  • C.R. Pernet

    The human voice areas: spatial organization and inter-individual variability in temporal and extra-temporal cortices

    Neuroimage

    (2015)
  • A. Schirmer

    On the spatial organization of sound processing in the human temporal lobe: a meta-analysis

    Neuroimage

    (2012)
  • C. Brück

    Impact of personality on the cerebral processing of emotional prosody

    Neuroimage

    (2011)
  • T. Ethofer

    Decoding of emotional information in voice-sensitive cortices

    Curr. Biol.

    (2009)
  • X. Jiang et al.

    On how the brain decodes vocal cues about speaker confidence

    Cortex

    (2015)
  • I. Croy

    Interpersonal stroking touch is targeted to C tactile afferent activation

    Behav. Brain Res.

    (2016)
  • P. Belin

    Thinking the voice: neural correlates of voice perception

    Trends Cogn. Sci.

    (2004)
  • A. Schirmer

    The socio-temporal brain: connecting people in time

    Trends Cogn. Sci.

    (2016)
  • B. Kreifelts

    Audiovisual integration of emotional signals in voice and face: an event-related fMRI study

    Neuroimage

    (2007)
  • A. Schirmer et al.

    Beyond the right hemisphere: brain mechanisms mediating vocal emotional processing

    Trends Cogn. Sci.

    (2006)
  • R.L.C. Mitchell et al.

    The overlapping relationship between emotion perception and theory of mind

    Neuropsychologia

    (2015)
  • D.Y.-J. Yang

    An integrative neural model of social perception, action observation, and theory of mind

    Neurosci. Biobehav. Rev.

    (2015)
  • C.-Y. Tse

    The functional role of the frontal cortex in pre-attentive auditory change detection

    Neuroimage

    (2013)
  • R.M. Müri

    Cortical control of facial expression

    J. Comp. Neurol.

    (2016)
  • P. Ekman et al.

    Measuring facial movement

    Environ. Psychol. Nonverbal Behav.

    (1976)
  • R.E. Jack

    Facial expressions of emotion are not culturally universal

    Proc. Natl. Acad. Sci. U. S. A.

    (2012)
  • M. Gendron

    Perceptions of emotion from facial expressions are not culturally universal: evidence from a remote culture

    Emotion

    (2014)
  • M. Gendron

    Cultural relativity in perceiving emotion from vocalizations

    Psychol. Sci.

    (2014)
  • V. Enea et al.

    Processing emotional body expressions: state-of-the-art

    Soc. Neurosci.

    (2016)
  • R. Banse et al.

    Acoustic profiles in vocal emotion expression

    J. Pers. Soc. Psychol.

    (1996)
  • T. Bänziger

    Path models of vocal emotion communication

    PLoS One

    (2015)
  • P. Laukka

    A dimensional approach to vocal expression of emotion

    Cogn. Emot.

    (2005)
  • P. Birkholz

    The contribution of phonation type to the perception of vocal emotions in German: an articulatory synthesis study

    J. Acoust. Soc. Am.

    (2015)
  • D.T. Cordaro

    The voice conveys emotion in ten globalized cultures and one remote village in Bhutan

    Emotion

    (2016)
  • J. Brauer

    Frequency of maternal touch predicts resting activity and connectivity of the developing social brain

    Cereb. Cortex

    (2016)
  • R. Ackerley

    Human C-tactile afferents are tuned to the temperature of a skin-stroking caress

    J. Neurosci.

    (2014)
  • L.S. Löken

    Coding of pleasant touch by unmyelinated afferents in humans

    Nat. Neurosci.

    (2009)
  • M.J. Hertenstein

    The communication of emotion via touch

    Emotion

    (2009)
  • D.Y. Tsao et al.

    Mechanisms of face perception

    Annu. Rev. Neurosci.

    (2008)
  • N. Kanwisher

    The fusiform face area: a module in human extrastriate cortex specialized for face perception

    J. Neurosci.

    (1997)
  • R. Adolphs

    A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping

    J. Neurosci.

    (2000)
  • S. Wang

    Neurons in the human amygdala selective for perceived emotion

    Proc. Natl Acad. Sci. U. S. A.

    (2014)
  • B.R. Innes

    A leftward bias however you look at it: revisiting the emotional chimeric face task as a tool for measuring emotion lateralization

    Laterality

    (2015)
  • M.V. Peelen

    Supramodal representations of perceived emotions in the human brain

    J. Neurosci.

    (2010)
  • P.A. Kragel et al.

    Multivariate neural biomarkers of emotional states are categorically distinct

    Soc. Cogn. Affect. Neurosci.

    (2015)
  • R. Adolphs

    What does the amygdala contribute to social cognition?

    Ann. N. Y. Acad. Sci.

    (2010)
  • A.J. Calder

    Neuropsychology of fear and loathing

    Nat. Rev. Neurosci.

    (2001)
  • S. Bentin

    Electrophysiological studies of face perception in humans

    J. Cogn. Neurosci.

    (1996)
  • F. Suess

    Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge

    Soc. Cogn. Affect. Neurosci.

    (2015)