Elsevier

Neuropsychologia

Volume 51, Issue 5, April 2013, Pages 1002-1010
Multilevel alterations in the processing of audio–visual emotion expressions in autism spectrum disorders

https://doi.org/10.1016/j.neuropsychologia.2013.02.009

Abstract

The abilities to recognize and integrate emotions from another person's facial and vocal expressions are fundamental cognitive skills involved in the effective regulation of social interactions. Deficits in such abilities have been suggested as a possible source of certain atypical social behaviors manifested by persons with autism spectrum disorders (ASD). In the present study, we assessed the recognition and integration of emotional expressions in ASD using a validated set of ecological stimuli comprising dynamic visual and auditory (non-verbal) vocal clips. Autistic participants and typically developing controls (TD) were asked to discriminate between clips depicting expressions of disgust and fear presented either visually, auditorily or audio-visually. The group of autistic participants was less efficient at discriminating emotional expressions across all conditions (unimodal and bimodal). Moreover, they required a higher signal-to-noise ratio for the discrimination of visual or auditory presentations of disgust versus fear expressions. These results suggest an altered sensitivity to emotion expressions in this population that is not modality-specific. In addition, the group of autistic participants benefited from exposure to bimodal information to a lesser extent than did the TD group, indicating a decreased multisensory gain in this population. These results are the first to compellingly demonstrate joint alterations in both the perception and the integration of multisensory emotion expressions in ASD.

Highlights

► We assessed the recognition and integration of emotions in individuals with ASD.
► ASD discriminated emotional expressions less efficiently than controls.
► ASD needed a higher signal/noise ratio for the discrimination of emotions.
► ASD presented an alteration in the audiovisual integration of emotional information.

Introduction

The ability to recognize emotional expressions is a fundamental cognitive skill for the regulation of interpersonal interactions (Adolphs, 2002, Custrini and Feldman, 1989, Izard et al., 2001). The tone of the voice and the facial expression are two crucial cues that we constantly use to predict others' actions and to react appropriately in a social situation. An important aspect of affect perception in everyday life is that it usually involves, like speech, the activation of several sensory channels simultaneously. Therefore, the combination of information from facial expression (visual signal) and prosody (auditory signal) usually results in a unified and more optimal representation of the expressed emotion (de Gelder et al., 1999, de Gelder and Vroomen, 2000, de Gelder et al., 2005). For example, it has been shown that the multisensory integration (MSI) of these two types of information typically allows for faster and more accurate recognition of emotion expressions in human observers (Collignon et al., 2008, Collignon et al., 2010, de Gelder and Vroomen, 2000, Dolan et al., 2001, Kreifelts et al., 2007, Massaro and Egan, 1996) and in human–machine interfaces (Busso et al., 2004).
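The multisensory speed-up described above is commonly benchmarked against Miller's (1982) race-model inequality, which bounds the bimodal response-time distribution by the sum of the two unimodal distributions; when the observed bimodal distribution exceeds that bound, the two signals were integrated rather than merely racing. A minimal sketch of this test in Python, using hypothetical reaction-time samples rather than the study's data:

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, quantiles=np.linspace(0.05, 0.95, 10)):
    """Compare the bimodal RT distribution against Miller's race-model bound.

    The race model predicts P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
    Returns observed CDF minus the bound at several time points:
    positive entries indicate a race-model violation (coactivation/MSI).
    """
    # Evaluate the CDFs at quantiles of the pooled RT distribution.
    t = np.quantile(np.concatenate([rt_a, rt_v, rt_av]), quantiles)
    cdf = lambda rts, pts: np.mean(rts[:, None] <= pts[None, :], axis=0)
    bound = np.minimum(cdf(rt_a, t) + cdf(rt_v, t), 1.0)  # capped at 1
    return cdf(rt_av, t) - bound

# Hypothetical data: bimodal trials markedly faster than either unimodal set.
rng = np.random.default_rng(0)
rt_a = rng.uniform(450, 650, 200)    # auditory-only RTs (ms)
rt_v = rng.uniform(450, 650, 200)    # visual-only RTs (ms)
rt_av = rng.uniform(300, 420, 200)   # audio-visual RTs (ms)
diff = race_model_violation(rt_a, rt_v, rt_av)
```

With these toy samples, `diff` is positive at the fast quantiles, i.e. the bimodal advantage exceeds what a race between independent channels could produce.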

Deficits in the perception of emotion expressions have been suggested as possible causes of the atypical social and communicative interactions that are a striking part of the behavioral phenotype of autism spectrum disorders (ASD) (Bachevalier and Loveland, 2006, Monk et al., 2010, Sigman et al., 2004). However, the majority of empirical investigations in the field have focused on the facial expression of emotions using static stimuli such as photographs (Bal et al., 2010), with only a few studies using videos representing dynamic facial movements (Golan et al., 2008, Loveland et al., 2008, Loveland et al., 1997). Static stimuli have limited ecological validity and neglect the intrinsically dynamic nature of facial expressions. Indeed, facial movements have been shown to enrich emotional expression, contributing to its identification and playing an important role in the perception of its intensity (Ambadar et al., 2005, Biele and Grabowska, 2006). Also, neuroimaging studies have shown that the brain regions involved in the processing of facial affect, such as the posterior superior temporal sulcus (pSTS), the amygdala and the insula, respond differently to dynamic than to static emotional expressions (Haxby et al., 2000, Haxby et al., 2002, LaBar et al., 2003, Miki et al., 2011). Moreover, only a few studies have explored the processing of affective vocalizations in ASD (Baker et al., 2010, Hall et al., 2003, Loveland et al., 2008, Wang et al., 2007). In most cases, these studies included semantic or lexical confounds in the tasks (Lindner & Rosen, 2006), raising the possibility that the results were influenced by differences in language comprehension (Haviland et al., 1996, Paul et al., 2005).
Finally, most studies investigating the recognition of emotions in autistic individuals explored a single sensory modality at a time, whereas in natural settings, emotions are expressed both facially and vocally, allowing human observers to combine these sources of information for optimal recognition (Collignon et al., 2008, de Gelder et al., 1999, de Gelder and Vroomen, 2000, de Gelder et al., 2005). The use of multisensory conditions to explore the recognition of emotional expressions in ASD is of particular interest since differences in multisensory processing between ASD and typically developing controls (TD) have recently been demonstrated (Collignon et al., 2012, Magnee et al., 2007, 2008, Russo et al., 2010, Russo et al., 2012).

An additional challenge associated with the processing of emotional expressions in natural settings is that the saliency of emotional information in faces and voices is often reduced by environmental noise. In signal processing, noise can be considered unwanted data that does not transmit a signal but is simply a by-product of other activities. For example, the voice of an individual can be masked by noise from other human voices or from surrounding objects. Similarly, a person's facial expression can be partially hidden by an object or because of the angle at which the observer is positioned. The ability of the observer to efficiently extract emotional information from noise therefore appears crucial for effective social interactions, making it relevant to evaluate the perception of emotional expressions in noisy situations (Pelli & Farell, 1999). Some studies have suggested that individuals with ASD have a specific difficulty perceiving speech presented against a noisy background compared to TD (Alcantara et al., 2004, Smith and Bennetto, 2007). To our knowledge, no study has investigated the perception of visual or auditory emotional expressions in noise in ASD.
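To make the signal-to-noise ratio notion concrete: an auditory stimulus can be embedded in noise at a controlled SNR by scaling the noise so that the power ratio hits a target value in decibels. The sketch below is a generic illustration of this operation, not the study's stimulus-generation code:

```python
import numpy as np

def mix_at_snr(signal, noise, snr_db):
    """Return signal + noise, with the noise rescaled so that the
    mixture has the requested signal-to-noise ratio in decibels.

    SNR(dB) = 10 * log10(P_signal / P_noise), where P is mean power.
    """
    p_signal = np.mean(signal ** 2)
    p_noise = np.mean(noise ** 2)
    # Noise power needed to reach the target SNR.
    target_p_noise = p_signal / (10 ** (snr_db / 10))
    return signal + noise * np.sqrt(target_p_noise / p_noise)

# Example: a pure tone masked by Gaussian noise at 0 dB (equal powers).
rng = np.random.default_rng(1)
tone = np.sin(np.linspace(0, 100, 8000))
mixed = mix_at_snr(tone, rng.normal(size=8000), 0.0)
```

Lowering `snr_db` makes the emotion carried by the signal progressively harder to extract, which is the manipulation the noise paradigms above rely on.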

The goal of the present study was therefore to explore the perception and the integration of emotion expressions in ASD by using ecological and validated sets of dynamic visual and non-verbal vocal clips of emotional expressions (Belin et al., 2008, Simon et al., 2008). Participants were asked to categorize expressions of fear or disgust as quickly and accurately as possible when presented with auditory, visual and audio–visual stimuli. This task allowed us to compare the recognition and MSI performance for emotional expressions between ASD and TD. We also compared the unisensory performance of ASD and TD participants by measuring their ability to discriminate emotional expressions presented auditorily and visually at individually adapted levels of noise. Similar paradigms have previously been used successfully to demonstrate that the perception of emotional expressions is a robust multisensory process that follows rules observed in other perceptual domains (Collignon et al., 2008) and to illustrate gender differences in the processing of emotion expressions (Collignon et al., 2010).
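This excerpt does not detail how the individual noise levels were adapted; a standard way to set such levels in psychophysics is an adaptive staircase. As an illustrative sketch only, a two-down/one-up staircase (which converges near the 70.7%-correct threshold) could look like the following, where the `respond` callback and all parameters are hypothetical:

```python
def staircase(respond, start_snr=10.0, step=1.0, n_trials=80):
    """Two-down/one-up adaptive staircase over SNR.

    The SNR is lowered after two consecutive correct responses and
    raised after each error, so the track oscillates around the
    participant's ~70.7%-correct threshold. `respond(snr)` must return
    True when the trial at that SNR is answered correctly.
    Returns the list of SNR values visited, one per trial.
    """
    snr, correct_streak, history = start_snr, 0, []
    for _ in range(n_trials):
        history.append(snr)
        if respond(snr):
            correct_streak += 1
            if correct_streak == 2:   # two correct in a row -> harder
                snr -= step
                correct_streak = 0
        else:                          # any error -> easier
            snr += step
            correct_streak = 0
    return history

# Toy observer: always correct at SNR >= 5 dB, always wrong below.
history = staircase(lambda snr: snr >= 5)
```

With this deterministic toy observer the track descends from the starting SNR and then oscillates between 4 and 5 dB, bracketing the observer's threshold.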

Section snippets

Subjects

Thirty-two autistic participants (30 males; mean age 21 years±6; range 14–32 years) and 18 TD controls (18 males; mean age 21 years±4; range 15–27 years) participated in this study. Participants were recruited from the database of the Rivière-des-Prairies Hospital's autism clinic (Montreal, Canada). ASD participants were defined using DSM-IV-TR diagnostic criteria, as operationalized by the Autism Diagnostic Interview – Revised (ADI-R) (Lord, Rutter, & Le Couteur, 1994) and the Autistic

Results

In all the analyses presented in the main manuscript, data obtained for fear and disgust stimuli are collapsed. Results (and related statistics) obtained for each emotion separately are presented in Supplementary material (SFigs. 6–8).

Discussion

Alterations in the ability to recognize emotional expressions in ASD are often suggested as a possible source of certain atypical social and communicative behaviors that characterize this population. The first aim of this study was to empirically test this hypothesis by exploring the perception of emotion in autistic individuals using ecological and validated sets of dynamic visual and non-verbal vocal clips of emotional expressions. We found a decreased performance in ASD compared to TD for

Acknowledgments

This research was supported in part by the Canada Research Chair Program (ML, FL), the Canadian Institutes of Health Research (AB, ML, FL, GC), the Natural Sciences and Engineering Research Council of Canada (ML, FL) and the research centre of the University Hospital Sainte-Justine (OC). The authors would like to thank Patricia Jelenic for her help with participant recruitment and selection.

References (136)

  • S. Fecteau et al. Amygdala responses to nonlinguistic emotional vocalizations. NeuroImage (2007)
  • J. Grezes et al. A failure to grasp the affective meaning of actions in autism spectrum disorder subjects. Neuropsychologia (2009)
  • N. Hadjikhani et al. Activation of the fusiform gyrus when individuals with autism spectrum disorder view faces. NeuroImage (2004)
  • J.V. Haxby et al. The distributed human neural system for face perception. Trends in Cognitive Sciences (2000)
  • J.V. Haxby et al. Human neural systems for face recognition and social communication. Biological Psychiatry (2002)
  • C.R. Jones et al. Auditory discrimination and auditory sensory behaviors in autism spectrum disorders. Neuropsychologia (2009)
  • B. Kreifelts et al. Audiovisual integration of emotional signals in voice and face: An event-related fMRI study. NeuroImage (2007)
  • H. Lang et al. Amygdaloid after discharge and galvanic skin response. Electroencephalography and Clinical Neurophysiology (1964)
  • J. LeDoux. The amygdala. Current Biology (2007)
  • Y. Liu et al. Autonomy of lower-level perception from global processing in autism: Evidence from brain activation and functional connectivity. Neuropsychologia (2011)
  • C.A. Mangina et al. Direct electrical stimulation of specific human brain structures and bilateral electrodermal activity. International Journal of Psychophysiology (1996)
  • K. Miki et al. Effects of inverting contour and features on processing for static and dynamic face perception: An MEG study. Brain Research (2011)
  • J. Miller. Divided attention: Evidence for coactivation with redundant signals. Cognitive Psychology (1982)
  • K. O'Connor et al. The neurophysiological correlates of face processing in adults and children with Asperger's syndrome. Brain and Cognition (2005)
  • T. Otto et al. Noise and correlations in parallel perceptual decision making. Current Biology (2012)
  • E. Pellicano et al. When the world becomes "too real": A Bayesian explanation of autistic perception. Trends in Cognitive Sciences (2012)
  • E. Pellicano et al. Abnormal global processing along the dorsal visual pathway in autism: A possible mechanism for weak visuospatial coherence? Neuropsychologia (2005)
  • R. Adolphs. Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews (2002)
  • J.I. Alcantara et al. Speech-in-noise perception in high-functioning individuals with autism or Asperger's syndrome. The Journal of Child Psychology and Psychiatry (2004)
  • Z. Ambadar et al. Deciphering the enigmatic face: The importance of facial dynamics in interpreting subtle facial expressions. Psychological Science (2005)
  • K.F. Baker et al. Brief report: Perception and lateralization of spoken emotion by youths with high-functioning forms of autism. Journal of Autism and Developmental Disorders (2010)
  • E. Bal et al. Emotion recognition in children with autism spectrum disorders: Relations to eye gaze and autonomic state. Journal of Autism and Developmental Disorders (2010)
  • E. Brochu-Barbeau et al. The level and nature of autistic intelligence III: Inspection time. Journal of Abnormal Psychology (2013)
  • S. Baron-Cohen et al. Social intelligence in the normal and autistic brain: An fMRI study. European Journal of Neuroscience (1999)
  • P. Belin et al. The Montreal affective voices: A validated set of nonverbal affect bursts for research on auditory affective processing. Behavior Research Methods (2008)
  • M.K. Belmonte et al. Autism and abnormal development of brain connectivity. The Journal of Neuroscience (2004)
  • A. Bertone et al. Enhanced and diminished visuo-spatial information processing in autism depends on stimulus complexity. Brain (2005)
  • C. Biele et al. Sex differences in perception of emotion intensity in dynamic and static facial expressions. Experimental Brain Research (2006)
  • D.H. Brainard. The Psychophysics Toolbox. Spatial Vision (1997)
  • A.B. Brandwein et al. The development of multisensory integration in high-functioning autism: High-density electrical mapping and psychophysical measures reveal impairments in the processing of audiovisual inputs. Cerebral Cortex (2012)
  • M. Braverman et al. Affect comprehension in children with pervasive developmental disorders. Journal of Autism and Developmental Disorders (1989)
  • J. Brock et al. The temporal binding deficit hypothesis of autism. Development and Psychopathology (2002)
  • Busso, C., Deng, Z., Yildirim, S., Bulut, M., Lee, C. M., Kazemzadeh, A., et al., 2004. Analysis of emotion recognition...
  • A.J. Calder et al. Neuropsychology of fear and loathing. Nature Reviews Neuroscience (2001)
  • M.J. Caron et al. Cognitive mechanisms, specificity and neural underpinnings of visuospatial peaks in autism. Brain (2006)
  • G. Celani et al. The understanding of the emotional meaning of facial expressions in people with autism. Journal of Autism and Developmental Disorders (1999)
  • V.L. Cherkassky et al. Functional connectivity in a baseline resting-state network in autism. NeuroReport (2006)
  • O. Collignon et al. Reduced multisensory facilitation in persons with autism. Cortex (2012)
  • H.D. Critchley et al. The functional neuroanatomy of social behaviour: Changes in cerebral blood flow when people with autistic disorder process facial expressions. Brain (2000)
  • R.J. Custrini et al. Children's social competence and nonverbal encoding and decoding of emotions. Journal of Clinical Child Psychology (1989)