Neuropsychologia

Volume 64, November 2014, Pages 105-123

Invited Review
The construct of the multisensory temporal binding window and its dysregulation in developmental disabilities

https://doi.org/10.1016/j.neuropsychologia.2014.08.005

Highlights

  • The review focuses on altered multisensory function in developmental disabilities.

  • Multisensory temporal acuity is altered in autism, dyslexia and schizophrenia.

  • The construct of the multisensory temporal binding window is critical in perception.

  • Perceptual training may have utility in improving multisensory function.

Abstract

Behavior, perception and cognition are strongly shaped by the synthesis of information across the different sensory modalities. Such multisensory integration often results in performance and perceptual benefits that reflect the additional information conferred by having cues from multiple senses providing redundant or complementary information. The spatial and temporal relationships of these cues provide powerful statistical information about how these cues should be integrated or “bound” in order to create a unified perceptual representation. Much recent work has examined the temporal factors that are integral in multisensory processing, with many studies focused on the construct of the multisensory temporal binding window – the epoch of time within which stimuli from different modalities are likely to be integrated and perceptually bound. Emerging evidence suggests that this temporal window is altered in a series of neurodevelopmental disorders, including autism, dyslexia and schizophrenia. In addition to their role in sensory processing, these deficits in multisensory temporal function may contribute importantly to the perceptual and cognitive weaknesses that characterize these clinical disorders. Within this context, a focus on improving the acuity of multisensory temporal function may have important implications for the amelioration of the “higher-order” deficits that serve as the defining features of these disorders.

Introduction

We live in a world rich with information about the events and objects around us. This information comes in a variety of different forms; forms that we generally ascribe to our different senses. Although neuroscience has generally approached the study of sensory processes on a modality-by-modality basis, our perceptual view of the world is an integrated and holistic one in which these sensory cues are blended seamlessly into a singular perceptual Gestalt. Such a multisensory perspective cries out for an intensive investigation of how information from the different senses is combined by the brain to influence our behaviors and shape our perceptions, a field that has emerged over the past 25 years and which is now growing at an impressive pace.

Rather than simply acknowledging the necessity of merging information from the different senses in order to build our perceptual reality, it must also be pointed out that the synthesis of multisensory information confers powerful behavioral and perceptual advantages (for recent reviews see Calvert et al., 2004, King and Calvert, 2001, Stein and Meredith, 1993, Stein et al., 2002a). Indeed, the driving evolutionary forces that undoubtedly led to multisensory systems are the powerful adaptive benefits seen when information is available from more than a single sense. For example, in animal behavior, the presence of cues from multiple senses has been shown to result in improvements in stimulus detection, discrimination and localization that manifest as faster and more accurate responses. In a similar manner, human studies have revealed multisensory-mediated performance benefits in a host of behavioral and perceptual tasks. Several of the more salient of these include the speeding of simple reaction times under paired visual–auditory stimulation and increased intelligibility of a speech signal when presented in a multisensory (i.e., audiovisual) context within a noisy environment (Bishop and Miller, 2009, Erber, 1975, Girin et al., 2001, Grant and Walden, 1996, Grant et al., 1998, MacLeod and Summerfield, 1987, Robert-Ribes et al., 1998, Stevenson and James, 2009, Sumby and Pollack, 1954).

A great deal of work has gone into examining the neural correlates of these multisensory-mediated changes in behavior and perception. These studies have detailed the presence and organization of a number of cortical and subcortical structures within which information from multiple senses converges, and the neural integration that accompanies this convergence in both humans (Amedi et al., 2005, Baum et al., 2012, Bishop and Miller, 2009, Beauchamp et al., 2004, Beauchamp, 2005, Calvert et al., 1999, Calvert et al., 2001, Calvert et al., 2000, Calvert, 2001, Cappe et al., 2009, Cappe et al., 2010, De Gelder et al., 2004, Foxe et al., 2000, Foxe et al., 2002, James et al., 2011, Kim and James, 2010, Kim et al., 2012, Laurienti et al., 2002, Laurienti et al., 2003, Lloyd et al., 2003, Macaluso et al., 2004, Martuzzi et al., 2007, Molholm et al., 2002, Murray et al., 2005, Nath and Beauchamp, 2011, Nath and Beauchamp, 2012, O’Doherty et al., 2004, Powers et al., 2012, Romei et al., 2009, Stevenson et al., 2007, Stevenson and James, 2009, Stevenson et al., 2009, Stevenson et al., 2010, Stevenson et al., 2011, Stevenson et al., 2012, Stevenson et al., 2012a, Wallace and Murray, 2011, Werner and Noppeney, 2009, Werner and Noppeney, 2010a, Werner and Noppeney, 2010b, Werner and Noppeney, 2011) and animals (Allman and Meredith, 2007, Allman et al., 2008, Alvarado et al., 2007, Alvarado et al., 2008, Alvarado et al., 2009, Benevento et al., 1977, Carriere et al., 2007, Jiang et al., 2001, Kadunce et al., 1997, Meredith and Stein, 1986b, Meredith and Stein, 1983, Meredith et al., 1992, Meredith, 2002, Perrault et al., 2005, Royal et al., 2009, Schroeder et al., 2001, Schroeder and Foxe, 2002, Stein and Meredith, 1990, Stein and Meredith, 1993, Stein et al., 1993, Stein and Wallace, 1996, Stein, 1998, Stein et al., 2002b, Stein et al., 2009, Wallace et al., 1992, Wallace et al., 1993, Wallace et al., 1998, Wallace and Stein, 1994, Wallace et al., 1996, Wallace et al., 2006, 
Wallace and Murray, 2011). In addition, a great deal of recent work has gone into describing the modulatory influences that a “non-dominant” modality can have on information processing within the “dominant” modality, such as examining how visual information can affect the processing of sounds within auditory cortex (Hackett and Schroeder, 2009, Zion Golumbic et al., 2013). Indeed, these observations have spurred a debate as to whether or not the entire cerebral cortex (and by extension the entire brain) can be considered “multisensory” (Driver and Noesselt, 2008, Schroeder and Foxe, 2005). Collectively, these studies have greatly illuminated our understanding of how information from the different senses interacts to influence neural and network responses, and how these responses are ultimately correlated with behavior and perception.

Along with detailing how neuronal, behavioral and perceptual responses are altered under multisensory conditions, prior work has also revealed key operational characteristics regarding these multisensory interactions. Perhaps most important among these is the general finding that the physical characteristics of the stimuli to be combined are important determinants of the end product of a multisensory interaction. First studied at the level of the individual neuron, these stimulus factors include the characteristics of space, time and effectiveness. With regard to space and time, multisensory (e.g., visual–auditory) stimuli that are spatially and temporally proximate typically result in the largest enhancements in neuronal response (Meredith et al., 1987, Meredith et al., 1992, Meredith and Stein, 1986a, Meredith and Stein, 1996, Royal et al., 2009, Wallace et al., 1996, Wallace et al., 2004). In addition, stimuli that are weakly effective when presented on their own result in proportionately larger enhancements when combined (James et al., 2012, Meredith and Stein, 1983, Stein et al., 2009). These basic integrative principles make a great deal of intuitive sense in that space and time are powerful statistical indicators of the likelihood that stimuli arise from the same event, and in that a highly-salient or effective stimulus in one modality needs little amplification. Recent work has added to our understanding of the role that these stimulus factors play in multisensory interactions by highlighting their interdependency (Carriere et al., 2008, Ghose and Wallace, 2014, Krueger et al., 2009, Royal et al., 2009, Sarko et al., 2012, Sarko et al., 2013). Thus, one cannot view space, time and effectiveness as independent entities, since manipulations of one, for example spatial location, will also impact the effectiveness of those stimuli and the temporal firing patterns associated with them.
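These neuronal principles are often quantified with the interactive index from the single-unit literature: the multisensory response relative to the largest unisensory response, expressed as a percentage. A minimal sketch of inverse effectiveness using this index (the response values below are invented for illustration, not measured data):

```python
def enhancement(multisensory, best_unisensory):
    """Multisensory enhancement (%): gain of the multisensory response
    over the largest unisensory response."""
    return 100.0 * (multisensory - best_unisensory) / best_unisensory

# Inverse effectiveness: weakly effective unisensory stimuli yield
# proportionately larger multisensory gains. Spike counts are
# illustrative only.
weak_gain   = enhancement(multisensory=6.0,  best_unisensory=2.0)   # 200%
strong_gain = enhancement(multisensory=22.0, best_unisensory=20.0)  # 10%
```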

Following the description of these principles at the neuronal level, a number of studies have followed up on this work in the behavioral and perceptual realms, and have shown that these principles often extend into these domains as well. Thus, behavioral and perceptual facilitations have been shown to be greatest for stimuli that are close together in space and time (Alais et al., 2010, Bertelson and Radeau, 1981, Colonius and Diederich, 2004, Colonius and Diederich, 2010, Colonius and Diederich, 2011, Colonius et al., 2009, Conrey and Pisoni, 2006, Diederich and Colonius, 2004, Diederich and Colonius, 2007, Diederich and Colonius, 2009, Diederich et al., 2003, Dixon and Spitz, 1980, Frens et al., 1995, Hay-McCutcheon et al., 2005, Hay-McCutcheon et al., 2009, Hillock et al., 2011, Hillock-Dunn and Wallace, 2012a, Keetels and Vroomen, 2005, Keetels and Vroomen, 2007, Keetels and Vroomen, 2008a, Keetels et al., 2007, Kuling et al., 2012, Lewald et al., 2001, Lewkowicz, 1996a, Lewkowicz, 2000b, Massaro et al., 1996, Neil et al., 2006, Powers et al., 2009, van Atteveldt et al., 2007, van Wassenhove et al., 2007, Wallace et al., 2004, Zampini et al., 2005), and the proportional benefits of combining stimuli across different modalities appear to be greatest when the individual stimuli are weakly effective (Diederich and Colonius, 2004, Kim and James, 2010, Kim et al., 2012, Nath and Beauchamp, 2011). In addition, and much like for the neuronal data described above, recent studies have also illustrated the interdependency of these principles in human performance and perception (Macaluso et al., 2004, Royal et al., 2009, Stevenson et al., 2012c).

One area of very active research is the applicability of these principles for describing all aspects of human performance and perception. Although first driven by studies showing exceptions to the spatial, temporal and effectiveness principles described above, more recent thinking is converging toward a more dynamic and contextual view of the applicability of these principles (Doehrmann and Naumer, 2008, van Atteveldt et al., 2014). In addition to illustrating the flexibility inherent in multisensory processes, there are strong suggestions as to the mechanistic underpinnings of such adaptive networks and integrative processes, including oscillatory phase resetting and divisive normalization (van Atteveldt et al., 2014). As a more concrete example, in the context of a task in which temporal factors are relatively unimportant (e.g., stimulus or target localization), it is expected that there would be less (if any) weighting placed on the temporal structure of the stimulus complex. Thus, current thinking invokes a flexibly specified set of interactive rules or principles tightly related to task performance that ultimately dictate the final product of a multisensory stimulus combination.

In the current review, we have chosen to focus on temporal factors, in large measure because of the recent accumulation of evidence that has outlined how multisensory temporal function changes during typical development, and because of the growing acknowledgment that multisensory temporal acuity is altered in a number of neurodevelopmental disabilities – three of which, autism, dyslexia and schizophrenia, are highlighted in this review. Although this review is framed from the perspective of temporal function for these reasons, we must point out that, as alluded to above, both space and spatiotemporal factors are powerful players in the construction of our multisensory perceptual Gestalt. Indeed, much work has focused on describing how these spatial and spatiotemporal factors influence multisensory interactions at the neural, behavioral and perceptual levels (Bell et al., 2001, Diederich et al., 2003, Frens et al., 1995, Harrington and Peck, 1998, Macaluso et al., 2004, Meredith and Stein, 1986c, Meredith and Stein, 1996, Neil et al., 2006, Royal et al., 2009, Sarko et al., 2012, Stevenson et al., 2012b, Teder-Salejarvi et al., 2005a, Van Wanrooij et al., 2009, Wallace et al., 2004, Zampini et al., 2007), and any accounting of multisensory function is necessarily incomplete without acknowledgment of the important role these factors play as “filters” for multisensory systems.

The concept of temporal factors, originally defined on the basis of the temporal tuning functions of individual multisensory neurons (Fig. 1A) (Meredith et al., 1987), has been expanded to capture the effects of temporal factors on human psychophysical performance. Although the temporal properties of human performance very much resemble their neuronal counterparts (Fig. 1B) (Dixon & Spitz, 1980), when placed in the context of a judgment about the unity of an audiovisual stimulus complex (i.e., did they occur at the same time or not), they point to a thresholding process in which the observer must make a probabilistic judgment about the nature of the stimulus complex. More concretely, in the example shown in Fig. 1b, the subject is making judgments about the simultaneity of a visual–auditory stimulus pair that is presented at varying stimulus onset asynchronies (SOAs) or delays. Note that when the stimuli are objectively simultaneous (i.e., an SOA of 0 ms), the subject has a high probability of correctly reporting this simultaneity. However, even with delays of a hundred milliseconds or more, the subject still reports on a high percentage of trials that the stimuli are simultaneous. Such a broad interval within which simultaneity continues to be reported suggests a degree of temporal tolerance for stimulus asynchrony, in essence creating a “window” of time within which multisensory stimuli are highly likely to be perceptually bound or integrated (Conrey and Pisoni, 2004, Conrey and Pisoni, 2006, Diederich and Colonius, 2009, Hairston et al., 2003, Hairston et al., 2005, Stevenson and Wallace, 2013, Stevenson et al., 2014, van Wassenhove et al., 2007, van Eijk et al., 2008).
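To make the simultaneity-judgment procedure concrete, the sketch below operationalizes the window as the span of SOAs over which the proportion of “simultaneous” reports exceeds a criterion — one of several definitions used in this literature. The data, the 75% criterion, and the SOA convention (negative values = auditory leading) are illustrative assumptions, not values drawn from the studies cited:

```python
def tbw_bounds(soas, p_sync, criterion=0.75):
    """Return (left, right, width) of the SOA interval (ms) where the
    linearly interpolated proportion of "simultaneous" responses
    exceeds `criterion`."""
    crossings = []
    for (x0, y0), (x1, y1) in zip(zip(soas, p_sync),
                                  zip(soas[1:], p_sync[1:])):
        if (y0 - criterion) * (y1 - criterion) < 0:  # sign change -> crossing
            crossings.append(x0 + (criterion - y0) * (x1 - x0) / (y1 - y0))
    left, right = min(crossings), max(crossings)
    return left, right, right - left

# Illustrative responses: the window is broader when audition lags
# vision, consistent with the typical audiovisual asymmetry.
soas   = [-300, -200, -100, 0, 100, 200, 300]   # ms; negative = audio first
p_sync = [0.05, 0.25, 0.80, 0.95, 0.90, 0.60, 0.10]

left, right, width = tbw_bounds(soas, p_sync)
# here left is roughly -109 ms, right is +150 ms
```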

This construct of a multisensory temporal binding window (TBW) is highly adaptive, in that it allows multisensory information to be bound even when it originates at differing distances from the subject. The biological utility of this is grounded in the substantial differences in the propagation times for visual and auditory energy. Consider a visual–auditory event happening 1 m from you vs. 34 m away. In the first case, the arrival of the visual and auditory energies to the eye and ear is nearly simultaneous, whereas in the second circumstance the auditory information arrives at the ear approximately 100 ms after the visual information impinges on the eye (sound travels at about 340 m/s). Additional evidence for the importance of these biological delays can be seen through measures of the point of subjective simultaneity (PSS), the exact temporal offset (measured at the sensory organ) at which an individual is most likely to perceive two inputs as synchronous. On initial thinking, one would expect the PSS to be at 0. However, the PSS in most individuals is typically observed when the auditory component of a stimulus pair slightly lags the visual stimulus component [for review, see van Eijk et al. (2008)].
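The distance arithmetic above can be checked directly. In this sketch the speeds are the approximate values used in the text, and light's travel time is included only to show that it is negligible at these scales:

```python
SPEED_OF_SOUND = 340.0   # m/s, approximate value used in the text
SPEED_OF_LIGHT = 3.0e8   # m/s

def audio_lag_ms(distance_m):
    """Auditory-minus-visual arrival delay (ms) for an audiovisual
    event at the given distance from the observer."""
    return (distance_m / SPEED_OF_SOUND
            - distance_m / SPEED_OF_LIGHT) * 1000.0

# audio_lag_ms(1.0) is about 2.9 ms; audio_lag_ms(34.0) is about 100 ms,
# matching the example of an event 34 m away.
```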

In recent years a number of salient characteristics about this TBW have been discovered (Fig. 2). First, the window differs in size for different stimuli, being smallest for simple audiovisual stimulus pairs such as flashes and beeps, intermediate for more complex environmental stimuli such as a hammer hitting a nail, and largest for the most complex of naturalistic multisensory stimuli – speech (de Boer-Schellekens et al., 2013, Stevenson and Wallace, 2013, Stevenson et al., 2014a, van Eijk et al., 2008, Vatakis and Spence, 2006). Second, the TBW exhibits a marked degree of variability from subject to subject (Stevenson, Zemtsov, & Wallace, 2012). Third, the TBW continues to mature late into development, remaining broader than in adults well into adolescence (Hillock et al., 2011, Hillock-Dunn and Wallace, 2012a). Finally, the TBW has been shown to be malleable in multiple ways, both adjusting to the temporal statistics of the environment (recalibration; Fujisaki et al., 2004, Keetels and Vroomen, 2007, Keetels and Vroomen, 2008b, Navarra et al., 2005, Stetson et al., 2006, Vroomen et al., 2004) and narrowing substantially with feedback-based perceptual training (Powers et al., 2009, Powers et al., 2012, Stevenson et al., 2013, Schlesinger et al., 2014).

Collectively, these studies point to the multisensory TBW as an important component of our perceptual view of the world, structured to make strong statistical inferences about the likelihood that multisensory stimuli originate from the same object or event. As highlighted below, individual differences in this window and alterations in its size are likely to have important implications for the construction of our perceptual (and cognitive) representations.

As alluded to earlier, much of the foundational work in regards to the neural correlates of multisensory function, including its temporal constraints, has come from a midbrain structure, the superior colliculus (SC). The primary role of the SC is in the initiation and control of gaze (i.e., combined eye and head) movements to a stimulus of interest. Following the principles described earlier, these movements are facilitated (i.e., are generally faster and more accurate) to multisensory stimuli that are spatially and temporally proximate (Colonius and Diederich, 2004, Diederich et al., 2003, Diederich and Colonius, 2004, Diederich and Colonius, 2008, Frens et al., 1995, Frens and Van Opstal, 1998, Hughes et al., 1994, Hughes et al., 1998, Stein et al., 1989). However, for perceptual judgments such as the evaluations of simultaneity described earlier, it is unlikely that the SC plays a major role. Rather, these perceptual (as opposed to sensorimotor) processes appear to be the province of cortical regions likely to play a central role in stimulus “binding.”

One of the central cortical hubs for the processing of audiovisual timing relations appears to be the cortex surrounding the posterior superior temporal sulcus (pSTS). The pSTS is well positioned for this role in that it lies at the junction between occipital (visual) and temporal (auditory) cortex, and it receives substantial convergent input from visual and auditory cortical domains. Moreover, the pSTS is differentially active during the presentation of synchronous versus asynchronous audiovisual stimulus pairs, suggesting an important role in evaluations of audiovisual timing (Macaluso et al., 2004, Miller and D’Esposito, 2005, Powers et al., 2012, Stevenson et al., 2010, Stevenson et al., 2011). The pSTS has also been shown to signal the perceptual binding of an audiovisual stimulus pairing, responding more efficiently to a pairing of identical temporal relations that is perceived as a single event when compared to one that is perceived to be two distinct events (Stevenson et al., 2011). An additional piece of evidence in support of a central role for the pSTS in multisensory temporal function is that following perceptual training that narrows the TBW, activity changes as indexed by fMRI are seen in a cortical network centered on the pSTS (Powers et al., 2012) (Fig. 3). Finally, numerous studies have shown the pSTS to be an important site for the processing of audiovisual speech cues (Baum et al., 2012, Beauchamp et al., 2010, Bishop and Miller, 2009, Nath and Beauchamp, 2011, Nath and Beauchamp, 2012, Nath et al., 2011, Stevenson et al., 2009), including work that has shown that deactivation of the pSTS via transcranial magnetic stimulation (TMS) can abolish the McGurk illusion – in which the pairing of discordant visual and auditory speech tokens typically results in a novel fused percept (Beauchamp et al., 2010).
Collectively, these studies point to the pSTS as a key node for multisensory convergence and integration, and for the evaluation of temporal factors in the perceptual determination of stimulus binding.

Somewhat surprisingly, although we know a great deal about the characteristics, function and behavioral/perceptual correlates of multisensory integration in the adult, these processes have been far less well characterized during development. Animal model studies have shown that multisensory neurons and their associated integrative properties mature over a protracted period of developmental life that extends well into “adolescence” (Wallace and Stein, 1997, Wallace et al., 1997, Wallace, 2004a, Wallace, 2004b, Wallace et al., 2006). In addition, these studies have shown remarkable plasticity in the development of these processes, such that changes in the statistical structure (i.e., spatial and temporal stimulus relations) of the early sensory world result in the development of integrative properties that match these statistics (Carriere et al., 2007, Polley et al., 2008, Wallace and Stein, 2000, Wallace et al., 2004).

Much of the work that has examined multisensory function in human development has focused on the period soon after birth. These studies have shown a beautiful sequential development in the infant's ability to evaluate (and likely bind) multisensory relations, with the capacity to evaluate simple features of a multisensory stimulus complex (e.g., duration) maturing prior to the ability to evaluate more complex features (e.g., rhythm) (Lewkowicz and Lickliter, 1994, Lewkowicz and Ghazanfar, 2009, Lewkowicz, 2014). Examples that are most germane to the temporal dimension, the focus of the current review, include the findings that infants begin life with much larger temporal binding windows for both audiovisual non-speech and speech stimuli (with those for speech being longest (Lewkowicz, 1996b, Lewkowicz, 2000a, Lewkowicz, 2010)). In addition, it has been found that the window for speech stimuli does not begin to narrow until around 5 years of age (Lewkowicz & Flom, 2014). More recent work from our group has shown that these developmental processes continue to mature well into older ages. Thus, we have shown that the multisensory TBW remains larger than for adults well into adolescence (Fig. 4) (Hillock et al., 2011, Hillock-Dunn and Wallace, 2012b). Intriguingly, this enlarged window appears to depend on the nature of the stimuli that are being combined. Thus, whereas the window appears larger for the pairing of simple low-level visual and auditory stimuli (i.e., flashes and beeps), it is of normal size in adolescents for more complex speech-related stimuli.

Although far from providing a comprehensive characterization of how multisensory processes develop in the period leading up to adulthood, these studies have illustrated the long developmental interval over which these processes mature, and the marked plasticity that characterizes the maturation of multisensory function. With this as a backdrop, it should come as little surprise to see that multisensory abilities are frequently altered in the context of developmental disabilities.

As we have seen, the ability of individuals to perceptually bind sensory information allows for significant behavioral benefits and serves to create a coherent and unified perception of the external world. If these processes develop in an atypical manner, then it should come as little surprise that detrimental behavioral, perceptual and cognitive consequences result. Here, we will discuss such atypical multisensory function in the context of three developmental disabilities: autism spectrum disorders, dyslexia, and schizophrenia. In each case, we will highlight the current behavioral and perceptual evidence for atypical multisensory temporal processing, describe the evidence for the possible neural correlates of these dysfunctions, and outline areas in which further work is needed.

Autism spectrum disorders (ASD) make up a constellation of neurodevelopmental disabilities characterized by deficits in social communicative skills and by the presence of restricted interests and/or repetitive behaviors. The most recent evidence suggests that the prevalence of ASD may be as high as 1 child in 88 (Autism and Developmental Disabilities Monitoring Network, 2012), making it a substantial public health problem with large societal and economic costs. Although initially characterized and diagnosed on the basis of deficits in a “triad” of domains – language and communication, social reciprocity and restricted/repetitive interests – the presence of sensory deficits is now widely acknowledged, warranting their inclusion in the recent revision of the DSM (American Psychiatric Association, 2013).

The challenges in describing and defining sensory dysfunction in the context of ASD have arisen in part because of the enormous heterogeneity in these changes – ranging from striking hyporesponsivity and underreactivity to sensory stimuli, to hyperresponsivity, and from sensory aversions to sensory-seeking behaviors (American Psychiatric Association, 2013). Despite this phenotypic variability, the fact that upwards of 90% of children with autism show some form of sensory alteration suggests that it is a core component of autism.

One of the great challenges with assessing the nature of these sensory changes has been that the overwhelming majority of the data has come from anecdotal evidence, caregiver reports, or self-report survey measures, limiting the ability to have a comprehensive and empirically grounded picture of the nature of these changes. This is currently changing as a number of studies are beginning to provide a more objective and systematic view into sensory function in autism. This work has served to bolster the more subjective reports, reinforcing the presence of processing deficits in a number of sensory modalities, including vision (Bertone et al., 2005b, Davis et al., 2006, Frey et al., 2013, Greenaway et al., 2013, Kern et al., 2006, Kern et al., 2007, Kroger et al., 2013, Marco et al., 2011, Pellicano et al., 2005, Simmons et al., 2009, Spencer and O’Brien, 2006, Takarae et al., 2008, Tsermentseli et al., 2008, Wainwright-Sharp and Bryson, 1993), audition (Gage et al., 2003, Gandal et al., 2010, Groen et al., 2009, Kujala et al., 2013, Kwakye et al., 2011, Lepisto et al., 2005, Marco et al., 2011, Russo et al., 2008, Russo et al., 2010, Roberts et al., 2010, Szelag et al., 2004, Teder-Salejarvi et al., 2005b, Visser et al., 2013), and touch (Cascio, 2010, Cascio et al., 2012, Foss-Feig et al., 2012).

However, and seemingly at odds with this evidence, a number of these studies have also revealed the presence of normal or even enhanced sensory function in certain children and in certain domains (Almeida et al., 2013, Bertone et al., 2005a, Bonnel et al., 2003, Bonnel et al., 2010, Chamberlain et al., 2013, Chen et al., 2012, Falter et al., 2012, Falter et al., 2013, Jarrold et al., 2005, Joseph et al., 2009, Manjaly et al., 2007, Mottron et al., 2000, O’Riordan and Plaisted, 2001, O’Riordan, 2004, O’Riordan and Passetti, 2006, O’Riordan et al., 2001, Plaisted et al., 1998a, Plaisted et al., 1998b, Samson et al., 2012, Smith and Milne, 2009, Stanutz et al., 2014). Although initially enigmatic, these normal or improved abilities appear to be restricted to tasks that tap into low-level sensory function or require extensive local (as opposed to global) processing, suggesting that early sensory processing and the neural architecture that subserves it may be preserved (or even enhanced) in the autistic brain. This finding fits within the hypothetical framework that in autism local cortical organization and connectivity are preserved, but processes that rely upon communication across brain networks are impaired (see model section below for more detail). As an elegant example of this, Bertone et al. (2005a) found that in a visual grating orientation task in which the gratings were specified by luminance, children with autism outperformed typically developing children. In contrast, when the gratings were specified by changes in texture rather than luminance, the children with autism performed more poorly. Whereas the neural mechanisms for determining orientation from luminance are believed to be in primary visual cortex (V1), the mechanisms for deriving orientation from texture are believed to reside at later processing stages within the visual hierarchy.
This example highlights evidence in support of but one of the many neurobiologically-inspired models for describing autism and the associated changes in sensory function.

A multitude of brain-based theories of autism have been put forth, each with varying degrees of supporting evidence. Several of the more prominent of these, described briefly in this section, have been used to explain differences in sensory function in ASD (along with the more widely established changes in social communicative function).

The concept of weak central coherence is closely related to the observations described above, in that it suggests that communication across brain networks is preferentially impaired in autism (Frith and Happe, 1994, Happe, 1999, Happe and Frith, 2006). In its simplest form, the concept suggests strong deficits in holistic or “Gestalt” processing, in which individuals with autism have striking difficulties in the processing of global features, but in which the processing of local features is relatively intact or even enhanced. One of the hallmark tests used to differentiate local vs. global processing is the so-called embedded figures test, in which participants are asked to report on the number of simple shapes (e.g., triangles) contained within a larger image (e.g., the drawing of a clock). Numerous studies have shown that individuals with ASD outperform typically developing individuals on this task, but these studies disagree on the nature of the global deficits it reveals (Bolte et al., 2007, Jolliffe and Baron-Cohen, 1997, Mottron et al., 2003, Shah and Frith, 1983). In many respects, weak central coherence can be subsumed within ideas of autism as a functional disconnection syndrome or a connectopathy, in which the core deficits are founded in weaknesses in connectivity across brain networks, weaknesses that have been documented in both structural and functional connectivity studies (Geschwind and Levitt, 2007, Just et al., 2004, Just et al., 2007, Melillo and Leisman, 2009). Although framed at a different level, these changes in network function can also be seen as a result of changes in the excitatory/inhibitory balance, another prevailing model concerning the pathophysiology of autism (Rubenstein & Merzenich, 2003). In this model, the core deficit in autism is a disruption of the carefully balanced ratio of excitation and inhibition within and across brain networks, a disruption that can have dramatic effects on network communication and the associated functional correlates.
Another emerging model in autism suggests an important role for increased noise, or a degraded signal-to-noise ratio, in the etiology of the disorder (Dinstein et al., 2012, Jones et al., 2010, Milne, 2011, Perez Velazquez and Galan, 2013). The presence of increased noise (which could come from a number of sources) would degrade the quality of information processing, with increasing effects as one ascends the information-processing hierarchy and thus taps greater and greater integrative abilities (since the noise would be cumulative). Finally, the temporal binding deficit hypothesis posits timing-related deficits as a core feature of autism (Brock et al., 2002). Indeed, temporal integration is a core feature of processing within all sensory systems, and disruptions in timing-related circuits could give rise to supramodal or multisensory processing deficits. Although these theories have been espoused by different groups at different times, there are striking similarities among them that suggest marked commonalities and shared mechanistic relations. As just one example, the temporal deficit described above could be the result of alterations in connectivity, excitatory/inhibitory balance and/or noisy sensory and perceptual encoding.

The prevalence of observations highlighting deficits in multiple sensory systems, coupled with evidence that integrative functions across brain networks may be preferentially impacted, has led to an examination of the role that multisensory dysfunction may play in autism (Stevenson et al., 2014b). Although, as highlighted above, there is now clear evidence for changes in function within the individual sensory systems, this work is predicated on the view that these unisensory deficits may not completely capture the nature of changes in the processes that index integration across the different sensory systems. In recent years, a number of labs, including our own, have attempted to provide a better view into the nature of these changes in multisensory function in those with autism.

To date, the picture that has been generated by these studies has been a complex and confusing one. Thus, although a number of studies have reported changes in multisensory function that extend beyond those predicted on the basis of changes in unisensory function, others have found either normal multisensory abilities, or deficits that can be completely explained based on unisensory performance. One of the best illustrations of this complexity is in work that has explored the susceptibility of individuals with autism to the McGurk effect – the perceptual illusion that indexes the synthesis of visual and auditory speech signals (McGurk & MacDonald, 1976). Whereas some groups have found weaknesses in this perceptual fusion (Bebko et al., 2013, de Gelder et al., 1991, Irwin et al., 2011, Mongillo et al., 2008, Stevenson et al., 2014c, Taylor et al., 2010, Williams et al., 2004), others have found normal McGurk percepts (Iarocci et al., 2010, Woynaroski et al., 2013) or changes in McGurk reports that are accounted for by changes in responsiveness to the visual or auditory speech tokens (Mongillo et al., 2008). The likely explanations for the substantial disparities across studies include differences in the composition of the ASD cohort (with age and severity of symptoms being significant factors) and differences in how the specific tasks are structured. Thus, even for the McGurk effect, different stimuli and response modes have been used to assay the illusion.

Despite this confusion, one of the more robust findings in autism is poorer multisensory temporal acuity – a finding that typically manifests as a broadening of their multisensory TBW (Bebko et al., 2006, de Boer-Schellekens et al., 2013, Foss-Feig et al., 2010, Kwakye et al., 2011, Stevenson et al., 2014a). In addition to their concordance with the general finding of sensory changes in children with autism, these results are also in agreement with a substantial body of evidence pointing to deficits in timing or temporally-based processes in autism. Indeed, these deficits have been encapsulated within one of the neurobiologically-inspired theories of autism described earlier—namely the temporal binding deficit hypothesis (Brock et al., 2002).

Changes in multisensory temporal function in autism have been found using a number of different tasks, including simultaneity judgments (Stevenson et al., 2014a), temporal order judgments (de Boer-Schellekens et al., 2013, Kwakye et al., 2011), the perception of the sound-induced flash illusion (Foss-Feig et al., 2010), and preferential looking tasks (Bebko et al., 2006) (Fig. 5). In each of these studies, the basic finding is that individuals with ASD perceive paired visual–auditory stimuli as originating from the same event over longer time intervals than control groups do (i.e., they report simultaneity even when the stimuli are substantially asynchronous). One interesting, and to date unresolved, difference between these studies is whether the TBW is extended for all types of visual–auditory stimuli, or only for specific stimulus types more closely related to the well-established domains of weakness (e.g., speech). Thus, whereas much work supports differences only for speech-related stimuli (Bebko et al., 2006, Stevenson et al., 2014a), other studies suggest more generalized temporal deficits that extend to pairs of very simple stimuli (i.e., flashes and beeps) (de Boer-Schellekens et al., 2013). Although future work will need to resolve these differences, it is important to point out here that although there are likely to be commonalities in the brain networks supporting multisensory (or at least audiovisual) temporal function, there are also likely to be separate mechanisms governing the integration of low- vs. higher-level audiovisual stimuli.
For example, whereas the integration of lower-level flashes and beeps (which can be considered an "arbitrary" pairing) is unlikely to involve brain regions concerned with contextual or semantic congruence (another important facet of multisensory binding), the integration of higher-level stimuli such as object or speech cues will also entail activation in network components performing such contextual computations.
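In the simultaneity-judgment studies cited above, the TBW and the point of subjective simultaneity (PSS) are typically estimated by fitting a psychometric curve to the proportion of "simultaneous" reports across stimulus onset asynchronies (SOAs). The following is a minimal sketch with invented data and a conventional Gaussian model; it illustrates the general logic, not the specific fitting procedure used in any of the cited studies.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented simultaneity-judgment data: stimulus onset asynchronies (ms;
# negative = auditory leads visual) and proportion of "simultaneous" reports.
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], dtype=float)
p_sim = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.90, 0.60, 0.25, 0.10])

def gaussian(soa, amplitude, pss, sigma):
    """Gaussian psychometric model: peaks at the PSS, width set by sigma."""
    return amplitude * np.exp(-((soa - pss) ** 2) / (2.0 * sigma ** 2))

# Fit the model; initial guesses: full amplitude, PSS near 0 ms, ~150 ms width.
params, _ = curve_fit(gaussian, soas, p_sim, p0=[1.0, 0.0, 150.0])
amplitude, pss, sigma = params

# One common convention takes the TBW as the full width at half maximum of
# the fitted curve (the sign of sigma is arbitrary in this model, hence abs).
tbw_width = 2.0 * abs(sigma) * np.sqrt(2.0 * np.log(2.0))

print(f"PSS = {pss:.0f} ms, TBW width = {tbw_width:.0f} ms")
```

Note that with these invented data the fitted PSS falls at a small auditory lag, mirroring the asymmetry typically reported in the literature; a broadened TBW, as reported in ASD, would appear here as a larger fitted width.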

Why would an enlarged TBW in autism necessarily be a bad thing? One reason is that the temporal fidelity or tuning of multisensory systems determines which stimuli should be bound and which should not. The binding of stimuli over longer temporal intervals is likely to result in the creation of poor or "fuzzy" multisensory perceptual representations in which there is a great deal of ambiguity about stimulus identity. Typically, the temporal relationship between two sensory inputs is an important cue as to whether those inputs should be bound. When the perception of this temporal relationship is less acute, subjective temporal synchrony loses its reliability as a cue to bind. The end result of losing such a salient cue and important piece of information is that the individual shows weaker binding overall. In support of this idea is recent work from our laboratory that has illustrated a strong relationship between the multisensory TBW and the strength of perceptual binding (Stevenson et al., 2012, Stevenson et al., 2014a). In this work, the width of the TBW was found to be strongly negatively correlated with the rate of perceptual fusions as indexed by the McGurk effect (Fig. 5). This finding lends strong support to the linkage between multisensory temporal function and the creation of perceptual representations, an area of inquiry that we believe will be extremely informative moving forward.
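At its core, the negative relationship between TBW width and McGurk fusion described above is a simple across-participant correlation. The sketch below uses simulated (not real) data with a built-in negative relationship simply to show the form of the analysis:

```python
import numpy as np
from scipy.stats import pearsonr

# Simulated (not real) per-participant data: TBW width in ms from a
# simultaneity-judgment task, and the proportion of McGurk fusion responses.
rng = np.random.default_rng(0)
n = 30
tbw_width = rng.uniform(150, 600, size=n)  # ms; wider = poorer temporal acuity
noise = rng.normal(0.0, 0.05, size=n)
# Built-in negative relationship: wider windows -> fewer fusion percepts.
fusion_rate = np.clip(0.9 - 0.001 * tbw_width + noise, 0.0, 1.0)

r, p = pearsonr(tbw_width, fusion_rate)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")
```

With real data the same analysis would simply take each participant's fitted TBW width and fusion rate; a reliably negative r is the pattern the cited studies report.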

In our view, the importance of the work cited above extends well beyond the links that have currently been established. Sensory, and by extension multisensory, processes form the building blocks upon which perceptual and cognitive representations are created. These input streams are crucial not only for the "maps" that form the cornerstone of early subcortical and cortical sensory representations, but also for so-called higher-order processes that are dependent on the integrity of the information within the incoming sensory streams. Such a framework predicts that changes in sensory and multisensory processes will have cascading effects upon the information processing hierarchy, ultimately impacting cognitive domains such as attention, executive function, language and communication and social interactions (Bahrick and Lickliter, 2000, Bahrick and Lickliter, 2003, Bahrick, 2010, Bahrick and Todd, 2011, Stevenson et al., 2014a).

Focusing on the social and communicative pieces because of their relationship to autism, it must be acknowledged that both are highly dependent not only upon sensory information, but also upon the integration of information across the different sensory modalities (Stevenson et al., 2014a). Language and communicative functions are highly multisensory, depending not only upon the auditory channel but also upon associated visual cues such as articulatory gestures that provide vital information for the comprehension of the speech signal (particularly in noisy environments – see Bishop and Miller, 2009, Erber, 1975, Grant et al., 1998, Grant and Walden, 1996, Girin et al., 2001, MacLeod and Summerfield, 1987, Robert-Ribes et al., 1998, Sumby and Pollack, 1954, Stevenson and James, 2009). In a similar fashion, the interpretation of social cues is keenly dependent upon multisensory processes. Inflections of the voice, facial gestures, and touch convey important social information that must be properly integrated in order to fully understand the content of the social setting.

Although intuitively appealing, much work needs to be done in order to establish these critical links between sensory and multisensory function and higher-order abilities. Indeed, ongoing work in our laboratory is using large-scale correlational matrices in order to identify important associations between a battery of sensory and multisensory tasks that we now routinely use, and a host of measures of cognitive abilities. In an associated manner, we have recently examined multisensory speech perception in a cohort of ASD and typically-developing (TD) children between the ages of 8 and 17 (Stevenson et al., 2014a). Consistent with our prior work, ASD children showed an increased width to their multisensory TBW, as well as differences in the degree to which they fused concordant audiovisual speech stimuli. Children with ASD also exhibited a strong relationship between the strength of their perceptual binding on concordant audiovisual trials and the communication subscore of the ADOS, with lower (i.e., more typical) scores being associated with greater binding (Woynaroski et al., 2013) (Fig. 5). Thus, the temporal acuity of individuals’ multisensory binding is directly correlated with their abilities to integrate audiovisual speech, and the correlation between multisensory temporal processing and ADOS communication scores suggests that this relationship may extend into clinical manifestations of ASD. Although this work suggests important links between some of the key diagnostic features of autism and multisensory function, much more needs to be done in order to fully elucidate the nature of these relationships.

In addition to the recent data linking multisensory temporal acuity, speech integration and ADOS communication scores in ASD populations, ongoing research has begun to examine how these relationships map onto autistic-like traits in the general population (referred to as the broader or extended phenotype). Autistic traits are found to varying degrees in the population at large, and can be indexed through scales such as the Autism-Spectrum Quotient (AQ; Baron-Cohen et al., 2001) or the Broad Autism Phenotype Questionnaire (Hurley et al., 2007). These traits can then be correlated with any number of perceptual measures. For example, a recent study by Donohue, Darling, and Mitroff (2012) showed that the point of subjective simultaneity (PSS) varies relative to the (non-clinical) level of autistic traits an individual exhibits. The PSS, the point in time at which an individual perceives a visual and an auditory event to be absolutely synchronous, tends to be observed when the auditory stimulus component slightly lags the visual component, reflecting the statistics of the natural environment (i.e., auditory information travels more slowly than visual information). Individuals showing greater levels of autistic traits, however, tend to have PSS measurements closer to absolute synchrony, reflecting a decrease in adaptation to the statistics of the external environment.

As described earlier, the cortex of the posterior superior temporal sulcus (pSTS) has been implicated as a major node in the computation of multisensory temporal relations. Hence, with the wealth of evidence suggestive of alterations in multisensory temporal function with autism, a logical biological basis for these differences would be changes in the structure and/or function of pSTS. Indeed, some of the most characteristic structural alterations in the brains of those with autism are differences in gray and white matter associated with the pSTS (Boddaert et al., 2004, Boddaert et al., 2009, Levitt et al., 2003, Zilbovicius et al., 2000, Zilbovicius et al., 2006). Furthermore, a number of functional studies (i.e., fMRI) have pointed to differences in the activation patterns within pSTS in autism, as has work looking at both functional and structural connectivity of the pSTS (Amaral et al., 2008, Barnea-Goraly et al., 2004, Keller et al., 2007, Ke et al., 2009, Lee et al., 2007, Minshew and Williams, 2007). Finally, our lab has shown that training that is focused on improving multisensory temporal acuity results in changes in activation and connectivity in a network centered on the pSTS (Powers et al., 2012). Collectively, these studies suggest that changes in pSTS in individuals with autism may represent the neural bases for altered multisensory temporal function, and may be a key node in networks responsible for the changes in social and communicative function.

Although autism represents the clinical condition in which multisensory function has been best characterized, evidence suggests that multisensory deficits, and specifically those in the temporal domain, are not unique to autism. Both sensory and multisensory changes have been found to accompany dyslexia, a reading disability in which affected individuals have profound reading difficulties in the background of normal or even above-normal intelligence. As with autism, numerous neurobiological theories have been espoused for dyslexia, with most being centered on the substantial differences in phonological processing seen in these individuals. Although many of these theories are centered on changes in brain structures responsible for the processing of phonology and phonological relations (e.g., see Peterson and Pennington, 2012, Ramus, 2003, Ramus et al., 2003, Shaywitz and Shaywitz, 2005, Shaywitz and Shaywitz, 2008), others have suggested that these phonological deficits may be a result of processing difficulties at earlier stages. One of the most well-developed of these views centers on the magnocellular layers of the lateral geniculate nucleus of the thalamus (Livingstone et al., 1991, Stein, 2001). In this view, selective deficits in the magnocellular visual stream, which subserves the processing of motion, play a key role in dyslexia. Supporting evidence for this theory comes from reports of abnormal eye movements in dyslexia, and from altered activation patterns in areas of the cerebral cortex specialized for processing stimulus motion (Eden et al., 1996).

The evidence for changes in both visual and auditory function in dyslexia suggests that it may be fruitful to consider the disorder in a more pansensory or multisensory framework. Indeed, some of the original clinical descriptions of dyslexia from the neurologist Samuel Orton are rife with multisensory references (Henry, 1998), and to date the most widely adopted intervention approach, the Orton–Gillingham method, is founded on multisensory principles (Oakland et al., 1998). In addition, several early studies of reading-disabled and reading-delayed individuals found changes in cross-modal (visual–auditory) temporal function, consistent with a multisensory contribution to reading dysfunction (Birch and Belmont, 1964, Muehl and Kremenak, 1966). In order to attribute a specific multisensory contribution to the disorder, however, it is first necessary to show that the nature of the multisensory changes cannot be ascribed simply to changes in unisensory function. Stated a bit differently, it would not be terribly surprising (or interesting) to see multisensory changes accompanying changes in visual (and/or auditory) function. Of interest is whether these changes go beyond those that can be predicted based on unisensory differences.

In an effort to examine specific multisensory alterations in dyslexia, we adopted a multisensory version of the familiar and frequently employed visual temporal order judgment (TOJ) task. Prior work in typical subjects had found that the introduction of a pair of task-irrelevant sounds during performance of the visual TOJ task could improve performance, most notably when the second sound lagged the appearance of the second light (Morein-Zamir, Soto-Faraco, & Kingstone, 2003). Taking advantage of this task, we were able to show striking differences between dyslexic and typical readers – specifically in the time window within which the second auditory stimulus could enhance visual performance (Fig. 6) (Hairston et al., 2005). Dyslexic readers received benefits from this sound over intervals more than twice as long as typical readers, suggesting that they are “binding” visual and auditory stimuli over unusually long periods of time. We speculate that such an extended TBW will present substantial difficulties for the construction of strong reading representations, in that it will present greater ambiguity as to which auditory elements of the written word (i.e., phonemes) belong with which visual elements (i.e., graphemes). In support of this, EEG studies have shown that as readers progress to fluency, letters and speech-sounds are combined early and automatically in the auditory association cortex, and that this processing is strongly dependent on the relative timing of the paired stimuli (Froyen et al., 2008, Froyen et al., 2009). Furthermore, it was found that for dyslexic readers, this progression to automaticity failed to take place (Froyen, Willems, & Blomert, 2011).

Additional evidence that sits outside of the domain of temporal function has been gathered in support of multisensory alterations in dyslexia. For example, deficits in spatial attention to both visual and auditory stimuli have been linked to phonological skills in dyslexia (Facoetti et al., 2010). In addition, Blau et al. (2009) have shown using fMRI that dyslexic readers underactivate regions of the superior temporal cortex when binding the auditory and visual components of a speech signal. As highlighted earlier, the cortex surrounding the pSTS is a critical node for the convergence of auditory and visual information, and appears to play a key role in the temporal binding of these signals. Indeed, the pSTS and its associated gyrus (the superior temporal gyrus) have been implicated as key regions of difference between typical and dyslexic readers (e.g., see Blau et al., 2010, Maisog et al., 2008, Richlan et al., 2013, Rimrodt et al., 2009, Steinbrink et al., 2008). Overall, these studies point to an important role for multisensory function in dyslexia, but much more work needs to be done to better understand how these changes ultimately result in poor reading performance (Wallace, 2009).

Schizophrenia is a complex psychiatric disorder best characterized by changes in thought and emotional reactivity. Frequently accompanying schizophrenia are delusions and hallucinations, with the latter prompting research into the nature of sensory (i.e., auditory) processing differences and their contributions to the cognitive changes seen in schizophrenia (Behrendt and Young, 2004, Behrendt, 2006, Brenner et al., 2009, Carvill, 2001, Freedman et al., 2003, Hughes et al., 2013, Javitt, 2009). Although these studies have indeed highlighted changes in auditory and visual processes and cortical organization in schizophrenia, no clear picture as to how sensory dysfunction contributes to the overall schizophrenia phenotype has emerged. Nonetheless, as for autism and dyslexia, the presence of these sensory changes across multiple modalities begs for an examination of multisensory function.

Clinical reports have long suggested changes in multisensory function in schizophrenia, most notably seen in the ability to match stimuli across the different sensory modalities (i.e., cross-modal matching, see Maurage and Campanella (2013)). More empirically directed work subsequently found there to be deficits in the integration of audiovisual stimuli in a schizophrenia cohort, and that this deficit appeared to be restricted to speech-related audiovisual stimuli and was amplified under noisy conditions (de Gelder et al., 2005, de Jong et al., 2009, Pearl et al., 2009, Ross et al., 2007, Szycik et al., 2009). A subset of these studies also revealed differences in multisensory performance specifically when the tasks indexed the emotional valence of the auditory (voice) and visual (face) stimuli. Other work has suggested the presence of deficits in lower-level multisensory integration, specifically in demonstrating reduced facilitation of reaction times on a visual–auditory target detection task (Williams et al., 2010). In a recent EEG study comparing those with schizophrenia and controls, it was found that the neural signatures associated with typical audiovisual integration were absent or compromised in the schizophrenic patients (Stekelenburg et al., 2013).

One crucial issue as it relates to the establishment of specific multisensory deficits in schizophrenia (or in any other clinical condition) is to show that changes in performance and/or perception are either unique to the multisensory conditions, or cannot be predicted based on the changes seen in unisensory function. For this reason, it is essential that measures of multisensory function are contrasted against unisensory measures. Although such unisensory-multisensory contrasts are becoming increasingly common, some of the earlier studies failed to test for unisensory changes, making it difficult to interpret the differences in multisensory performance.

Numerous prior studies have suggested that in addition to sensory-based problems, individuals with schizophrenia have alterations in temporal perception (Carroll et al., 2008, Hughes et al., 2013, Martin et al., 2013, Shin et al., 2010). Indeed, prior work has merged these areas of inquiry, and has shown changes in both unisensory and multisensory temporal perception in schizophrenia, which manifest as a lessened acuity in judging the simultaneity between visual, auditory and combined visual–auditory stimulus pairs (Foucher et al., 2007, Martin et al., 2013). In an effort to follow up on this work with an emphasis on the TBW and on the specificity of these effects for multisensory integration, we have recently embarked on a study designed to detail the nature of these changes and their relationships to the constellation of clinical symptoms. Although preliminary, this work is suggestive of changes in multisensory temporal function that we believe may be important factors in the schizophrenia phenotype.

As alluded to in the prior section, ongoing work in our laboratory has focused on using approaches grounded in perceptual plasticity to train sensory and multisensory systems. In addition to its application for those wearing cochlear implants, we believe that such methods also hold promise for clinical conditions such as autism and schizophrenia, most notably in their possible utility for improving sensory and multisensory temporal acuity. As highlighted in an earlier section, we have successfully trained individuals to narrow the width of their TBW (Powers et al., 2009, Stevenson et al., 2013), with these changes accompanied by changes in a brain network centered on the pSTS (Fig. 7) (Powers et al., 2012). Most encouraging in these normative studies was the finding that those who benefited the most from training (i.e., showed the largest changes in the size of their TBW) were those for whom the TBW was the largest prior to training (Powers et al., 2009, Stevenson et al., 2013). Hence, our findings of enlarged multisensory TBW in autism, dyslexia and schizophrenia suggest that these individuals may be highly susceptible to perceptual training methods.

In preliminary work in autism, we have shown this to be the case, with several days of training resulting in a significant narrowing of the TBW. Although very exciting, this work needs to be extended to show that this training results in changes beyond the trained task and domain. We are encouraged by our results in our typical cohort, which have shown that training using low-level stimuli (i.e., flashes and beeps) on one task (i.e., simultaneity judgment) can result in changes in the TBW for the processing of higher-level (i.e., speech) stimuli in the context of a different task (i.e., perceptual fusions as indexed by the McGurk effect). The presence of such generalization is extraordinarily exciting, but must now be extended to see if these training regimens can engender meaningful change in measures of real world function – such as improvements in social skills and communication. Although still in their early stages, we feel that these perceptual plasticity-based approaches hold great promise as potential tools that can be incorporated into behaviorally-based remediation methods.

Concluding remarks

Sensory and multisensory dysfunction accompanies many developmental disabilities. Although widely acknowledged, the presence of these deficits is often overlooked from the perspective of how they can inform and contribute to the characteristics that are considered defining for the disorder. Using autism as an example, it is only with the recent update to the DSM-5 that sensory problems are considered a core feature of the disorder. Even with this important acknowledgment, little empirical

References (322)

  • N. Boddaert. Superior temporal sulcus anatomical abnormalities in childhood autism: a voxel-based morphometry MRI study. Neuroimage (2004)
  • A. Bonnel. Enhanced pure-tone pitch discrimination among persons with autism but not Asperger syndrome. Neuropsychologia (2010)
  • G.A. Calvert et al. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Current Biology (2000)
  • G.A. Calvert. Detection of audio-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect. Neuroimage (2001)
  • C. Cappe. Selective integration of auditory–visual looming cues by humans. Neuropsychologia (2009)
  • C.A. Carroll. Temporal processing dysfunction in schizophrenia. Brain and Cognition (2008)
  • Y. Chen. Enhanced local processing of dynamic visual information in autism: evidence from speed discrimination. Neuropsychologia (2012)
  • B. de Gelder. Multisensory integration of emotional faces and voices in schizophrenics. Schizophrenia Research (2005)
  • J.J. de Jong. Audiovisual emotion recognition in schizophrenia: reduced integration of facial and vocal affect. Schizophrenia Research (2009)
  • A. Diederich et al. Crossmodal interaction in speeded responses: time window of integration model. Progress in Brain Research (2009)
  • I. Dinstein. Unreliable evoked responses in autism. Neuron (2012)
  • O. Doehrmann et al. Semantics and the multisensory brain: how meaning modulates processes of audio-visual integration. Brain Research (2008)
  • J. Driver et al. Multisensory interplay reveals crossmodal influences on 'sensory-specific' brain regions, neural responses, and judgments. Neuron (2008)
  • J.H. Foss-Feig et al. Tactile responsiveness patterns and their association with core features in autism spectrum disorders. Research in Autism Spectrum Disorders (2012)
  • J.R. Foucher. Low time resolution in schizophrenia: lengthened windows of simultaneity for visual, auditory and bimodal stimuli. Schizophrenia Research (2007)
  • J.J. Foxe. Multisensory auditory–somatosensory interactions in early cortical processing revealed by high-density electrical mapping. Brain Research. Cognitive Brain Research (2000)
  • M.A. Frens et al. Visual–auditory interactions modulate saccade-related activity in monkey superior colliculus. Brain Research Bulletin (1998)
  • U. Frith et al. Autism: beyond theory of mind. Cognition (1994)
  • D. Froyen. Cross-modal enhancement of the MMN to speech-sounds indicates early and automatic integration of letters and speech-sounds. Neuroscience Letters (2008)
  • M.J. Gandal. Validating gamma oscillations and delayed auditory responses as translational biomarkers of autism. Biological Psychiatry (2010)
  • D.H. Geschwind et al. Autism spectrum disorders: developmental disconnection syndromes. Current Opinion in Neurobiology (2007)
  • D. Alais et al. Multisensory processing in review: from physiology to behaviour. Seeing and Perceiving (2010)
  • B.L. Allman et al. Multisensory processing in unimodal neurons: cross-modal subthreshold auditory effects in cat extrastriate visual cortex. Journal of Neurophysiology (2007)
  • R.A. Almeida. Visual search targeting either local or global perceptual processes differs as a function of autistic-like traits in the typically developing population. Journal of Autism and Developmental Disorders (2013)
  • J.C. Alvarado. Multisensory versus unisensory integration: contrasting modes in the superior colliculus. Journal of Neurophysiology (2007)
  • J.C. Alvarado. Multisensory integration in the superior colliculus requires synergy among corticocollicular inputs. Journal of Neuroscience (2009)
  • A. Amedi. Functional imaging of human crossmodal identification and object recognition. Experimental Brain Research (2005)
  • Diagnostic and Statistical Manual of Mental Disorders (DSM-5) (2013)
  • Prevalence of autism spectrum disorders—autism and developmental disabilities monitoring network, 14 sites, United States, 2008. Morbidity and Mortality Weekly Report. Surveillance Summaries (2012)
  • L.E. Bahrick. Intermodal perception and selective attention to intersensory redundancy: implications for typical social development and autism (2010)
  • L.E. Bahrick et al. Intersensory redundancy guides attentional selectivity and perceptual learning in infancy. Developmental Psychology (2000)
  • L.E. Bahrick et al. Multisensory processing in autism spectrum disorders: intersensory processing disturbance as a basis for atypical development
  • S. Baron-Cohen. The autism-spectrum quotient (AQ): evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. Journal of Autism and Developmental Disorders (2001)
  • M.S. Beauchamp et al. fMRI-guided transcranial magnetic stimulation reveals that the superior temporal sulcus is a cortical locus of the McGurk effect. Journal of Neuroscience (2010)
  • J.M. Bebko. Discrimination of temporal synchrony in intermodal events by children with autism and children with developmental disabilities without autism. Journal of Child Psychology and Psychiatry (2006)
  • J.M. Bebko et al. The McGurk effect in children with autism and Asperger syndrome. Autism Research (2013)
  • R.P. Behrendt. Dysregulation of thalamic sensory transmission in schizophrenia: neurochemical vulnerability to hallucinations. Journal of Psychopharmacology (2006)
  • R.P. Behrendt et al. Hallucinations in schizophrenia, sensory impairment, and brain disease: a unifying model. Behavioral and Brain Sciences (2004)
  • A.H. Bell. The influence of stimulus properties on multisensory processing in the awake primate superior colliculus. Canadian Journal of Experimental Psychology (2001)
  • P. Bertelson et al. Cross-modal bias and perceptual fusion with auditory–visual spatial discordance. Perception and Psychophysics (1981)