Elsevier

NeuroImage

Volume 195, 15 July 2019, Pages 232-242

Distinct visuo-motor brain dynamics for real-world objects versus planar images

https://doi.org/10.1016/j.neuroimage.2019.02.026

Abstract

Ultimately, we aim to generalize and translate scientific knowledge to the real world, yet current understanding of human visual perception is based predominantly on studies of two-dimensional (2-D) images. Recent cognitive-behavioral evidence shows that real objects are processed differently to images, although the neural processes that underlie these differences are unknown. Because real objects (unlike images) afford actions, they may trigger stronger or more prolonged activation in neural populations for visuo-motor action planning. Here, we recorded electroencephalography (EEG) when human observers viewed real-world three-dimensional (3-D) objects or closely matched 2-D images of the same items. Although responses to real objects and images were similar overall, there were critical differences. Compared to images, viewing real objects triggered stronger and more sustained event-related desynchronization (ERD) in the μ frequency band (8–13 Hz) – a neural signature of automatic motor preparation. Event-related potentials (ERPs) revealed a transient, early occipital negativity for real objects (versus images), likely reflecting 3-D stereoscopic differences, and a late sustained parietal amplitude modulation consistent with an ‘old-new’ memory advantage for real objects over images. Together, these findings demonstrate that real-world objects trigger stronger and more sustained action-related brain responses than images do. The results highlight important similarities and differences between brain responses to images and richer, more ecologically relevant, real-world objects.

Introduction

Current knowledge of the cognitive and neural basis of human visual perception has been established predominantly by studies that have used stimuli in the form of planar images. Although this approach has yielded important insights into image vision, the human brain presumably has evolved to allow us to perceive and interact with real objects in naturalistic environments (Gibson, 1979). Despite the fundamental differences between real objects and images, the overarching assumption in cognitive neuroscience research has been that images are equivalent to their real-world counterparts. This basic assumption is rarely recognized or acknowledged. For example, many studies report investigating real-world or graspable objects (Brady et al., 2008, 2016; Handy et al., 2003; Konkle et al., 2010; Konkle and Oliva, 2011, 2012; Lee et al., 2012; McNair et al., 2017; Nako et al., 2015; Khaligh-Razavi et al., 2018), yet the image representations of objects used in these studies are neither real nor able to offer genuine affordances.

Emerging evidence from cognitive psychology has begun to challenge the assumption of equivalence between real objects and images. Compared to 2-D images of objects, real-world objects elicit different gaze patterns in infants (Gerhard et al., 2016), facilitate object recognition (Chainay and Humphreys, 2001; Humphrey et al., 1994), enhance memory (Snow et al., 2014), increase attentional capture (Gomez et al., 2017), and bias valuation and decision-making (Romero et al., 2018). These unique effects of real objects on behavior are thought to be driven at the neural level by format-specific increases in the strength and/or duration of activation in visuo-motor networks involved in automatic planning of motor actions (Cisek and Kalaska, 2010; Gallivan et al., 2009; Gomez et al., 2017). However, no studies to date have tested this hypothesis. Although evidence from fMRI suggests that the format in which a stimulus is displayed influences neural responses across successive object presentations (Snow et al., 2011), this leaves open the critical question of whether, and how, real objects modulate cortical brain dynamics at the level of individual occurrences, independently of previous presentations (Cisek and Kalaska, 2010; Gallivan et al., 2009, 2011a; Gomez et al., 2017).

Unlike fMRI, in which blood oxygenation level dependent (BOLD) contrast detects vascular responses that lag the underlying neural events by seconds (Logothetis et al., 2001), electroencephalography (EEG) measures electrical changes at the surface of the scalp with millisecond precision and can therefore provide fine-grained information about the time-course of cortical dynamics (e.g., Makeig et al., 2002). The EEG signal can be decomposed to reveal frequency-specific changes associated with cognitive processes (Basar et al., 1999; Klimesch, 1999). One such process is the transformation of visual object information into action representations, which is reflected by desynchronization of the μ (‘mu’) rhythm (8–13 Hz) (Pineda, 2005). Desynchronization of cortical rhythms (including the α, μ and β rhythms) is a reliable correlate of activated cortical networks (Pfurtscheller, 2001) and is directly related to fMRI BOLD response amplitude (Laufs et al., 2003). The rolandic μ rhythm originates in primary sensorimotor and premotor cortex and is recorded over central electrodes (Pfurtscheller et al., 1997). Typically, μ desynchronization occurs during both preparation and execution of self-initiated hand actions, as well as when hand actions are visually observed or imagined (Hari, 2006; Muthukumaraswamy and Johnson, 2004; Muthukumaraswamy et al., 2004; Pfurtscheller et al., 1997; Pineda, 2005). Observation of images of manipulable objects, such as tools, also elicits desynchronization of the μ rhythm over sensorimotor networks (Proverbio, 2012; Suzuki et al., 2014).
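The event-related desynchronization (ERD) measure described above is conventionally computed by band-pass filtering each epoch to the frequency band of interest, taking instantaneous power, averaging across trials, and expressing power as percent change from a reference window. The following is a minimal single-channel sketch of that pipeline, assuming NumPy/SciPy and hypothetical epoch arrays; the study's actual preprocessing and analysis pipeline is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def mu_erd(epochs, fs, band=(8.0, 13.0), baseline=(0.0, 0.5)):
    """Event-related (de)synchronization time course, in percent.

    epochs   : (n_trials, n_samples) single-channel EEG, with the
               reference (baseline) window at the start of each epoch.
    fs       : sampling rate in Hz.
    band     : frequency band of interest (mu: 8-13 Hz).
    baseline : reference window in seconds from epoch start.

    Negative values indicate power below baseline (desynchronization).
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, epochs, axis=1)          # zero-phase band-pass
    power = np.abs(hilbert(filtered, axis=1)) ** 2     # instantaneous band power
    mean_power = power.mean(axis=0)                    # average across trials
    i0, i1 = int(baseline[0] * fs), int(baseline[1] * fs)
    ref = mean_power[i0:i1].mean()
    return 100.0 * (mean_power - ref) / ref
```

On synthetic data in which a 10 Hz oscillation halves in amplitude mid-epoch, this returns values near 0% in the baseline window and near −75% afterwards (power scales with amplitude squared).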

Here, we used EEG to contrast how cortical brain dynamics unfold when right-handed human observers view everyday real-world graspable objects versus 2-D images of the same items. Previous studies have shown that viewing images of graspable objects can automatically trigger motor preparation responses (Proverbio et al., 2011; Proverbio, 2012; Wamain et al., 2016). Given that real objects afford genuine motor actions, whereas images do not, we predicted that real objects would trigger stronger and more prolonged motor preparation signatures compared to 2-D images of the same items. We also predicted that the motor preparation signals for real objects would be stronger in the left hemisphere, contralateral to the dominant (right) hand. To pre-empt the results, we found that although there were similarities in overall neural responses to both stimulus formats, real objects elicited perceptual and neural responses that were distinct from those elicited by 2-D images. Specifically, real objects elicited a stronger and longer desynchronization in the μ frequency band, particularly over the left hemisphere. Real objects (versus 2-D images) also elicited differences in early and late event-related potential (ERP) amplitudes over occipital and parietal electrodes, corresponding to known signatures of stereoscopic disparity (Pegna et al., 2017) and memory (Donaldson and Rugg, 1999; Friedman and Johnson, 2000; Harris and Wilcox, 2009; Rugg and Curran, 2007; Rugg et al., 1998; Schendan and Kutas, 2003; Voss and Paller, 2008), respectively. Importantly, we show that the early difference in ERP amplitudes over occipital areas is dissociable from subsequent amplitude and frequency effects recorded over dorsal cortex. Together, our results confirm that real-world objects trigger neural signatures that are distinct from those of planar images.
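The ERP contrasts summarized above follow the standard logic: baseline-correct each epoch, average within condition, and compare the condition averages as a difference wave. Below is a minimal single-channel sketch with hypothetical epoch arrays; the paper's actual channel selection and statistics (e.g., the cluster-based permutation approach of Maris and Oostenveld) are not reproduced here.

```python
import numpy as np

def difference_wave(real_epochs, image_epochs, fs, baseline=(0.0, 0.2)):
    """Real-minus-image ERP difference wave for one channel.

    Each epochs array is (n_trials, n_samples); the mean of the
    baseline window (seconds from epoch start) is subtracted per
    trial before averaging across trials.
    """
    i0, i1 = int(baseline[0] * fs), int(baseline[1] * fs)

    def erp(epochs):
        corrected = epochs - epochs[:, i0:i1].mean(axis=1, keepdims=True)
        return corrected.mean(axis=0)  # average across trials

    return erp(real_epochs) - erp(image_epochs)
```

A negative deflection in this difference wave over occipital electrodes early in the epoch would correspond to the early occipital negativity for real objects reported in the Abstract.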

Section snippets

Participants

Twenty-four right-handed, healthy University of Nevada, Reno students (mean age ± SD: 25.7 ± 7.5 years; 10 males) volunteered for the experiment. All participants reported normal or corrected-to-normal vision and no history of neurological impairments, and gave both written and oral informed consent as required by the university Institutional Review Board.

Setup and stimuli

Stimuli consisted of 96 real-world objects and 96 2-D photographs of the same items, including 16 kitchen tools (e.g., knife) and 16 garage tools (e.g.,

Objects are rated as more effortful-to-use when viewed as real exemplars versus images

Effort ratings for each object were correlated across display formats. As is apparent from Fig. 1C (left), individual item ratings were evenly distributed from low-effort (e.g., fork, spoon, spatula) to high-effort (e.g., hammer, handsaw, clamp) objects. Moreover, there was a strong correlation between effort ratings for real objects and images (r = 0.985, p < .001; Fig. 1C, left), reflecting comparable task requirements between display formats. Surprisingly, however, the correlation coefficient was
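The item-wise correlation between display formats can be reproduced conceptually in a few lines. The sketch below uses hypothetical rating vectors (one mean effort rating per object, per format) and assumes SciPy; the values are illustrative, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-item mean effort ratings (one value per object),
# spanning low- to high-effort items on a 1-7 scale.
real_ratings = np.linspace(1.0, 7.0, 16)
# Image ratings assumed nearly identical, with small item-wise deviations.
image_ratings = real_ratings + 0.1 * np.sin(np.arange(16))

# Item-wise Pearson correlation between the two display formats.
r, p = pearsonr(real_ratings, image_ratings)
```

A coefficient near 1 with a very small p-value, as in the reported r = 0.985, indicates that the ordering of items by perceived effort is essentially preserved across display formats.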

Discussion

A fundamental assumption in psychology and cognitive neuroscience has been that images of objects, which do not afford action, are processed by the brain in the same way as real-world solid objects. However, there is accumulating behavioral evidence that humans process real-world objects differently to stimuli presented in other display formats, including both 2-D planar (Chainay and Humphreys, 2001; Gerhard et al., 2016; Gomez et al., 2017; Humphrey et al., 1994; Romero et al., 2018; Snow

Acknowledgements

This work was supported by grants from the National Science Foundation (grant number 1632849 to J.C.S.); the National Eye Institute of the National Institutes of Health (grant number R01EY026701 to J.C.S.); and the National Institute of General Medical Sciences of the National Institutes of Health (grant number P20 GM103650). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NSF or NIH. We would like to thank Shane Jacobs for

References (74)

  • T. Kasai et al.

    Event-related brain potentials during selective attention to depth and form in global stereopsis

    Vis. Res.

    (2001)
  • T. Kasai et al.

    Attending to a location in three-dimensional space modulates early ERPs

    Brain Res. Cogn. Brain Res.

    (2003)
  • W. Klimesch

    EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis

    Brain Res. Brain Res. Rev.

    (1999)
  • T. Konkle et al.

    A real-world size organization of object responses in occipitotemporal cortex

    Neuron

    (2012)
  • H. Laufs et al.

    EEG-correlated fMRI of human alpha activity

    Neuroimage

    (2003)
  • S.H. Lee et al.

    Disentangling visual imagery and perception of real-world objects

    Neuroimage

    (2012)
  • S. Makeig

    Auditory event-related dynamics of the EEG spectrum and effects of exposure to tones

    Electroencephalogr. Clin. Neurophysiol.

    (1993)
  • F. Marini et al.

    Dataset of 24-subject EEG recordings during viewing of real-world objects and planar images of the same items

    Data in Brief

    (2019)
  • E. Maris et al.

    Nonparametric statistical testing of EEG- and MEG-data

    J. Neurosci. Methods

    (2007)
  • S.D. Muthukumaraswamy et al.

    Primary motor cortex activation during action observation revealed by wavelet analysis of the EEG

    Clin. Neurophysiol.

    (2004)
  • S.D. Muthukumaraswamy et al.

    Mu rhythm modulation during observation of an object-directed grasp

    Brain Res. Cogn. Brain Res.

    (2004)
  • C. Neuper et al.

    Event-related dynamics of cortical rhythms: frequency-specific features and functional correlates

    Int. J. Psychophysiol.

    (2001)
  • A. Perry et al.

    Motor and attentional mechanisms involved in social interaction – evidence from mu and alpha EEG suppression

    Neuroimage

    (2011)
  • A. Perry et al.

    Mirror activity in the human brain while observing hand movements: a comparison between EEG desynchronization in the μ-range and previous fMRI results

    Brain Res.

    (2009)
  • G. Pfurtscheller

    Functional brain imaging based on ERD/ERS

    Vis. Res.

    (2001)
  • G. Pfurtscheller et al.

    Mu rhythm (de)synchronization and EEG single trial classification of different motor imagery tasks

    Neuroimage

    (2006)
  • G. Pfurtscheller et al.

    Foot and hand area mu rhythms

    Int. J. Psychophysiol.

    (1997)
  • J.A. Pineda

    The functional significance of mu rhythms: translating "seeing" and "hearing" into "doing"

    Brain Res. Brain Res. Rev.

    (2005)
  • A.M. Proverbio et al.

    250 ms to code for action affordance during observation of manipulable objects

    Neuropsychologia

    (2011)
  • A.M. Proverbio

    Tool perception suppresses 10–12 Hz μ rhythm of EEG over the somatosensory area

    Biol. Psychol.

    (2012)
  • C.A. Romero et al.

    The real deal: willingness-to-pay and satiety expectations are greater for real foods versus their images

    Cortex

    (2018)
  • M.D. Rugg et al.

    Event-related potentials and recognition memory

    Trends Cognit. Sci.

    (2007)
  • M. Suzuki et al.

    Temporal dynamics of neural activity underlying unconscious processing of manipulable objects

    Cortex

    (2014)
  • L. Turella et al.

    Beta band modulations underlie action representations for movement planning

    Neuroimage

    (2016)
  • J.L. Voss et al.

    Brain substrates of implicit and explicit memory: the importance of concurrently acquired neural signals of both memory types

    Neuropsychologia

    (2008)
  • Y. Wamain et al.

    EEG mu rhythm in virtual reality reveals that motor coding of visual objects in peripersonal space is task dependent

    Cortex

    (2016)
  • Y. Wamain et al.

    Conflict between gesture representations extinguishes μ rhythms desynchronization during manipulable object perception: an EEG study

    Biol. Psychol.

    (2018)