Neuropsychologia

Volume 49, Issue 9, July 2011, Pages 2711-2717

250 ms to code for action affordance during observation of manipulable objects

https://doi.org/10.1016/j.neuropsychologia.2011.05.019

Abstract

It is well known that viewing graspable tools (but not other objects) activates motor-related brain regions, but the time course of affordance processing has remained relatively unexplored. In this study, EEG was continuously recorded from 128 scalp sites in 15 right-handed university students while they viewed 150 pictures of familiar non-tool objects and 150 pictures of manipulable tools, matched for size, luminance and perceptual familiarity. To select the 300 images for the study, a wider set of preliminary stimuli was screened for motoric content by 20 judges using a 3-point scale (0 = absent; 2 = strong); pictures that scored below 1.5 or above 0.6 were excluded from the tool and non-tool categories, respectively. Tools and non-tools were presented in random order, interspersed with 25 photos of live plants. Each picture was presented for 1000 ms, with an interstimulus interval ranging from 1500 to 1900 ms. The task consisted of responding to the photos of plants while ignoring the other stimuli. Both an anterior negativity (210–270 ms) and a centroparietal P300 (550–600 ms) were larger in response to tools than to non-tool objects, particularly in the left hemisphere. The swLORETA inverse solution identified the occipito-temporal cortex (BA19 and BA37) as the most significant source of activity (in the 210–270-ms time window) for both types of visual objects, and the left postcentral gyrus (BA3) and the left and right premotor cortex (BA6) as the most significant sources of activity for tools only. These data point to automatic access to the motoric properties of objects even when attention is devoted to a different stimulus category.
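As an informal illustration of the screening criterion described above, the following Python sketch applies the same exclusion thresholds to hypothetical judge ratings; the file names, data and function are illustrative assumptions, not the materials or software used in the study.

from statistics import mean

def screen_stimuli(ratings: dict[str, list[float]], category: str) -> list[str]:
    """Keep tools with mean motoric rating >= 1.5 and non-tools with mean <= 0.6."""
    kept = []
    for picture, scores in ratings.items():
        m = mean(scores)
        if category == "tool" and m >= 1.5:
            kept.append(picture)
        elif category == "non-tool" and m <= 0.6:
            kept.append(picture)
    return kept

# Hypothetical ratings from the judges (0 = no motoric content, 2 = strong):
tool_candidates = {"hammer.jpg": [2.0, 1.9, 1.8], "candle.jpg": [1.0, 0.9, 1.2]}
selected_tools = screen_stimuli(tool_candidates, "tool")   # -> ["hammer.jpg"]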

Highlights

► 300 objects were presented, matched for perceptual familiarity, size and luminance.
► The ventral stream (BA19 and 37) was similarly activated for all objects.
► Only tools activated the left postcentral gyrus (BA3) and the premotor cortex (BA6).
► Tool perception automatically activates motor affordance as early as 210 ms.

Introduction

Viewing graspable tools, but not other objects, activates motor-related regions in the cerebral cortex, including the posterior middle temporal gyrus, the ventral premotor area and the posterior parietal cortex (Chao & Martin, 2000). It is believed that the sight of an object automatically activates its motoric properties, including its affordance and the representation of the associated motor interaction (e.g., the manual grip needed to grasp a glass of a given size). This information is represented in the inferior parietal lobule and inferior premotor area (Jeannerod, Arbib, Rizzolatti, & Sakata, 1995). One of the first studies of the visual perception of manipulable objects (Grafton, Fadiga, Arbib, & Rizzolatti, 1997) compared silent naming with observation of tools and found that the left premotor cortex is activated during tool observation, possibly supporting the motor representation of hand and arm movements. However, this important study lacked the critical control of other object categories (e.g., non-tools). By and large, published neuroimaging studies report that manipulable tools activate anterior brain regions during perception to a greater extent than do non-usable objects. For example, a recent fMRI study (Creem-Regehr & Lee, 2005) demonstrated that viewing graspable tools, but not shapes, activated motor-related regions of the cortex (namely, the posterior middle temporal gyrus, ventral premotor area and posterior parietal cortex). The researchers concluded that the functional identity of graspable objects influences the extent to which they are associated with motor representations. Activation of the premotor cortex has been observed in response to 3D objects (real objects to be silently named and mentally grasped) with both positron emission tomography (PET) (Grafton et al., 1997) and fMRI (Creem-Regehr & Lee, 2005). In contrast, this response was not elicited in a PET study (Perani et al., 1995) in which 2D drawings of tools or animals were presented in a task requiring a same/different judgement. This difference among studies could be due to differences in stimuli or in task requirements. Similarly, some authors (Cardellicchio, Sinigaglia, & Costantini, 2011) have related activation of the motor cortex during visual perception of tools and objects to the actual possibility of reaching for the objects (i.e., to the spatial location of the objects and their motor affordances relative to the observer). In the Cardellicchio et al. study, the left primary motor cortex (M1) was stimulated with transcranial magnetic stimulation (TMS) to probe its excitability, and motor evoked potentials (MEPs) were recorded while participants observed graspable and non-graspable objects located within or outside their own reachable space. The researchers found larger MEPs only when graspable (but not non-graspable) objects lay within reachable space, compared with objects lying outside it.

The time course of affordance coding, and the latency stage at which the motor associations of an object are accessed during visual perception, have remained relatively unexplored. In a previous ERP study (Proverbio, Del Zotto, & Zani, 2007), we examined brain processing of black-and-white drawings depicting familiar animals compared with familiar objects. At early processing stages (120–180 ms), the right occipital–temporal cortex showed more activation in response to animals than to artifacts, as indexed by the posterior N1 response; in contrast, the frontal/central N1 (130–160 ms) showed the opposite pattern. In the subsequent processing stage (200–260 ms), the response was stronger to artifacts and usable items at anterior temporal sites. The effect of animal vs. artifact categorization emerged at ∼150 ms in the right occipital–temporal area, appearing as a stronger response of the ventral stream to animate, homomorphic entities with faces and legs. The larger frontal/central N1 and the subsequent temporal activation in response to inanimate objects might reflect the prevalence of a functional rather than perceptual representation of manipulable tools compared with animals; indeed, the objects were all man-made usable tools. Although it provided information regarding the time course of tool processing, this study lacked LORETA source reconstruction and therefore did not have the spatial resolution needed to determine whether motor-related areas are activated during tool perception.

This issue was investigated using a combined ERP/fMRI approach (Handy, Grafton, Shroff, Ketay, & Gazzaniga, 2003) in which the processing of tools and non-tools was compared while the subjects were engaged in a spatial attention task. However, the electrophysiological results were somewhat difficult to interpret. In that study, participants viewed two task-irrelevant objects to the left and right of the fixation point while waiting for a target to be presented in one of the two locations. The objects could be two tools, a tool and a non-tool, or two non-tools. ERP data showed that spatial attention was systematically drawn to the location occupied by a tool, but only in the right visual field (RVF). Furthermore, concurrent fMRI recording demonstrated that the premotor and parietal cortices were specifically activated by tools presented during tool-right (but not tool-left) trials. Given that the tools used were mostly long and narrow objects, the authors considered whether the RVF bias for visuo-motor processing resulted from cerebral asymmetries in spatial frequency processing, with the left hemisphere being dominant in processing high spatial frequencies (Proverbio, Zani, & Avella, 1997). To avoid this potential complication, in the present study we ensured that tool and non-tool objects were matched for spatial frequency distribution, luminance, size and perceptual familiarity.
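For illustration only, the following Python sketch shows one way the luminance and spatial-frequency content of two image sets could be quantified before matching; the functions and variable names are assumptions, and the original matching procedure is not described in code form in the paper.

import numpy as np

def mean_luminance(img: np.ndarray) -> float:
    """Average pixel intensity of a grayscale image (2-D array in [0, 1])."""
    return float(img.mean())

def radial_power_spectrum(img: np.ndarray, n_bins: int = 32) -> np.ndarray:
    """Radially averaged power spectrum: how much energy sits at each spatial frequency."""
    f = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(f) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h / 2, x - w / 2)
    bins = np.linspace(0, r.max(), n_bins + 1)
    which = np.digitize(r.ravel(), bins) - 1
    return np.array([power.ravel()[which == i].mean() for i in range(n_bins)])

# Comparing the two categories (e.g., lists of grayscale arrays 'tools' and 'non_tools')
# would then amount to contrasting their mean luminance values and mean radial spectra.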

In the present study, we wished to determine the time course of object visual processing and to establish at what latency stage action affordance is computed. This was accomplished by comparing neural processing of tools and non-manipulable objects. Because the stimuli were matched for their perceptual characteristics (i.e., size, luminance and visual familiarity) and thus differed only in their degree of motoric association, we hypothesized that any difference in time-locked bioelectrical activity between the two classes of objects would be ascribable to this difference in motoric association. Furthermore, since attention had to be paid to another stimulus category (target stimuli were live plants, while all other objects had to be ignored), we presumed that any index of action-related processing reflected automatic (task-irrelevant) access to the motoric properties of tools.
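For readers interested in the structure of the paradigm, the following Python sketch assembles a trial list with the stimulus counts and timing reported in the Abstract (1000-ms exposure, 1500–1900-ms interstimulus interval, responses only to plant targets). It is an illustrative reconstruction under those assumptions, not the software used in the experiment.

import random

def build_trial_list(seed: int = 0) -> list[dict]:
    rng = random.Random(seed)
    trials = ([{"category": "tool", "is_target": False}] * 150 +
              [{"category": "non-tool", "is_target": False}] * 150 +
              [{"category": "plant", "is_target": True}] * 25)
    trials = [dict(t) for t in trials]          # independent copies
    rng.shuffle(trials)                         # random stimulus order
    for t in trials:
        t["duration_ms"] = 1000                 # picture on screen for 1 s
        t["isi_ms"] = rng.randint(1500, 1900)   # variable blank interval
    return trials

trials = build_trial_list()
print(len(trials), trials[0])   # 325 trials; respond only when is_target is True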

Section snippets

Participants

Fifteen healthy right-handed university students (8 men and 7 women) participated in this study as unpaid volunteers; they earned academic credits for their participation. The mean age was 22.8 years (range: 20–27 years). All had normal or corrected-to-normal vision and reported no history of neurological illness or drug abuse. All participants were right-handed with right ocular dominance, as determined by the Italian version of the Edinburgh Handedness Inventory, a laterality preference

Anterior negativity (210–270 ms)

This component reached its maximum amplitude at the anterior frontal electrode sites (Fig. 2) and was larger in response to tools than to non-tool objects. This was particularly evident over the left hemisphere, as shown by the significant stimulus category × hemisphere interaction (F(1,13) = 5.59, p < 0.05) and confirmed by post hoc comparisons (RH: tools = −2.68 μV and non-tools = −2.1 μV; LH: tools = −2.79 μV and non-tools = −2.09 μV). To locate the possible neural source of the action affordance effect, two different
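As an illustrative note on the analysis logic, the Python sketch below computes mean amplitudes in the 210–270-ms window and tests the category × hemisphere interaction as a paired test on the difference of hemispheric differences (for a 2 × 2 repeated-measures design, F equals t squared). Array shapes, variable names and the use of SciPy are assumptions, not the authors' analysis pipeline.

import numpy as np
from scipy import stats

def mean_amplitude(erp: np.ndarray, times: np.ndarray, t_min: float, t_max: float) -> np.ndarray:
    """Average voltage per subject within [t_min, t_max] seconds; erp is (subjects, samples)."""
    mask = (times >= t_min) & (times <= t_max)
    return erp[:, mask].mean(axis=1)

# tools_lh, tools_rh, nontools_lh, nontools_rh: (n_subjects, n_samples) arrays of
# anterior-frontal ERPs (hypothetical inputs); times: (n_samples,) vector in seconds.
def interaction_test(tools_lh, tools_rh, nontools_lh, nontools_rh, times):
    win = lambda x: mean_amplitude(x, times, 0.210, 0.270)
    lh_effect = win(tools_lh) - win(nontools_lh)     # tool minus non-tool, left hemisphere
    rh_effect = win(tools_rh) - win(nontools_rh)     # tool minus non-tool, right hemisphere
    t, p = stats.ttest_rel(lh_effect, rh_effect)     # category x hemisphere interaction
    return t ** 2, p                                 # F(1, n-1) = t^2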

Conclusions

This study provides evidence that action affordance is coded within the first 250 ms of observation of manipulable objects, shown as an increased anterior negative potential elicited by tools vs. non-tool objects, whose intracranial neural generators were localized by swLORETA in the left somatosensory cortex (postcentral gyrus, BA3) and the left and right premotor areas (BA6), with a left hemispheric asymmetry. An additional source of activity related to object processing was shared by both

Conflict of interest statement

None declared.

Acknowledgements

This research was supported by 2010 FAR grants from the University of Milano-Bicocca to AMP. AR was supported in part by “Dote ricercatori”: FSE, Regione Lombardia. The authors are very grateful to Alberto Zani, Federica Riva, Mirella Manfredi and Nicola Crotti for their kind support.
