Presenting target and non-target information in different modalities influences target localization if the non-target is within the spatiotemporal limits of perceptual integration. When using auditory and visual stimuli, the influence of a visual non-target on auditory target localization is greater than the reverse. It is not known, however, whether or how such perceptual effects extend to goal-directed behaviours. To gain insight into how audio-visual stimuli are integrated for motor tasks, the kinematics of reaching movements towards visual or auditory targets with or without a non-target in the other modality were examined. When present, the simultaneously presented non-target could be spatially coincident with, to the left of, or to the right of the target. Results revealed that auditory non-targets did not influence reaching trajectories towards a visual target, whereas visual non-targets influenced trajectories towards an auditory target. Interestingly, the biases induced by visual non-targets were present early in the trajectory and persisted until movement end. Subsequent experimentation indicated that the magnitude of the biases was equivalent whether participants performed a perceptual or motor task, whereas variability was greater for the motor than for the perceptual tasks. We propose that visually induced trajectory biases were driven by the perceived mislocation of the auditory target, which in turn affected both the movement plan and subsequent control of the movement. Such findings provide further evidence of the dominant role visual information processing plays in encoding spatial locations as well as in planning and executing reaching actions, even when reaching towards auditory targets.
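The visual dominance described above is commonly explained by reliability-weighted cue combination: when two modalities provide spatial estimates, the fused estimate is pulled towards the less variable cue, and visual localization is typically far less variable than auditory localization. The sketch below is purely illustrative (the variance values are hypothetical, not taken from this study) and shows how a precise visual cue biases the combined location estimate away from the true auditory target position.

```python
def fuse_estimates(x_vis, var_vis, x_aud, var_aud):
    """Reliability-weighted (maximum-likelihood) fusion of two
    spatial estimates. Each cue is weighted by its inverse variance,
    so the more reliable cue dominates the combined estimate."""
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_aud)
    return w_vis * x_vis + (1.0 - w_vis) * x_aud

# Hypothetical example: auditory target at 0 cm, visual non-target
# 10 cm to the right; vision is assumed 9x more reliable.
perceived = fuse_estimates(x_vis=10.0, var_vis=1.0,
                           x_aud=0.0, var_aud=9.0)
print(perceived)  # estimate is pulled strongly towards the visual cue
```

If the reliabilities were reversed (a precise auditory cue, a noisy visual one), the same computation would predict the opposite bias, which is why the asymmetry reported in the abstract is usually attributed to the higher spatial precision of vision rather than to vision per se.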
The processing of visual and auditory information for reaching movements
Cheryl M. Glazebrook
Timothy N. Welsh
Springer Berlin Heidelberg