Understanding actions based on either language or action observation is presumed to involve the motor system, reflecting the engagement of an embodied conceptual network. We examined how linguistic and gestural information are integrated in a series of cross-domain priming studies. We varied the task demands across three experiments in which symbolic gestures served as primes for verbal targets. Primes were clips of symbolic gestures drawn from a rich set of emblems. Participants responded by making a lexical decision to the target (Experiment 1), naming the target (Experiment 2), or performing a semantic relatedness judgment (Experiment 3). The magnitude of semantic priming was larger in the relatedness judgment and lexical decision tasks than in the naming task. Priming was also observed in a control task in which the primes were pictures of landscapes paired with conceptually related verbal targets. For these stimuli, however, the amount of priming was similar across the three tasks. We propose that action observation triggers an automatic, pre-lexical spread of activation, consistent with the idea that language–gesture integration occurs in an obligatory and automatic fashion.
Beyond words: evidence for automatic language–gesture integration of symbolic gestures but not dynamic landscapes
Richard B. Ivry
Springer Berlin Heidelberg