Previous studies have related music-induced movement to musical characteristics at fairly general levels, such as tempo or pulse clarity. This study focused on the ability to synchronize to music with finely varying tempi and varying degrees of low-frequency spectral change (flux). Excerpts from six classic Motown/R&B songs at three different tempi (105, 115, and 130 BPM) served as stimuli. Each excerpt was additionally time-stretched by 5% relative to its original tempo, yielding a total of 12 stimuli that were presented to 30 participants. Participants were asked to move along with the stimuli while being recorded with an optical motion capture system. Synchronization was analyzed for four body parts relative to the beat and bar levels of the music. Results suggest that participants synchronized different body parts to specific metrical levels: in particular, vertical movements of the hip and feet were synchronized to the beat level when the music contained large amounts of low-frequency spectral flux and had a slower tempo, whereas synchronization of the head and hands was more tightly coupled to the weak-flux stimuli at the bar level. Synchronization was generally tighter for the slower versions of the same stimuli, while at the bar level it showed an inverted U-shaped effect as tempo increased. These results indicate complex relationships between musical characteristics, particularly metrical and temporal structure, and our ability to synchronize and entrain to such stimuli.
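The low-frequency spectral flux feature described above can be sketched as a half-wave-rectified, frame-to-frame difference of the magnitude spectrum, restricted to a low-frequency band. The NumPy version below is only an illustration: the frame size, hop size, and 200 Hz band edge are assumptions for demonstration, not the parameters or tooling used in the study.

```python
import numpy as np

def low_frequency_spectral_flux(signal, sr, n_fft=2048, hop=512, f_max=200.0):
    """Half-wave-rectified spectral flux, restricted to bins below f_max Hz.

    Returns one flux value per pair of consecutive analysis frames.
    """
    # Frame the signal and apply a Hann window.
    window = np.hanning(n_fft)
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    # Magnitude spectrum, keeping only the low-frequency bins.
    spectrum = np.abs(np.fft.rfft(frames, axis=1))
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / sr)
    low = spectrum[:, freqs <= f_max]
    # Flux: summed positive change between consecutive frames, so the
    # feature responds to onsets of low-frequency energy (e.g. kick, bass).
    diff = np.diff(low, axis=0)
    return np.sum(np.maximum(diff, 0.0), axis=1)
```

Applied to music, the resulting flux curve peaks where low-frequency instruments articulate the beat, which is why the feature is a plausible correlate of beat-level movement.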
Synchronization to metrical levels in music depends on low-frequency spectral components and tempo
Marc R. Thompson
Springer Berlin Heidelberg