dc.date.accessioned: 2019-08-23T17:01:42Z
dc.date.available: 2019-08-23T17:01:42Z
dc.date.created: 2019-08-23T17:01:42Z
dc.identifier: https://link.springer.com/article/10.1007/s00221-016-4747-9
dc.identifier: http://hdl.handle.net/10818/36815
dc.identifier: 10.1007/s00221-016-4747-9
dc.identifier.uri: http://repositorioslatinoamericanos.uchile.cl/handle/2250/3472541
dc.description.abstract: Research on the crossmodal correspondences has revealed that seemingly unrelated perceptual information can be matched across the senses in a manner that is consistent across individuals. An interesting extension of this line of research is to study how sensory information biases action. In the present study, we investigated whether different sounds (i.e. tones and piano chords) would bias participants’ hand movements in a free movement task. Right-handed participants were instructed to move a computer mouse in order to represent three tones and two chords. They also had to rate each sound in terms of three visual analogue scales (slow–fast, unpleasant–pleasant, and weak–strong). The results demonstrate that tones and chords influence hand movements, with higher-(lower-)pitched sounds giving rise to a significant bias towards upper (lower) locations in space. These results are discussed in terms of the literature on forward models, embodied cognition, crossmodal correspondences, and mental imagery. Potential applications in sports and rehabilitation are discussed briefly.
dc.language: eng
dc.publisher: Experimental Brain Research
dc.relation: Experimental Brain Research 2016, Volume 234, Issue 12, pp 3509–3522
dc.rights: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights: openAccess
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.subject: Sound
dc.subject: Space
dc.subject: Mouse-tracking
dc.subject: Movement
dc.subject: Valence
dc.subject: Embodied cognition
dc.subject: Crossmodal correspondences
dc.title: Drawing sounds: representing tones and chords spatially
dc.type: journal article

