dc.creator: Boari, Santiago
dc.creator: Sanz Perl, Yonatan
dc.creator: Amador, Ana
dc.creator: Margoliash, Daniel
dc.creator: Mindlin, Bernardo Gabriel
dc.date.accessioned: 2018-06-08T17:27:11Z
dc.date.accessioned: 2018-11-06T11:26:19Z
dc.date.available: 2018-06-08T17:27:11Z
dc.date.available: 2018-11-06T11:26:19Z
dc.date.created: 2018-06-08T17:27:11Z
dc.date.issued: 2015-09
dc.identifier: Boari, Santiago; Sanz Perl, Yonatan; Amador, Ana; Margoliash, Daniel; Mindlin, Bernardo Gabriel; Automatic reconstruction of physiological gestures used in a model of birdsong production; American Physiological Society; Journal of Neurophysiology; 114; 5; 9-2015; 2912-2922
dc.identifier: 0022-3077
dc.identifier: http://hdl.handle.net/11336/47891
dc.identifier: CONICET Digital
dc.identifier: CONICET
dc.identifier.uri: http://repositorioslatinoamericanos.uchile.cl/handle/2250/1851699
dc.description.abstract: Highly coordinated learned behaviors are key to understanding neural processes integrating the body and the environment. Birdsong production is a widely studied example of such behavior in which numerous thoracic muscles control respiratory inspiration and expiration: the muscles of the syrinx control syringeal membrane tension, while upper vocal tract morphology controls resonances that modulate the vocal system output. All these muscles have to be coordinated in precise sequences to generate the elaborate vocalizations that characterize an individual's song. Previously we used a low-dimensional description of the biomechanics of birdsong production to investigate the associated neural codes, an approach that complements traditional spectrographic analysis. The prior study used algorithmic yet manual procedures to model singing behavior. In the present work, we present an automatic procedure to extract low-dimensional motor gestures that could predict vocal behavior. We recorded zebra finch songs and generated synthetic copies automatically, using a biomechanical model for the vocal apparatus and vocal tract. This dynamical model described song as a sequence of physiological parameters the birds control during singing. To validate this procedure, we recorded electrophysiological activity of the telencephalic nucleus HVC. HVC neurons were highly selective to the auditory presentation of the bird's own song (BOS) and gave similar selective responses to the automatically generated synthetic model of song (AUTO). Our results demonstrate meaningful dimensionality reduction in terms of physiological parameters that individual birds could actually control. Furthermore, this methodology can be extended to other vocal systems to study fine motor control.
dc.language: eng
dc.publisher: American Physiological Society
dc.relation: info:eu-repo/semantics/altIdentifier/url/http://jn.physiology.org/content/114/5/2912
dc.relation: info:eu-repo/semantics/altIdentifier/doi/http://dx.doi.org/10.1152/jn.00385.2015
dc.rights: https://creativecommons.org/licenses/by-nc-sa/2.5/ar/
dc.rights: info:eu-repo/semantics/restrictedAccess
dc.subject: DYNAMICAL SYSTEMS
dc.subject: VOCAL LEARNING
dc.subject: BIRD'S OWN SONG
dc.subject: MODELING SOFTWARE
dc.title: Automatic reconstruction of physiological gestures used in a model of birdsong production
dc.type: Journal articles

