dc.contributor: 0000-0002-7337-8974
dc.contributor: https://orcid.org/0000-0002-7337-8974
dc.contributor: https://orcid.org/0000-0002-8060-6170
dc.creator: Becerra, Aldonso
dc.creator: De la Rosa Vargas, José Ismael
dc.creator: González Ramírez, Efrén
dc.creator: Pedroza, David
dc.creator: Escalante, N. Iracemi
dc.date.accessioned: 2020-04-16T19:13:20Z
dc.date.available: 2020-04-16T19:13:20Z
dc.date.created: 2020-04-16T19:13:20Z
dc.date.issued: 2018-10
dc.identifier: 1380-7501
dc.identifier: 1573-7721
dc.identifier: http://ricaxcan.uaz.edu.mx/jspui/handle/20.500.11845/1714
dc.identifier: https://doi.org/10.48779/mz95-hr57
dc.description.abstract: The aim of this paper is to present two new variations of the frame-level cost function for training a deep neural network, in order to achieve better word error rates in speech recognition. Optimization methods and the loss functions they minimize are fundamental aspects of working with neural networks, so improving them is a salient research objective, and this paper addresses part of that problem. The first proposed framework is based on the concept of extropy, the complementary dual of an uncertainty measure. The conventional cross-entropy function can be mapped to a non-uniform loss function based on its corresponding extropy, emphasizing the frames whose assignment to specific senones is ambiguous. The second proposal fuses the mapped cross-entropy function with the idea of boosted cross-entropy, which emphasizes frames with a low target posterior probability.
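
The abstract describes re-weighting the per-frame cross-entropy so that ambiguous or low-posterior frames contribute more to training. The sketch below illustrates that general frame-weighting idea only; it is not the paper's extropy-based mapping or its boosted-cross-entropy fusion, and the weighting function, the boost exponent, and all names are assumptions chosen for illustration. (For reference, the extropy of a discrete distribution p is J(p) = -Σ_i (1 - p_i) log(1 - p_i), the complementary dual of Shannon entropy.)

    import numpy as np

    def frame_weighted_cross_entropy(posteriors, targets, boost=2.0):
        # posteriors: (T, S) array of per-frame softmax outputs over S senones
        # targets:    (T,) array of target senone indices
        # The weighting scheme is an illustrative assumption, not the paper's formulation.
        T = posteriors.shape[0]
        p_target = posteriors[np.arange(T), targets]      # posterior of the target senone per frame
        ce = -np.log(np.clip(p_target, 1e-12, None))      # standard per-frame cross-entropy
        weights = (1.0 - p_target) ** boost               # heavier weight on low-posterior (hard/ambiguous) frames
        return float(np.mean(weights * ce))

    # Usage with random posteriors over 4 hypothetical senones and 5 frames.
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(5, 4))
    posteriors = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    targets = rng.integers(0, 4, size=5)
    print(frame_weighted_cross_entropy(posteriors, targets))

The (1 - p_target)^boost factor is a focal-loss-style heuristic that down-weights frames the network already classifies confidently; the paper derives its non-uniform weighting from extropy rather than from this heuristic.
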
dc.language: eng
dc.publisher: Springer
dc.relation: generalPublic
dc.relation: https://doi.org/10.1007/s11042-018-5917-5
dc.rights: http://creativecommons.org/licenses/by-nc-nd/3.0/us/
dc.rights: Attribution-NonCommercial-NoDerivs 3.0 United States (CC BY-NC-ND 3.0 US)
dc.source: Multimedia Tools and Applications, Vol. 77, No. 20, pp. 27231-27267
dc.title: Training deep neural networks with non-uniform frame-level cost function for automatic speech recognition
dc.type: info:eu-repo/semantics/article

