dc.contributor: Universidade Federal de São Carlos (UFSCar)
dc.contributor: Universidade Estadual Paulista (Unesp)
dc.date.accessioned: 2018-11-26T17:54:34Z
dc.date.available: 2018-11-26T17:54:34Z
dc.date.created: 2018-11-26T17:54:34Z
dc.date.issued: 2018-08-01
dc.identifier: Neural Processing Letters. Dordrecht: Springer, v. 48, n. 1, p. 95-107, 2018.
dc.identifier: 1370-4621
dc.identifier: http://hdl.handle.net/11449/164443
dc.identifier: 10.1007/s11063-017-9707-2
dc.identifier: WOS:000439352200005
dc.identifier: WOS000439352200005.pdf
dc.description.abstract: Deep learning techniques have been paramount in recent years, mainly due to their outstanding results in a number of applications ranging from speech recognition to face-based user identification. Among the techniques employed for such purposes, Deep Boltzmann Machines (DBMs), which are composed of layers of Restricted Boltzmann Machines stacked on top of each other, are among the most widely used. In this work, we evaluate the concept of temperature in DBMs, which plays a key role in Boltzmann-related distributions but has never been considered in this context to date. The main contribution of this paper is therefore to take this information into account, as well as the impact of replacing the standard Sigmoid function by another one, and to evaluate their influence on DBMs in the task of binary image reconstruction. We expect this work to foster future research on the usage of different temperatures during learning in DBMs.
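For context, a minimal sketch of how a temperature parameter T typically enters the hidden-unit activation of a Restricted Boltzmann Machine (the building block of a DBM), assuming the standard energy-based formulation; the exact parameterization used in the article may differ:

P(h_j = 1 \mid \mathbf{v}; T) = \sigma\!\left(\frac{b_j + \sum_i W_{ij} v_i}{T}\right), \qquad \sigma(x) = \frac{1}{1 + e^{-x}}

With T = 1 the standard Sigmoid activation is recovered; larger T flattens the activation and smooths the underlying Boltzmann distribution.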
dc.language: eng
dc.publisher: Springer
dc.relation: Neural Processing Letters
dc.relation: 0,510
dc.rights: Open access
dc.source: Web of Science
dc.subject: Deep Learning
dc.subject: Deep Boltzmann Machines
dc.subject: Machine learning
dc.title: Temperature-Based Deep Boltzmann Machines
dc.type: Journal articles

