dc.contributor	Universidade Estadual Paulista (UNESP)
dc.contributor	São Carlos Federal University
dc.date.accessioned	2022-04-30T23:49:51Z
dc.date.accessioned	2022-12-20T03:35:15Z
dc.date.available	2022-04-30T23:49:51Z
dc.date.available	2022-12-20T03:35:15Z
dc.date.created	2022-04-30T23:49:51Z
dc.date.issued	2020-01-01
dc.identifier	Natural Computing Series, p. 67-96.
dc.identifier	1619-7127
dc.identifier	http://hdl.handle.net/11449/233002
dc.identifier	10.1007/978-981-15-3685-4_3
dc.identifier	2-s2.0-85086100220
dc.identifier.uri	https://repositorioslatinoamericanos.uchile.cl/handle/2250/5413101
dc.description.abstract	Machine learning techniques are capable of talking, interpreting, creating, and even reasoning about virtually any subject. Moreover, their learning power has grown substantially in recent years due to advances in hardware architecture. Nevertheless, most of these models still struggle in practical usage since they require a proper selection of hyper-parameters, which are often chosen empirically. Such requirements are even more pronounced for deep learning models, which commonly require a larger number of hyper-parameters. A collection of nature-inspired optimization techniques, known as meta-heuristics, arises as a straightforward solution to tackle such problems since they do not employ derivatives, thus alleviating their computational burden. Therefore, this work proposes a comparison among several meta-heuristic optimization techniques in the context of Deep Belief Network hyper-parameter fine-tuning. An experimental setup was conducted over three public datasets in the task of binary image reconstruction and demonstrated consistent results, posing meta-heuristic techniques as a suitable alternative for the problem.
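As a minimal sketch of the kind of derivative-free search the abstract refers to, the snippet below runs a simple particle swarm optimization over two Deep Belief Network hyper-parameters (learning rate and number of hidden units). PSO stands in here for the several meta-heuristics the chapter compares; the reconstruction_error objective, the bounds, and the PSO coefficients are placeholder assumptions for illustration, not values taken from the chapter. In the chapter's actual setting, the objective would be the DBN's binary image reconstruction error on a validation split.

import random

def reconstruction_error(learning_rate, n_hidden):
    # Placeholder objective: a real evaluation would train a DBN with these
    # hyper-parameters and return its reconstruction error; here a smooth
    # synthetic surface with a known minimum is used purely for demonstration.
    return (learning_rate - 0.05) ** 2 + ((n_hidden - 128) / 256.0) ** 2

BOUNDS = [(1e-4, 0.5), (16, 512)]    # assumed ranges: (learning rate, hidden units)
N_PARTICLES, N_ITER = 10, 30
W, C1, C2 = 0.7, 1.4, 1.4            # inertia and acceleration coefficients (assumed)

def clip(x, lo, hi):
    return max(lo, min(hi, x))

# Initialize particle positions and velocities uniformly within the bounds.
positions = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(N_PARTICLES)]
velocities = [[0.0, 0.0] for _ in range(N_PARTICLES)]
personal_best = [p[:] for p in positions]
personal_best_fit = [reconstruction_error(p[0], round(p[1])) for p in positions]
g = min(range(N_PARTICLES), key=lambda i: personal_best_fit[i])
global_best, global_best_fit = personal_best[g][:], personal_best_fit[g]

for _ in range(N_ITER):
    for i in range(N_PARTICLES):
        for d, (lo, hi) in enumerate(BOUNDS):
            r1, r2 = random.random(), random.random()
            # Standard PSO velocity update: inertia + cognitive + social terms.
            velocities[i][d] = (W * velocities[i][d]
                                + C1 * r1 * (personal_best[i][d] - positions[i][d])
                                + C2 * r2 * (global_best[d] - positions[i][d]))
            positions[i][d] = clip(positions[i][d] + velocities[i][d], lo, hi)
        fit = reconstruction_error(positions[i][0], round(positions[i][1]))
        if fit < personal_best_fit[i]:
            personal_best[i], personal_best_fit[i] = positions[i][:], fit
            if fit < global_best_fit:
                global_best, global_best_fit = positions[i][:], fit

print(f"best learning rate: {global_best[0]:.4f}, "
      f"best hidden units: {round(global_best[1])}, error: {global_best_fit:.6f}")

Because the hyper-parameter fitness only requires evaluating a trained model, no gradients of the objective are needed, which is the property the abstract highlights for meta-heuristic fine-tuning.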
dc.language	eng
dc.relation	Natural Computing Series
dc.source	Scopus
dc.title	On the Assessment of Nature-Inspired Meta-Heuristic Optimization Techniques to Fine-Tune Deep Belief Networks
dc.type	Book chapters