dc.contributorUniversidade Estadual Paulista (Unesp)
dc.contributorMiddlesex Univ
dc.date.accessioned2018-11-26T17:55:04Z
dc.date.available2018-11-26T17:55:04Z
dc.date.created2018-11-26T17:55:04Z
dc.date.issued2018-09-01
dc.identifierSoft Computing. New York: Springer, v. 22, n. 18, p. 6147-6156, 2018.
dc.identifier1432-7643
dc.identifierhttp://hdl.handle.net/11449/164566
dc.identifier10.1007/s00500-017-2678-4
dc.identifierWOS:000442576400018
dc.identifierWOS000442576400018.pdf
dc.description.abstractDeep learning-based approaches have been paramount in recent years, mainly due to their outstanding results in several application domains, ranging from face and object recognition to handwritten digit identification. Convolutional neural networks (CNNs) have attracted considerable attention since they model the intrinsic and complex working mechanisms of the brain. However, one main shortcoming of such models concerns their overfitting problem, which prevents the network from predicting unseen data effectively. In this paper, we address this problem by properly selecting a regularization parameter known as dropout in the context of CNNs using meta-heuristic-driven techniques. As far as we know, this is the first attempt to tackle this issue using this methodology. Additionally, we also take into account a default dropout parameter and a dropout-less CNN for comparison purposes. The results revealed that optimizing dropout-based CNNs is worthwhile, mainly due to the ease of finding suitable dropout probability values without the need to set new parameters empirically.
dc.languageeng
dc.publisherSpringer
dc.relationSoft Computing
dc.relation0,593
dc.rightsOpen access
dc.sourceWeb of Science
dc.subjectConvolutional neural networks
dc.subjectDropout
dc.subjectMeta-heuristic optimization
dc.titleHandling dropout probability estimation in convolution neural networks using meta-heuristics
dc.typeJournal article