dc.creatorRubiolo, Mariano
dc.date2012-08
dc.date2012
dc.date2021-08-30T15:29:38Z
dc.date.accessioned2023-07-15T03:01:13Z
dc.date.available2023-07-15T03:01:13Z
dc.identifierhttp://sedici.unlp.edu.ar/handle/10915/123741
dc.identifierhttps://41jaiio.sadio.org.ar/sites/default/files/14_ASAI_2012.pdf
dc.identifierissn:1850-2784
dc.identifier.urihttps://repositorioslatinoamericanos.uchile.cl/handle/2250/7464142
dc.descriptionWhen large models are used for a classification task, model compression becomes necessary to satisfy transmission, storage, time, or computing constraints. Multilayer Perceptron (MLP) models are traditionally used as classifiers, but depending on the problem they may need a large number of parameters (neuron functions, weights, and biases) to achieve acceptable performance. This work extends the evaluation of a technique for compressing an array of MLPs through the outputs of a Volterra-Neural Network (Volterra-NN) while maintaining its classification performance. The results show that these outputs can be used to build an array of Volterra-NNs that requires significantly fewer parameters than the original array of MLPs while matching its high accuracy in most cases. The compression capabilities of the Volterra-NN have been tested on several kinds of classification problems. Experimental results are presented on three well-known databases: Letter Recognition, Pen-Based Recognition of Handwritten Digits, and Face Recognition.
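The abstract's central claim is a parameter-count reduction: a low-order Volterra expansion can stand in for an MLP classifier. The following Python sketch is an illustration of that idea only, not the paper's Volterra-NN construction; the dimensions d and H, the one-hidden-layer MLP architecture, and the symmetric second-order kernel are all assumptions made here for the comparison.

    # Hypothetical sketch (not the paper's implementation): a degree-2
    # Volterra expansion as a compact per-class response, plus a rough
    # parameter count against an MLP. d and H below are assumptions.
    import numpy as np

    def volterra2_output(x, w0, w1, W2):
        # Zeroth-, first-, and second-order Volterra terms:
        # y = w0 + w1 . x + x' W2 x
        return w0 + w1 @ x + x @ W2 @ x

    d, H = 16, 64  # illustrative input dimension and MLP hidden-layer size
    rng = np.random.default_rng(0)
    x = rng.standard_normal(d)
    y = volterra2_output(x, 0.1, rng.standard_normal(d),
                         rng.standard_normal((d, d)))

    # Parameters per output class: MLP (one hidden layer, one output unit)
    # versus a degree-2 Volterra model with a symmetric second-order kernel.
    mlp_params = d * H + H + H + 1
    volterra_params = 1 + d + d * (d + 1) // 2
    print(y, mlp_params, volterra_params)  # here: 1153 vs. 153 parameters

Under these assumed sizes the Volterra model needs roughly an order of magnitude fewer parameters per class, which is the kind of saving the abstract reports; the actual ratio depends on the problem and on how the Volterra kernels are obtained.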
dc.descriptionSociedad Argentina de Informática e Investigación Operativa
dc.formatapplication/pdf
dc.format152-164
dc.languageen
dc.rightshttp://creativecommons.org/licenses/by-nc-sa/4.0/
dc.rightsCreative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)
dc.subjectComputer Science
dc.subjectModel compression
dc.subjectVolterra-Neural Network
dc.subjectExtended evaluation
dc.titleExtended evaluation of the Volterra-Neural Network for model compression
dc.typeConference object