dc.date.accessioned2016-12-27T21:50:50Z
dc.date.accessioned2018-06-13T23:06:17Z
dc.date.available2016-12-27T21:50:50Z
dc.date.available2018-06-13T23:06:17Z
dc.date.created2016-12-27T21:50:50Z
dc.date.issued2006
dc.identifier978-3-540-49058-6
dc.identifier978-3-540-49026-5
dc.identifierhttp://hdl.handle.net/10533/165687
dc.identifier1040208
dc.identifier.urihttp://repositorioslatinoamericanos.uchile.cl/handle/2250/1544489
dc.description.abstractThis work addresses an important problem in Feedforward Neural Network (FNN) training: finding the pseudo-global minimum of the cost function while ensuring good generalization properties for the trained architecture. First, pseudo-global optimization is achieved with a combined parametric updating algorithm supported by the transformation of network parameters into interval numbers. This solves the network weight initialization problem by performing an exhaustive search for minima by means of Interval Arithmetic (IA). The global minimum is then obtained once the search has been limited to the region of convergence (ROC). IA allows variables and parameters to be represented as compact closed sets, so a training procedure using interval weights can be carried out. The methodology is illustrated in the last section by approximating a known non-linear function.
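The abstract describes representing network weights as interval numbers and searching for minima via Interval Arithmetic. The sketch below is an illustrative toy, not code from the chapter: it implements minimal interval operations and shows how an interval weight bounds all outputs of a single linear neuron, and how bisection (the basis of a branch-and-bound search toward the ROC) would refine the weight box. All class and function names here are assumptions for illustration.

```python
# Minimal interval arithmetic (IA) sketch: weights as closed intervals [lo, hi],
# so evaluating an expression over intervals encloses every point value.
# Illustrative only; names are hypothetical, not from the chapter.

class Interval:
    """A compact closed interval [lo, hi]."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of intervals: add endpoints component-wise.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product interval is the hull of all endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def bisect(self):
        # Split at the midpoint; a branch-and-bound search would discard
        # sub-boxes whose cost bound excludes the minimum and keep refining.
        mid = 0.5 * (self.lo + self.hi)
        return Interval(self.lo, mid), Interval(mid, self.hi)

# Example: a single linear neuron y = w * x with an interval weight.
w = Interval(-1.0, 1.0)   # initial weight box
x = Interval(0.5, 0.5)    # fixed input as a degenerate interval
y = w * x                 # encloses every output for w in [-1, 1]
print(y.lo, y.hi)         # -> -0.5 0.5

left, right = w.bisect()  # two sub-boxes to evaluate and prune
print(left.lo, left.hi, right.lo, right.hi)
```

Under this scheme, an exhaustive search over weight boxes can certify that no minimum lies in a discarded box, which is what distinguishes the IA approach from point-wise random weight initialization.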
dc.languageeng
dc.publisherSPRINGER-VERLAG BERLIN
dc.relationLECTURE NOTES IN COMPUTER SCIENCE
dc.relationhttp://www.springer.com/us/book/9783540490265
dc.relation10.1007/11925231
dc.relationinfo:eu-repo/grantAgreement/Fondecyt/1040208
dc.relationinfo:eu-repo/semantics/dataset/hdl.handle.net/10533/93479
dc.relationinstname: Conicyt
dc.relationreponame: Repositorio Digital RI2.0
dc.rightsinfo:eu-repo/semantics/openAccess
dc.titleAN INTERVAL APPROACH FOR WEIGHTS INITIALIZATION OF FEEDFORWARD NEURAL NETWORKS
dc.typeBook chapter

