Book chapter
AN INTERVAL APPROACH FOR WEIGHTS INITIALIZATION OF FEEDFORWARD NEURAL NETWORKS
Date
2006
Registered in:
978-3-540-49058-6
978-3-540-49026-5
1040208
Institution
Abstract
This work addresses an important problem in Feedforward Neural Network (FNN) training, namely finding the pseudo-global minimum of the cost function while assuring good generalization properties for the trained architecture. First, pseudo-global optimization is achieved by a combined parametric updating algorithm supported by the transformation of the network parameters into interval numbers. This solves the network weight initialization problem by performing an exhaustive search for minima by means of Interval Arithmetic (IA). The global minimum is then obtained once the search has been restricted to the region of convergence (ROC). IA allows variables and parameters to be represented as closed, bounded sets, so a training procedure using interval weights can be carried out. The methodology developed is illustrated in the last section by the approximation of a known non-linear function.
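The abstract is compact, so the sketch below illustrates the general idea of an interval-based search over weight regions, assuming a one-hidden-unit network y = w2*tanh(w1*x) and a sum-of-squares loss. It is not the chapter's algorithm: the Interval class, the loss_box enclosure, the interval_search branch-and-bound routine, and the chosen samples, tolerance, and initial weight box are all illustrative assumptions.

# Minimal sketch (assumed, not the chapter's method): interval arithmetic
# branch-and-bound over weight boxes for a tiny network y = w2*tanh(w1*x).
import math

class Interval:
    """Closed interval [lo, hi] with the basic IA operations."""
    def __init__(self, lo, hi=None):
        self.lo, self.hi = lo, (lo if hi is None else hi)

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def sq(self):
        # x^2 attains 0 if the interval straddles zero
        p = [self.lo ** 2, self.hi ** 2]
        lo = 0.0 if self.lo <= 0.0 <= self.hi else min(p)
        return Interval(lo, max(p))

    def tanh(self):
        # tanh is monotone, so the endpoint images enclose the range
        return Interval(math.tanh(self.lo), math.tanh(self.hi))

    def width(self):
        return self.hi - self.lo

    def split(self):
        m = 0.5 * (self.lo + self.hi)
        return Interval(self.lo, m), Interval(m, self.hi)


def loss_box(w_box, samples):
    """Interval enclosure of the SSE loss of y = w2*tanh(w1*x) on a weight box."""
    w1, w2 = w_box
    total = Interval(0.0)
    for x, t in samples:
        err = w2 * (w1 * Interval(x)).tanh() - Interval(t)
        total = total + err.sq()
    return total


def interval_search(samples, init_box, tol=0.1):
    """Branch-and-bound: discard weight boxes whose loss lower bound exceeds
    the best upper bound found so far, bisect the remaining ones."""
    boxes, best_ub, best_box = [init_box], float("inf"), init_box
    while boxes:
        box = boxes.pop()
        enc = loss_box(box, samples)
        if enc.lo > best_ub:
            continue  # this box cannot contain the minimum
        if enc.hi < best_ub:
            best_ub, best_box = enc.hi, box  # improved upper bound on the loss
        if max(b.width() for b in box) > tol:
            widest = max(range(len(box)), key=lambda i: box[i].width())
            for half in box[widest].split():
                child = list(box)
                child[widest] = half
                boxes.append(tuple(child))
    return best_box, best_ub


if __name__ == "__main__":
    # Target data: y = 0.8 * tanh(1.5 * x) sampled on [-2, 2]
    samples = [(x / 4.0, 0.8 * math.tanh(1.5 * x / 4.0)) for x in range(-8, 9)]
    box, ub = interval_search(samples, (Interval(-3.0, 3.0), Interval(-3.0, 3.0)))
    print("weight box:", [(round(b.lo, 3), round(b.hi, 3)) for b in box],
          "loss <=", round(ub, 5))

The pruning step relies only on the basic IA property that the interval enclosure of the loss over a weight box bounds every loss value attainable inside that box; boxes whose lower bound exceeds the best upper bound found so far can therefore be discarded, and the surviving boxes narrow the search toward the region of convergence.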