Article
Neural approach for solving several types of optimization problems
Date
2006-03-01
Registered in:
Journal of Optimization Theory and Applications. New York: Springer/Plenum Publishers, v. 128, n. 3, p. 563-580, 2006.
0022-3239
10.1007/s10957-006-9032-9
WOS:000241554100005
Author
Universidade Estadual Paulista (Unesp)
Universidade Estadual de Campinas (UNICAMP)
Fed Ctr Educ Technol
Abstract
Neural networks consist of highly interconnected and parallel nonlinear processing elements that are shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed explicitly using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the problem considered. The problems that can be treated by the proposed approach include combinatorial optimization problems, dynamic programming problems, and nonlinear optimization problems.
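To illustrate the kind of iteration the abstract describes, the sketch below shows a generic discrete-time modified Hopfield update that alternates a valid-subspace projection of the state, a gradient-like step from the objective term, and a piecewise-linear activation. This is a minimal sketch under assumed conventions: the matrices `T_val`, `s_val`, `T_conf`, `i_conf`, the update order, the step size, and the toy usage problem are illustrative placeholders, not the parameters or code from the paper.

```python
import numpy as np

def modified_hopfield_sketch(T_val, s_val, T_conf, i_conf, v0,
                             lr=0.01, n_iter=5000, tol=1e-8):
    """Illustrative modified-Hopfield-style iteration (not the paper's code).

    Each step: (1) confine the state to an assumed valid subspace,
    (2) take a gradient-like step from the objective ("confined") term,
    (3) clip components to [0, 1] with a piecewise-linear activation.
    """
    v = np.asarray(v0, dtype=float).copy()
    for _ in range(n_iter):
        v_prev = v
        # (1) project onto the valid subspace: v <- T_val @ v + s_val
        v = T_val @ v + s_val
        # (2) gradient-like update driven by the confined energy term
        v = v + lr * (T_conf @ v + i_conf)
        # (3) piecewise-linear activation keeps the state in the unit hypercube
        v = np.clip(v, 0.0, 1.0)
        # stop when the state has (approximately) reached an equilibrium point
        if np.linalg.norm(v - v_prev) < tol:
            break
    return v

# Toy usage (assumed example): minimize 0.5*||v - b||^2 subject to sum(v) = 1.
n = 4
e = np.ones(n)
T_val = np.eye(n) - np.outer(e, e) / n   # projection onto the zero-sum directions
s_val = e / n                            # offset enforcing sum(v) = 1
b = np.array([0.1, 0.4, 0.3, 0.2])
T_conf, i_conf = -np.eye(n), b           # gradient step: v <- v + lr*(b - v)
v_star = modified_hopfield_sketch(T_val, s_val, T_conf, i_conf, v0=np.full(n, 0.25))
```

In this toy setting the equilibrium point coincides with `b`, since `b` already satisfies the constraint; for the problem classes mentioned in the abstract, the constraint and objective terms would instead encode the combinatorial, dynamic programming, or nonlinear optimization formulation.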