dc.creatorMartinez, JM
dc.date1998
dc.dateFEB
dc.date2014-12-02T16:28:53Z
dc.date2015-11-26T16:37:58Z
dc.date.accessioned2018-03-28T23:21:12Z
dc.date.available2018-03-28T23:21:12Z
dc.identifierJournal Of Optimization Theory And Applications. Plenum Publ Corp, v. 96, n. 2, p. 397-436, 1998.
dc.identifier0022-3239
dc.identifierWOS:000072566900008
dc.identifier10.1023/A:1022626332710
dc.identifierhttp://www.repositorio.unicamp.br/jspui/handle/REPOSIP/76833
dc.identifierhttp://www.repositorio.unicamp.br/handle/REPOSIP/76833
dc.identifier.urihttp://repositorioslatinoamericanos.uchile.cl/handle/2250/1272169
dc.descriptionThe family of feasible methods for minimization with nonlinear constraints includes the nonlinear projected gradient method, the generalized reduced gradient method (GRG), and many variants of the sequential gradient restoration algorithm (SGRA). Generally speaking, an iteration of any of these methods proceeds in two phases. In the restoration phase, feasibility is restored by solving an auxiliary nonlinear problem, generally a nonlinear system of equations. In the minimization phase, optimality is improved by considering the objective function, or its Lagrangian, on the subspace tangent to the constraints. In this paper, minimal assumptions on the restoration phase and the minimization phase are stated that ensure that the resulting algorithm is globally convergent. The key point is the possibility of comparing two successive nonfeasible iterates by means of a suitable merit function that combines feasibility and optimality. The merit function allows one to work with a high degree of infeasibility in the first iterations of the algorithm. Global convergence is proved, and a particular implementation of the model algorithm is described.
dc.description96
dc.description2
dc.description397
dc.description436
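The two-phase scheme described in the abstract can be sketched in code. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes equality constraints h(x) = 0, uses Gauss-Newton steps for the restoration phase, a projected-gradient step for the minimization phase, and an exact-penalty-style merit function f(x) + rho*||h(x)|| as one common way to combine feasibility and optimality (the paper's merit function and assumptions differ). All names (`two_phase_step`, `rho`, `step`) are hypothetical.

```python
import numpy as np

def two_phase_step(x, f, grad_f, h, jac_h, rho=10.0, step=0.1, restore_iters=20):
    # Restoration phase: approximately solve the auxiliary nonlinear
    # system h(x) = 0 with Gauss-Newton iterations to restore feasibility.
    for _ in range(restore_iters):
        hx = h(x)
        if np.linalg.norm(hx) < 1e-10:
            break
        J = jac_h(x)
        x = x - J.T @ np.linalg.solve(J @ J.T, hx)

    # Minimization phase: take a projected-gradient step on the subspace
    # {d : jac_h(x) d = 0} tangent to the constraints at the restored point.
    J = jac_h(x)
    g = grad_f(x)
    P = np.eye(len(x)) - J.T @ np.linalg.solve(J @ J.T, J)  # projector onto ker(J)
    x_new = x - step * (P @ g)

    # Merit function combining feasibility and optimality (assumed form):
    merit = f(x_new) + rho * np.linalg.norm(h(x_new))
    return x_new, merit

# Toy problem: minimize f(x) = x0^2 + x1^2 subject to x0 + x1 = 1.
f = lambda x: x[0]**2 + x[1]**2
grad_f = lambda x: 2 * x
h = lambda x: np.array([x[0] + x[1] - 1.0])
jac_h = lambda x: np.array([[1.0, 1.0]])

x = np.array([2.0, -3.0])  # highly infeasible starting point
for _ in range(100):
    x, m = two_phase_step(x, f, grad_f, h, jac_h)
print(np.round(x, 3))      # approaches the solution (0.5, 0.5)
```

Note how the merit function lets the method start from an infeasible point: early iterates are compared through f + rho*||h||, so feasibility need not hold exactly until convergence.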
dc.languageen
dc.publisherPlenum Publ Corp
dc.publisherNew York
dc.publisherEUA
dc.relationJournal Of Optimization Theory And Applications
dc.relationJ. Optim. Theory Appl.
dc.rightsclosed
dc.sourceWeb of Science
dc.subjectnonlinear programming
dc.subjecttrust regions
dc.subjectGRG methods
dc.subjectSGRA methods
dc.subjectprojected gradient methods
dc.subjectSQP methods
dc.subjectglobal convergence
dc.subjectTrust-region Algorithm
dc.subjectConstrained Optimization
dc.subjectMinimization
dc.subjectSGRA
dc.titleTwo-phase model algorithm with global convergence for nonlinear programming
dc.typeJournal articles