dc.creator | Cerqueira, Jes J.F. | |
dc.creator | Palhares, Alvaro G.B. | |
dc.creator | Madrid, Marconi K. | |
dc.date | 2000 | |
dc.date | 2015-06-30T19:52:04Z | |
dc.date | 2015-11-26T14:47:31Z | |
dc.date.accessioned | 2018-03-28T21:57:59Z | |
dc.date.available | 2018-03-28T21:57:59Z | |
dc.identifier | Proceedings of the International Joint Conference on Neural Networks. IEEE, Piscataway, NJ, United States, v. 4, p. 517-522, 2000. | |
dc.identifier | http://www.scopus.com/inward/record.url?eid=2-s2.0-0033698502&partnerID=40&md5=001ff17d9791e2f571d3c5083ae8c7fc | |
dc.identifier | http://www.repositorio.unicamp.br/handle/REPOSIP/107392 | |
dc.identifier | http://repositorio.unicamp.br/jspui/handle/REPOSIP/107392 | |
dc.identifier | 2-s2.0-0033698502 | |
dc.identifier.uri | http://repositorioslatinoamericanos.uchile.cl/handle/2250/1253311 | |
dc.description | A convergence analysis for learning algorithms based on gradient optimization methods was carried out and applied to the back-propagation algorithm. Performed using Lyapunov's second method, the analysis supplies an upper bound for the learning rate of the back-propagation algorithm. This upper bound is useful for adjusting the parameters of the back-propagation algorithm, a problem otherwise solved by empirical methods. The solution presented is based on a well-known stability criterion for nonlinear systems. | |
dc.description | 4 | |
dc.description | 517 | |
dc.description | 522 | |
dc.language | en | |
dc.publisher | IEEE, Piscataway, NJ, United States | |
dc.relation | Proceedings of the International Joint Conference on Neural Networks | |
dc.rights | closed access | |
dc.source | Scopus | |
dc.title | Complement to the Back-propagation Algorithm: An Upper Bound for the Learning Rate | |
dc.type | Conference proceedings | |
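
Note: the abstract above concerns a Lyapunov-based upper bound on the back-propagation learning rate. As a rough illustration of the general idea only (not the bound derived in the cited article), the sketch below applies the classical Hessian-based criterion eta < 2 / lambda_max to batch gradient descent on a linear least-squares problem; the data, model, and criterion are illustrative assumptions.

```python
# Minimal sketch, assuming a linear least-squares "network" and the standard
# 2 / lambda_max learning-rate criterion; this is NOT the article's derivation.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

# The Hessian of the loss 0.5 * mean((Xw - y)^2) is (X^T X) / n; its largest
# eigenvalue fixes the admissible learning-rate range (0, 2 / lambda_max).
H = X.T @ X / n
lam_max = np.linalg.eigvalsh(H).max()
eta_upper = 2.0 / lam_max

def train(eta, steps=500):
    """Plain batch gradient descent; returns the final mean-squared error."""
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n   # gradient of 0.5 * mean((Xw - y)^2)
        w -= eta * grad
    return float(np.mean((X @ w - y) ** 2))

print(f"learning-rate upper bound: {eta_upper:.3f}")
print(f"final MSE at 0.9 * bound: {train(0.9 * eta_upper):.2e}")   # converges
print(f"final MSE at 1.1 * bound: {train(1.1 * eta_upper):.2e}")   # diverges
```

Running the sketch shows convergence when the learning rate sits just below the bound and divergence just above it, which is the qualitative behaviour the abstract's upper bound is meant to characterize.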