dc.creator  Faleiros, Thiago de Paulo
dc.creator  Lopes, Alneu de Andrade
dc.date.accessioned  2016-10-20T19:05:11Z
dc.date.accessioned  2018-07-04T17:12:00Z
dc.date.available  2016-10-20T19:05:11Z
dc.date.available  2018-07-04T17:12:00Z
dc.date.created  2016-10-20T19:05:11Z
dc.date.issued  2016-04
dc.identifier  European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, XXIV, 2016, Bruges.
dc.identifier  9782875870278
dc.identifier  http://www.producao.usp.br/handle/BDPI/51059
dc.identifier  https://www.elen.ucl.ac.be/Proceedings/esann/esannpdf/es2016-162.pdf
dc.identifier.uri  http://repositorioslatinoamericanos.uchile.cl/handle/2250/1646016
dc.description.abstract  LDA (Latent Dirichlet Allocation) and NMF (Non-negative Matrix Factorization) are two popular techniques for extracting topics from a textual document corpus. This paper shows that NMF with the Kullback-Leibler divergence approximates the LDA model under a uniform Dirichlet prior; the comparative analysis can therefore help elucidate the implementation of the variational inference algorithm for LDA.
dc.language  eng
dc.publisher  European Neural Network Society - ENNS
dc.publisher  Bruges
dc.relation  European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, XXIV
dc.rights  closedAccess
dc.title  On the equivalence between algorithms for non-negative matrix factorization and latent Dirichlet allocation
dc.type  Conference proceedings
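
The correspondence stated in the abstract can be exercised numerically. The sketch below (not part of the paper itself) fits NMF with the Kullback-Leibler loss and LDA with a uniform symmetric Dirichlet prior on the same document-term count matrix using scikit-learn; the toy corpus, number of topics, and solver settings are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal sketch: compare topics from KL-divergence NMF and from LDA with a
# uniform Dirichlet prior on the same counts. Corpus and parameters are
# illustrative assumptions only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF, LatentDirichletAllocation

corpus = [
    "topic models extract latent topics from text",
    "matrix factorization decomposes the document term matrix",
    "variational inference approximates the posterior in LDA",
    "non negative factors can be read as topics over words",
]

# Document-term count matrix V (documents x vocabulary).
vectorizer = CountVectorizer()
V = vectorizer.fit_transform(corpus)
vocab = vectorizer.get_feature_names_out()

# NMF minimizing the (generalized) Kullback-Leibler divergence D_KL(V || WH);
# scikit-learn requires the multiplicative-update solver for this loss.
nmf = NMF(n_components=2, beta_loss="kullback-leibler", solver="mu",
          max_iter=500, random_state=0)
W = nmf.fit_transform(V)   # document-topic weights
H = nmf.components_        # topic-word weights

# LDA with a uniform Dirichlet prior: symmetric priors with concentration 1.0
# on both the document-topic and topic-word distributions.
lda = LatentDirichletAllocation(n_components=2, doc_topic_prior=1.0,
                                topic_word_prior=1.0, max_iter=100,
                                random_state=0)
theta = lda.fit_transform(V)   # document-topic proportions
phi = lda.components_          # unnormalized topic-word distributions

def top_words(components, k=5):
    """Return the k highest-weighted vocabulary words for each topic row."""
    return [[vocab[i] for i in row.argsort()[::-1][:k]] for row in components]

print("KL-NMF topics:", top_words(H))
print("LDA topics:   ", top_words(phi))
```

On such small data the two factorizations typically surface similar word groupings, which is the practical face of the approximation the abstract describes; the paper's contribution is the analytical comparison of the underlying update equations.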

