dc.creator: Ling, Yongfa
dc.creator: Guan, Wenbo
dc.creator: Ruan, Qiang
dc.creator: Song, Heping
dc.creator: Lai, Yuping
dc.date.accessioned: 2022-10-24T11:57:46Z
dc.date.accessioned: 2023-03-07T19:39:12Z
dc.date.available: 2022-10-24T11:57:46Z
dc.date.available: 2023-03-07T19:39:12Z
dc.date.created: 2022-10-24T11:57:46Z
dc.identifier: 1989-1660
dc.identifier: https://reunir.unir.net/handle/123456789/13710
dc.identifier: https://doi.org/10.9781/ijimai.2022.08.006
dc.identifier.uri: https://repositorioslatinoamericanos.uchile.cl/handle/2250/5907966
dc.description.abstract: The finite inverted Beta-Liouville mixture model (IBLMM) has recently gained attention due to its positive data modeling capability. Under the conventional variational inference (VI) framework, an analytically tractable solution to the optimization of the variational posterior distribution cannot be obtained, since the variational objective function involves the evaluation of intractable moments. With the recently proposed extended variational inference (EVI) framework, a new function is proposed to replace the original variational objective function, avoiding the intractable moment computation, so that an analytically tractable solution for the IBLMM can be derived in an effective way. The good performance of the proposed approach is demonstrated by experiments on both synthesized data and a real-world application, namely text categorization.
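Note: the abstract summarizes the EVI substitution only at a high level. The following is a minimal sketch of that principle under standard notation (X for the observed data, \Theta for the model parameters, q for the variational posterior); the specific auxiliary bound \tilde{p} used for the IBLMM is paper-specific and not reproduced here. Under conventional VI, the evidence lower bound (ELBO) to be maximized is
\[
\mathcal{L}(q) = \mathbb{E}_q\!\left[\ln p(\mathbf{X}\mid\Theta)\right] + \mathbb{E}_q\!\left[\ln p(\Theta)\right] - \mathbb{E}_q\!\left[\ln q(\Theta)\right],
\]
whose first term involves moments that are intractable for the IBLMM. EVI replaces the likelihood with an auxiliary function \tilde{p} satisfying p(\mathbf{X}\mid\Theta) \ge \tilde{p}(\mathbf{X}\mid\Theta), yielding
\[
\tilde{\mathcal{L}}(q) = \mathbb{E}_q\!\left[\ln \tilde{p}(\mathbf{X}\mid\Theta)\right] + \mathbb{E}_q\!\left[\ln p(\Theta)\right] - \mathbb{E}_q\!\left[\ln q(\Theta)\right] \le \mathcal{L}(q) \le \ln p(\mathbf{X}),
\]
so maximizing \tilde{\mathcal{L}} still maximizes a lower bound on the model evidence while admitting closed-form updates of the variational posterior.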
dc.language: eng
dc.publisher: International Journal of Interactive Multimedia and Artificial Intelligence (IJIMAI)
dc.relation: https://www.ijimai.org/journal/bibcite/reference/3157
dc.rights: openAccess
dc.subject: Bayesian inference
dc.subject: extended variational inference
dc.subject: mixture model
dc.subject: text categorization
dc.subject: inverted Beta-Liouville distribution
dc.subject: IJIMAI
dc.title: Variational Learning for the Inverted Beta-Liouville Mixture Model and Its Application to Text Categorization
dc.type: article