Regularization, robustness and sparsity of probabilistic topic models

Bibliographic Details
Main Authors: Konstantin Vyacheslavovich Vorontsov, Anna Alexandrovna Potapenko
Format: Article
Language: Russian
Published: Institute of Computer Science 2012-12-01
Series: Компьютерные исследования и моделирование (Computer Research and Modeling)
Online Access: http://crm.ics.org.ru/uploads/crmissues/crm_2012_4/12403.pdf
Description
Summary: We propose a generalized probabilistic topic model of text corpora that can incorporate heuristics of Bayesian regularization, sampling, frequent parameter updates, and robustness in any combination. Well-known models such as PLSA, LDA, CVB0, SWB, and many others can be viewed as special cases of the proposed broad family of models. We propose the robust PLSA model and show that it is sparser and performs better than regularized models such as LDA.
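For readers unfamiliar with the baseline the abstract builds on, the following is a minimal sketch of plain PLSA fitted by EM, written in NumPy. It illustrates only the standard algorithm, not the authors' generalized, regularized, or robust variants; the function name `plsa_em` and its parameters are ours, chosen for illustration.

```python
import numpy as np

def plsa_em(n_dw, n_topics, n_iter=50, seed=0):
    """Fit plain PLSA by EM (illustrative sketch only).

    n_dw: (D, W) document-term count matrix.
    Returns phi (W, T) = p(w|t) and theta (T, D) = p(t|d),
    each column-normalized to sum to 1.
    """
    rng = np.random.default_rng(seed)
    D, W = n_dw.shape
    phi = rng.random((W, n_topics))
    phi /= phi.sum(axis=0, keepdims=True)      # p(w|t)
    theta = rng.random((n_topics, D))
    theta /= theta.sum(axis=0, keepdims=True)  # p(t|d)
    for _ in range(n_iter):
        # E-step: topic posterior p(t|d,w) ∝ phi[w,t] * theta[t,d]
        p = phi[None, :, :] * theta.T[:, None, :]          # (D, W, T)
        p /= p.sum(axis=2, keepdims=True) + 1e-12
        # M-step: re-estimate phi and theta from expected counts
        n_wt = np.einsum('dw,dwt->wt', n_dw, p)            # (W, T)
        phi = n_wt / (n_wt.sum(axis=0, keepdims=True) + 1e-12)
        n_td = np.einsum('dw,dwt->td', n_dw, p)            # (T, D)
        theta = n_td / (n_td.sum(axis=0, keepdims=True) + 1e-12)
    return phi, theta
```

The regularized models the paper unifies (LDA, CVB0, and others) modify the M-step above, e.g. by adding Dirichlet pseudo-counts to `n_wt` and `n_td`; the robust PLSA variant additionally models background and noise components outside the topic mixture.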
ISSN: 2076-7633, 2077-6853