Summary: | Bayesian nonparametric models are theoretically well suited to streaming data because they can adapt model complexity to the observed data. However, very little work has addressed posterior inference in a streaming fashion, and most existing variational inference algorithms require a truncation level for the variational distribution that cannot vary with the data. In this paper, we focus on Dirichlet process mixture models and develop a corresponding variational continual learning approach, called memorized variational continual learning (MVCL), which maintains memorized sufficient statistics for previous tasks and can handle both posterior updates and incoming data in a continual learning setting. Furthermore, we extend MVCL to two cases of mixture models that can handle different data types. Experiments demonstrate the comparable inference capability of MVCL on both discrete and real-valued datasets while automatically inferring the number of mixture components.
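To make the memorized-sufficient-statistics idea concrete, below is a minimal Python sketch, assuming a Dirichlet process Gaussian mixture whose variational E-step supplies per-point responsibilities. The class name, the placeholder responsibilities, and the component-growth rule are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): accumulate per-component
# sufficient statistics (N_k, sum of x, sum of x x^T) across sequentially
# arriving task batches, so earlier data never needs to be revisited.
import numpy as np


class MemorizedSuffStats:
    """Memory of Gaussian-mixture sufficient statistics across tasks."""

    def __init__(self, dim):
        self.dim = dim
        self.counts = []   # N_k: effective count per component
        self.sums = []     # sum of responsibility-weighted x per component
        self.sqsums = []   # sum of responsibility-weighted x x^T per component

    def _grow_to(self, k):
        # Birth of new components: a DP mixture can add components as
        # data arrives, so the memory grows rather than being truncated.
        while len(self.counts) < k:
            self.counts.append(0.0)
            self.sums.append(np.zeros(self.dim))
            self.sqsums.append(np.zeros((self.dim, self.dim)))

    def update(self, X, resp):
        """X: (n, dim) batch; resp: (n, K) responsibilities from an E-step."""
        self._grow_to(resp.shape[1])
        for k in range(resp.shape[1]):
            r = resp[:, k]
            self.counts[k] += r.sum()
            self.sums[k] += r @ X
            self.sqsums[k] += (X * r[:, None]).T @ X

    def posterior_means(self):
        # First-moment estimate per component from the memorized statistics.
        return [s / max(n, 1e-12) for s, n in zip(self.sums, self.counts)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stats = MemorizedSuffStats(dim=2)
    for _ in range(3):  # three sequential "tasks"
        X = rng.normal(size=(100, 2))
        # Placeholder responsibilities; a real run would take these
        # from the variational posterior over component assignments.
        resp = rng.dirichlet(np.ones(4), size=100)
        stats.update(X, resp)
    print(stats.posterior_means())
```

Because the memory stores only per-component counts and moments, each new batch updates it in a single pass without storing or revisiting earlier data, which is the property that makes sufficient-statistics memories suitable for streaming posterior updates.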