Memorized Variational Continual Learning for Dirichlet Process Mixtures

Bibliographic Details
Main Authors: Yang Yang, Bo Chen, Hongwei Liu
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8871157/
Description
Summary: Bayesian nonparametric models are theoretically suitable for streaming data because they can adapt model complexity to the observed data. However, very little work has addressed posterior inference in a streaming fashion, and most existing variational inference algorithms require truncation of the variational distributions, which cannot vary with the data. In this paper, we focus on Dirichlet process mixture models and develop a corresponding variational continual learning approach, called memorized variational continual learning (MVCL), which maintains memorized sufficient statistics for previous tasks and handles both posterior updates and new data in a continual learning setting. Furthermore, we extend MVCL to two cases of mixture models that can handle different data types. The experiments demonstrate the comparable inference capability of MVCL on both discrete and real-valued datasets, while the number of mixture components is inferred automatically.
ISSN: 2169-3536
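
Illustrative sketch
The core idea in the abstract, carrying memorized per-component sufficient statistics forward so that each new task updates the mixture posterior without revisiting earlier data, can be sketched in a few lines. The paper's actual variational updates are not reproduced here: the class below is a hypothetical stand-in that uses a CRP-style responsibility heuristic for a Gaussian Dirichlet process mixture with fixed observation variance, and all names (MemorizedDPMixture, alpha, var, prior_var) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    class MemorizedDPMixture:
        """Toy continual DP mixture that keeps only sufficient statistics."""

        def __init__(self, alpha=1.0, var=1.0, prior_var=100.0):
            self.alpha = alpha          # DP concentration parameter
            self.var = var              # fixed observation variance (assumption)
            self.prior_var = prior_var  # broad zero-mean prior for new components
            self.counts = []            # memorized N_k: summed responsibilities
            self.sums = []              # memorized s_k: responsibility-weighted sums

        def _means(self):
            return [s / max(n, 1e-12) for s, n in zip(self.sums, self.counts)]

        def update_task(self, X):
            """Absorb one task's data; earlier tasks' data is never revisited."""
            for x in X:
                d = x.shape[0]
                # log-score of each existing component under a Gaussian with
                # fixed variance, weighted by its memorized count
                logp = [np.log(n + 1e-12)
                        - 0.5 * d * np.log(2 * np.pi * self.var)
                        - 0.5 * np.sum((x - m) ** 2) / self.var
                        for n, m in zip(self.counts, self._means())]
                # CRP-style score for opening a brand-new component, using the
                # broad zero-mean predictive N(0, var + prior_var)
                v0 = self.var + self.prior_var
                logp.append(np.log(self.alpha)
                            - 0.5 * d * np.log(2 * np.pi * v0)
                            - 0.5 * np.sum(x ** 2) / v0)
                logp = np.array(logp)
                p = np.exp(logp - logp.max())
                p /= p.sum()
                if p[-1] > 0.5:   # birth move: instantiate the new component
                    self.counts.append(0.0)
                    self.sums.append(np.zeros_like(x, dtype=float))
                else:             # otherwise drop the new-component slot
                    p = p[:-1] / p[:-1].sum()
                # accumulate the memorized sufficient statistics
                for k, r in enumerate(p):
                    self.counts[k] += r
                    self.sums[k] += r * x

Fed a few well-separated synthetic tasks in sequence, the sketch grows roughly one component per task, mirroring how the number of mixture components is inferred from the data rather than fixed by truncation:

    rng = np.random.default_rng(0)
    model = MemorizedDPMixture(alpha=1.0, var=0.5)
    for task in range(3):
        X = rng.normal(loc=3.0 * task, scale=0.7, size=(200, 2))
        model.update_task(X)
    print(len(model.counts), "components inferred")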