Minimal Realization Problems for Hidden Markov Models

Bibliographic Details
Main Authors: Ge, Rong; Kakade, Sham; Huang, Qingqing; Dahleh, Munther A.
Other Authors: Massachusetts Institute of Technology. Institute for Data, Systems, and Society
Format: Article
Language: en_US
Published: Institute of Electrical and Electronics Engineers (IEEE) 2017
Online Access: http://hdl.handle.net/1721.1/110794
https://orcid.org/0000-0002-9113-7269
https://orcid.org/0000-0002-1470-2148
Description
Summary: This paper addresses two fundamental problems in the context of hidden Markov models (HMMs). The first concerns the characterization and computation of a minimal-order HMM that realizes the exact joint densities of an output process from only finite strings of such densities (known as the HMM partial realization problem). The second concerns learning an HMM from finite output observations of a stochastic process. We review and connect two fields of study: realization theory of HMMs and recent developments in spectral methods for learning latent variable models. Our main results focus on generic situations, namely, statements that hold for almost all HMMs, excluding a measure-zero set in the parameter space. In the main theorem, we show that both the minimal quasi-HMM realization and the minimal HMM realization can be efficiently computed from the joint probabilities of length-N strings, for N on the order of O(log_d(k)). In other words, learning a quasi-HMM and learning an HMM have comparable complexity for almost all HMMs.
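
As a point of reference for the quantities mentioned in the summary, the following is a minimal illustrative sketch (not taken from the paper) of how the joint probability of a length-N output string is computed from a known HMM via the standard forward recursion. The parameter names pi, T, and O and the helper string_probability are assumptions introduced here for illustration; k denotes the number of hidden states and d the output alphabet size, matching the O(log_d(k)) bound above.

    import numpy as np

    def string_probability(pi, T, O, y):
        """Joint probability P(y_1, ..., y_N) of an output string under an HMM.

        pi : (k,)   initial state distribution
        T  : (k, k) row-stochastic transition matrix, T[i, j] = P(x_{t+1} = j | x_t = i)
        O  : (k, d) emission matrix, O[i, s] = P(output s | state i)
        y  : sequence of output symbols (integers in range(d))

        Illustrative notation only; not the paper's notation.
        """
        alpha = pi * O[:, y[0]]                  # forward vector after the first symbol
        for symbol in y[1:]:
            alpha = (alpha @ T) * O[:, symbol]   # propagate one step, then emit
        return alpha.sum()

    # Example: a small HMM with k = 2 hidden states and d = 2 output symbols.
    pi = np.array([0.6, 0.4])
    T = np.array([[0.7, 0.3],
                  [0.2, 0.8]])
    O = np.array([[0.9, 0.1],
                  [0.3, 0.7]])
    print(string_probability(pi, T, O, [0, 1, 0]))

The realization results in the paper work in the other direction: they take such joint probabilities over short strings as the given data and recover a (quasi-)HMM of minimal order that reproduces them.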