Two dissimilarity measures for HMMs and their application in phoneme model clustering
This paper introduces two approximations of the Kullback-Leibler divergence for hidden Markov models (HMMs). The first one is a generalization of an approximation originally presented for HMMs with discrete observation densities. In that case, the HMMs are assumed to be ergodic and the topologies si...
Main Authors: | , , , , |
---|---|
Format: | Conference item |
Published: | 2002 |
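The paper's two closed-form approximations are not reproduced in the truncated abstract above. For context only, the sketch below shows the standard Monte Carlo estimate of the per-symbol Kullback-Leibler divergence between two discrete-observation HMMs, i.e. the quantity that such approximations target; the HMM parameters and function names are hypothetical toy values, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hmm(pi, A, B, T, rng):
    """Sample a length-T observation sequence from a discrete-output HMM (pi, A, B)."""
    states = np.empty(T, dtype=int)
    obs = np.empty(T, dtype=int)
    states[0] = rng.choice(len(pi), p=pi)
    for t in range(1, T):
        states[t] = rng.choice(A.shape[1], p=A[states[t - 1]])
    for t in range(T):
        obs[t] = rng.choice(B.shape[1], p=B[states[t]])
    return obs

def log_likelihood(obs, pi, A, B):
    """log p(obs | lambda) via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

def kl_rate_monte_carlo(hmm1, hmm2, n_seq=200, T=100, rng=rng):
    """Estimate the per-symbol KL divergence D(lambda1 || lambda2) by sampling
    sequences from hmm1 and averaging the log-likelihood difference."""
    pi1, A1, B1 = hmm1
    pi2, A2, B2 = hmm2
    diffs = []
    for _ in range(n_seq):
        obs = sample_hmm(pi1, A1, B1, T, rng)
        diffs.append((log_likelihood(obs, pi1, A1, B1)
                      - log_likelihood(obs, pi2, A2, B2)) / T)
    return float(np.mean(diffs))

# Two toy 2-state HMMs over a 3-symbol alphabet (hypothetical parameters).
hmm1 = (np.array([0.6, 0.4]),
        np.array([[0.7, 0.3], [0.4, 0.6]]),
        np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]))
hmm2 = (np.array([0.5, 0.5]),
        np.array([[0.6, 0.4], [0.5, 0.5]]),
        np.array([[0.4, 0.4, 0.2], [0.2, 0.3, 0.5]]))

d12 = kl_rate_monte_carlo(hmm1, hmm2)
d21 = kl_rate_monte_carlo(hmm2, hmm1)
print("symmetrized dissimilarity:", 0.5 * (d12 + d21))
```

Because the KL divergence is asymmetric, clustering applications such as phoneme model grouping commonly symmetrize it, e.g. by averaging the two directions as in the last lines of the sketch.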