Probabilistic Independence Networks for Hidden Markov Probability Models
Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
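As a concrete illustration of the forward-backward recursion the abstract refers to, here is a minimal sketch for a discrete HMM. The parameters below are made up for the example; this is not code from the paper, only a standard rendering of the F-B algorithm it discusses.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior state marginals p(state_t | obs_1..T) for a discrete HMM.

    pi  : (S,)   initial state distribution
    A   : (S, S) transition matrix, A[i, j] = p(state j | state i)
    B   : (S, O) emission matrix,  B[i, k] = p(symbol k | state i)
    obs : (T,)   observed symbol indices
    """
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S))   # scaled forward messages
    beta = np.zeros((T, S))    # scaled backward messages
    scale = np.zeros(T)

    # Forward pass, normalizing each step for numerical stability.
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    # Backward pass, reusing the forward scaling factors.
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= scale[t + 1]

    # Combine and renormalize: each row is p(state_t | all observations).
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma

# Two-state toy chain with illustrative parameters.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
gamma = forward_backward(pi, A, B, np.array([0, 0, 1]))
```

In the paper's framing, these forward and backward messages are a special case of the message passing performed by general PIN inference on a chain-structured graph.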
Main Authors: | Smyth, Padhraic; Heckerman, David; Jordan, Michael
---|---
Language: | en_US
Published: | 2004
Subjects: | AI; MIT; Artificial Intelligence; graphical models; Hidden Markov models; HMMs; learning; probabilistic models; speech recognition; Bayesian networks; belief networks; Markov networks; probabilistic propagation; inference; coarticulation
Online Access: | http://hdl.handle.net/1721.1/7185
author | Smyth, Padhraic; Heckerman, David; Jordan, Michael |
collection | MIT |
description | Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach. |
id | mit-1721.1/7185 |
institution | Massachusetts Institute of Technology |
language | en_US |
publishDate | 2004 |
record_format | dspace |
spelling | mit-1721.1/7185 (2019-04-12T08:34:02Z). Probabilistic Independence Networks for Hidden Markov Probability Models. Smyth, Padhraic; Heckerman, David; Jordan, Michael. Accessioned: 2004-10-20T20:49:09Z. Issued: 1996-03-13. Report numbers: AIM-1565, CBCL-132. Extent: 31 p. Files: application/postscript (664995 bytes), application/pdf (687871 bytes). http://hdl.handle.net/1721.1/7185 en_US |
title | Probabilistic Independence Networks for Hidden Markov Probability Models |
topic | AI; MIT; Artificial Intelligence; graphical models; Hidden Markov models; HMMs; learning; probabilistic models; speech recognition; Bayesian networks; belief networks; Markov networks; probabilistic propagation; inference; coarticulation |
url | http://hdl.handle.net/1721.1/7185 |