On the Derivational Entropy of Left-to-Right Probabilistic Finite-State Automata and Hidden Markov Models
Probabilistic finite-state automata are a formalism widely used in many problems of automatic speech recognition and natural language processing. Probabilistic finite-state automata are closely related to other finite-state models such as weighted finite-state automata, word lattices, and hidden...
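The derivational entropy named in the title is the entropy of the distribution over an automaton's derivations. As a minimal illustrative sketch (not the paper's algorithm), it can be computed for a small left-to-right PFSA by the chain rule, state by state in reverse topological order; the three-state automaton below is a hypothetical example.

```python
import math

# Hypothetical left-to-right PFSA: transitions[q] = list of (next_state, prob).
# Arcs only go to equal- or higher-numbered states (self-loops allowed);
# state 2 is final, with no outgoing transitions.
transitions = {
    0: [(1, 0.6), (2, 0.4)],
    1: [(1, 0.5), (2, 0.5)],
    2: [],
}

def derivational_entropy(transitions):
    """Return H[q] in bits: entropy over all derivations starting at state q."""
    H = {}
    for q in sorted(transitions, reverse=True):  # reverse topological order
        arcs = transitions[q]
        if not arcs:
            H[q] = 0.0  # final state: a single empty derivation
            continue
        # Local entropy of the outgoing probability distribution at q.
        h = -sum(p * math.log2(p) for _, p in arcs)
        # Expected entropy contributed by strictly later states.
        rest = sum(p * H[r] for r, p in arcs if r != q)
        p_self = sum(p for r, p in arcs if r == q)
        # Chain rule with a self-loop: H[q] = h + p_self*H[q] + rest,
        # which solves to H[q] = (h + rest) / (1 - p_self).
        H[q] = (h + rest) / (1.0 - p_self)
    return H

H = derivational_entropy(transitions)
print(H)  # state 2: 0.0; state 1: 2.0; state 0 ≈ 2.1710 bits
```

State 1 behaves like a fair coin flipped until the first "stop", a geometric process whose entropy is exactly 2 bits, which the closed form recovers without enumerating the infinitely many derivations.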
Main Authors: Joan Andreu Sánchez, Martha Alicia Rocha, Verónica Romero, Mauricio Villegas
Format: Article
Language: English
Published: The MIT Press, 2017-12-01
Series: Computational Linguistics
Online Access: http://dx.doi.org/10.1162/coli_a_00306
Similar Items
- Probabilistic Independence Networks for Hidden Markov Probability Models
  by: Smyth, Padhraic, et al. Published: (2004)
- Succinctness of two-way probabilistic and quantum finite automata
  by: Abuzer Yakaryilmaz, et al. Published: (2010-01-01)
- Probabilistic Deterministic Finite Automata and Recurrent Networks, Revisited
  by: Sarah E. Marzen, et al. Published: (2022-01-01)
- Analyticity of entropy rates of continuous-state hidden Markov models
  by: Tadic, V, et al. Published: (2019)
- Revolutionizing load harmony in edge computing networks with probabilistic cellular automata and Markov decision processes
  by: Dinesh Sahu, et al. Published: (2025-01-01)