A fast and simple algorithm for training neural probabilistic language models
In spite of their superior performance, neural probabilistic language models (NPLMs) remain far less widely used than n-gram models due to their notoriously long training times, which are measured in weeks even for moderately-sized datasets. Training NPLMs is computationally expensive because they a...
Main Authors: | , |
Format: | Journal article |
Language: | English |
Published: | 2012 |
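The abstract notes that training NPLMs is notoriously expensive. A common way to see why (an assumption here, since the abstract is truncated) is that a softmax output layer must be explicitly normalized over the entire vocabulary, so computing the probability of even a single target word costs time proportional to the vocabulary size. A minimal sketch of that cost, using a toy vocabulary and made-up scores:

```python
# Hedged sketch: per-example cost of an explicitly normalized output layer.
# The function name, scores, and vocabulary are illustrative, not from the paper.
import math

def softmax_prob(scores, target_idx):
    """Probability of one target word under a full softmax.

    Cost is O(|V|): the normalizer sums an exponentiated score for EVERY
    word in the vocabulary, even though only one probability is needed.
    """
    m = max(scores)                           # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]  # one term per vocabulary word
    return exps[target_idx] / sum(exps)

# Toy "vocabulary" of 5 words; in a real NPLM these scores come from the network
# and |V| is typically tens of thousands, which is what makes training slow.
scores = [2.0, 1.0, 0.5, 0.1, -1.0]
p = softmax_prob(scores, 0)
```

With a realistic vocabulary of, say, 50,000 words, this sum is repeated for every training example, which is the bottleneck that fast training algorithms for NPLMs aim to remove.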