A fast and simple algorithm for training neural probabilistic language models

In spite of their superior performance, neural probabilistic language models (NPLMs) remain far less widely used than n-gram models due to their notoriously long training times, which are measured in weeks even for moderately-sized datasets. Training NPLMs is computationally expensive because they are explicitly normalized, which leads to having to consider all words in the vocabulary when computing the log-likelihood gradients. ...
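To make the cost the abstract refers to concrete (a sketch based only on the abstract; the score function s_\theta(w, h) is a standard NPLM assumption, not notation from the paper): the model assigns each word w in context h a score and normalizes it with a softmax over the full vocabulary V, so every log-likelihood gradient involves a sum over all words:

    P_\theta(w \mid h) = \frac{\exp(s_\theta(w, h))}{\sum_{w' \in V} \exp(s_\theta(w', h))},
    \qquad
    \frac{\partial}{\partial \theta} \log P_\theta(w \mid h)
      = \frac{\partial s_\theta(w, h)}{\partial \theta}
      - \sum_{w' \in V} P_\theta(w' \mid h)\, \frac{\partial s_\theta(w', h)}{\partial \theta}.

The second term is the expensive one: each training step must evaluate and differentiate the scores of all |V| words, so the per-step cost scales with the vocabulary size, which is what makes exact maximum-likelihood training of NPLMs slow on realistic vocabularies.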

Detailed Description

Bibliographic Details
Main Authors: Mnih, A.; Teh, Y.
Format: Journal article
Language: English
Published: 2012