A fast and simple algorithm for training neural probabilistic language models
In spite of their superior performance, neural probabilistic language models (NPLMs) remain far less widely used than n-gram models due to their notoriously long training times, which are measured in weeks even for moderately-sized datasets. Training NPLMs is computationally expensive because they a...
| Main Authors: | Mnih, A; Teh, Y |
|---|---|
| Format: | Journal article |
| Language: | English |
| Published: | 2012 |
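The abstract attributes the long training times of NPLMs to their computational cost per example. A minimal sketch of the usual source of that cost follows: an explicitly normalized model must score every word in the vocabulary to form the softmax normalizer, so each gradient step scales with the vocabulary size. This is an illustrative assumption-laden example, not the algorithm proposed in the article; the vocabulary size, dimensionality, and function name are placeholders.

```python
# Illustrative sketch only (not the paper's training algorithm): why an
# explicitly normalized NPLM output layer is expensive per example.
import numpy as np

vocab_size = 100_000   # assumed vocabulary size (placeholder)
embed_dim = 100        # assumed representation dimension (placeholder)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(vocab_size, embed_dim))  # output word vectors
b = np.zeros(vocab_size)                                  # output biases

def next_word_distribution(context_repr):
    """Explicitly normalized output: a softmax over the whole vocabulary.

    The normalizing constant requires scoring every vocabulary word, so each
    evaluation (and each gradient step) costs O(vocab_size * embed_dim).
    """
    scores = W @ context_repr + b          # one score per vocabulary word
    scores -= scores.max()                 # numerical stability
    exp_scores = np.exp(scores)
    return exp_scores / exp_scores.sum()   # normalization touches all words

# A single context representation (in a real NPLM this would come from the
# network's hidden layer applied to the context words).
h = rng.normal(size=embed_dim)
p = next_word_distribution(h)
print(p.shape, float(p.sum()))             # (100000,) 1.0
```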
Similar Items
- Language acquisition and probabilistic models: Keeping it simple
  by: Villavicencio, Aline, et al.
  Published: (2022)
- Language acquisition and probabilistic models: Keeping it simple
  Published: (2021)
- Effects of Fast Simple Numerical Calculation Training on Neural Systems
  by: Hikaru Takeuchi, et al.
  Published: (2016-01-01)
- The concrete distribution: A continuous relaxation of discrete random variables
  by: Maddison, C, et al.
  Published: (2017)
- Fast Training Algorithms for Feed Forward Neural Networks
  by: Luma N. M. Tawfiq, et al.
  Published: (2017-04-01)