COMPARISON OF OPTIMIZATION ALGORITHMS OF CONNECTIONIST TEMPORAL CLASSIFIER FOR SPEECH RECOGNITION SYSTEM


Bibliographic Details
Main Authors: Yedilkhan Amirgaliyev, Kuanyshbay Kuanyshbay, Aisultan Shoiynbek
Format: Article
Language: English
Published: Lublin University of Technology, 2019-09-01
Series: Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska
Subjects:
Online Access: https://ph.pollub.pl/index.php/iapgos/article/view/234
Description
Summary: This paper evaluates and compares the performance of three well-known optimization algorithms (Adagrad, Adam, Momentum) for faster training of the neural network in the CTC algorithm for speech recognition. A recurrent neural network, specifically Long Short-Term Memory (LSTM), has been used for the CTC algorithm; LSTM is an effective and widely used model. The data have been downloaded from the VCTK corpus of the University of Edinburgh. The results of the optimization algorithms have been evaluated by the label error rate and the CTC loss.
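The summary above names the three optimizers being compared. As a minimal sketch (not the paper's code), the update rules of Momentum, Adagrad, and Adam can be illustrated on a toy quadratic loss f(w) = w²; all hyperparameter values here are illustrative assumptions, not values taken from the paper:

```python
import math

def grad(w):
    return 2.0 * w  # derivative of the toy loss f(w) = w**2

def run(update, steps=100, w0=5.0):
    """Apply one update rule for a fixed number of steps from the same start."""
    w, state = w0, {}
    for _ in range(steps):
        w = update(w, grad(w), state)
    return w

def momentum(w, g, state, lr=0.05, beta=0.9):
    # Classical momentum: accumulate a velocity, then step along it.
    state["v"] = beta * state.get("v", 0.0) + g
    return w - lr * state["v"]

def adagrad(w, g, state, lr=0.5, eps=1e-8):
    # Adagrad: effective learning rate shrinks with the accumulated
    # sum of squared gradients.
    state["G"] = state.get("G", 0.0) + g * g
    return w - lr * g / (math.sqrt(state["G"]) + eps)

def adam(w, g, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected estimates of the gradient's first and
    # second moments scale each step.
    state["t"] = state.get("t", 0) + 1
    state["m"] = b1 * state.get("m", 0.0) + (1 - b1) * g
    state["s"] = b2 * state.get("s", 0.0) + (1 - b2) * g * g
    m_hat = state["m"] / (1 - b1 ** state["t"])
    s_hat = state["s"] / (1 - b2 ** state["t"])
    return w - lr * m_hat / (math.sqrt(s_hat) + eps)

for name, rule in [("momentum", momentum), ("adagrad", adagrad), ("adam", adam)]:
    print(f"{name:8s} final |w| = {abs(run(rule)):.4f}")
```

In the paper the same three rules drive the weights of an LSTM trained with the CTC loss rather than a scalar quadratic, but the per-parameter update arithmetic is the same.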
ISSN: 2083-0157
2391-6761