Exponential language modeling using morphological features and multi-task learning

Bibliographic Details
Main Authors: Fang, H.; Ostendorf, M.; Baumann, P.; Pierrehumbert, J.
Format: Journal article
Published: Institute of Electrical and Electronics Engineers, 2015
Description
Summary: For languages with fast vocabulary growth and limited resources, data sparsity makes training a language model challenging. One strategy for addressing this problem is to leverage morphological structure as features in the model. This paper explores different uses of unsupervised morphological features in both the history and the prediction space for three word-based exponential models: maximum entropy, log-bilinear, and recurrent neural network (RNN). Multi-task training is introduced as a regularizing mechanism to improve performance in the continuous-space approaches. The models are compared to non-parametric baselines. Using the RNN with morphological features and multi-task learning, experiments on conversational speech from four languages show consistent perplexity reductions of 7-11% in a limited-resource scenario (10 hours of speech) and of 12-18% when the training set is increased to 80 hours. Compared to a modified Kneser-Ney baseline, results for the other approaches are mixed, but morphological features are useful in the continuous-space models relative to their word-only baselines, and multi-task learning improves both continuous-space models.
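
To make the multi-task idea concrete, below is a minimal sketch (not the authors' implementation) of an RNN language model that sums morpheme embeddings into the input representation (morphology in the history) and adds a secondary softmax over morphological classes (morphology in the prediction space) as an auxiliary task. The PyTorch framework, the vocabulary sizes, the one-morpheme-per-token toy batch, and the auxiliary weight aux_weight are all illustrative assumptions.

```python
# Minimal sketch, assuming PyTorch; sizes, names, and the morpheme
# segmentation are illustrative, not the paper's actual setup.
import torch
import torch.nn as nn

class MultiTaskMorphRNNLM(nn.Module):
    """RNN LM with morpheme-augmented input and an auxiliary
    morphological-class prediction head (multi-task regularization)."""

    def __init__(self, n_words, n_morphs, n_classes,
                 word_dim=128, morph_dim=32, hidden_dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, word_dim)
        # Sum the embeddings of each word's (unsupervised) morphemes.
        self.morph_emb = nn.EmbeddingBag(n_morphs, morph_dim, mode="sum")
        self.rnn = nn.RNN(word_dim + morph_dim, hidden_dim, batch_first=True)
        self.word_head = nn.Linear(hidden_dim, n_words)     # main task
        self.class_head = nn.Linear(hidden_dim, n_classes)  # auxiliary task

    def forward(self, words, morphs, morph_offsets):
        # words: (batch, seq) word ids; morphs/morph_offsets: flat morpheme
        # ids with one offset per token (batch * seq bags in total).
        b, t = words.shape
        w = self.word_emb(words)
        m = self.morph_emb(morphs, morph_offsets).view(b, t, -1)
        h, _ = self.rnn(torch.cat([w, m], dim=-1))
        return self.word_head(h), self.class_head(h)

def multitask_loss(word_logits, class_logits, next_words, next_classes,
                   aux_weight=0.5):
    # Joint objective: next-word cross-entropy plus a down-weighted
    # morphological-class cross-entropy that acts as a regularizer.
    ce = nn.functional.cross_entropy
    main = ce(word_logits.flatten(0, 1), next_words.flatten())
    aux = ce(class_logits.flatten(0, 1), next_classes.flatten())
    return main + aux_weight * aux

model = MultiTaskMorphRNNLM(n_words=10000, n_morphs=2000, n_classes=50)
words = torch.randint(0, 10000, (2, 3))   # toy batch of 2 x 3 tokens
morphs = torch.randint(0, 2000, (6,))     # one morpheme per token here
word_logits, class_logits = model(words, morphs, torch.arange(6))
```

The auxiliary head shares the recurrent parameters with the main next-word objective, which is where the regularizing effect described in the summary would come from; at test time only the word head is needed to compute perplexity.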