Experimental Comparison of Transformers and Reformers for Text Classification
Main Authors: , ,
Format: Article
Language: English
Published: IFSA Publishing, S.L., 2021-02-01
Series: Sensors & Transducers
Subjects:
Online Access: https://sensorsportal.com/HTML/DIGEST/february_2021/Vol_249/P_3212.pdf
Summary: In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance and use attention scores to capture the relationships between words in a sentence; these scores can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering time and memory complexity. We present our evaluation and analysis of the architectures applicable for such improved performance. The experiments in this paper were run in Trax on Mind in a Box with four different datasets and different hyperparameter settings. We observe that Transformers achieve better performance than Reformers in terms of accuracy and training speed for text classification. However, Reformers allow the training of bigger models, which would otherwise cause memory failures with Transformers.
ISSN: 2306-8515, 1726-5479
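The summary refers to attention scores that capture relationships between words and can be computed in parallel on GPUs. Below is a minimal JAX sketch of the underlying scaled dot-product attention; the function name, shapes, and toy data are illustrative assumptions, not code from the paper.

```python
import jax
import jax.numpy as jnp

def dot_product_attention(q, k, v):
    """Scaled dot-product attention for a single head.

    q, k, v: arrays of shape [seq_len, d_head]. Every query position
    attends to every key position, so time and memory grow quadratically
    with seq_len -- the cost that Reformer targets.
    """
    d_head = q.shape[-1]
    scores = jnp.matmul(q, k.T) / jnp.sqrt(d_head)  # [seq_len, seq_len]
    weights = jax.nn.softmax(scores, axis=-1)       # attention scores
    return jnp.matmul(weights, v)                   # weighted sum of values

# Toy usage: 8 positions, 4-dimensional head.
kq, kk, kv = jax.random.split(jax.random.PRNGKey(0), 3)
q = jax.random.normal(kq, (8, 4))
k = jax.random.normal(kk, (8, 4))
v = jax.random.normal(kv, (8, 4))
out = dot_product_attention(q, k, v)  # shape (8, 4)
```

Because the `[seq_len, seq_len]` score matrix is a single matrix product, it parallelizes well on GPU clusters, which is the property the summary highlights.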
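The lower time and memory complexity attributed to Reformers comes mainly from locality-sensitive-hashing (LSH) attention and reversible residual layers (Kitaev et al., 2020). The sketch below shows only the bucketing step of LSH attention; the single hash round and the bucket count are simplifying assumptions, not the configuration used in the paper's experiments.

```python
import jax
import jax.numpy as jnp

def lsh_buckets(x, n_buckets, key):
    """Assign each position a bucket id via angular LSH (random rotation).

    x: [seq_len, d_head]. Vectors pointing in similar directions land in
    the same bucket with high probability, so attention can be restricted
    to positions sharing a bucket instead of all seq_len positions.
    """
    proj = jax.random.normal(key, (x.shape[-1], n_buckets // 2))
    rotated = jnp.matmul(x, proj)                     # [seq_len, n_buckets//2]
    # Concatenating +/- projections gives n_buckets candidate directions;
    # argmax picks the nearest one as the bucket id.
    rotated = jnp.concatenate([rotated, -rotated], axis=-1)
    return jnp.argmax(rotated, axis=-1)               # [seq_len]
```

After hashing, positions are sorted by bucket and attention is computed within fixed-size chunks of the sorted sequence, reducing the quadratic attention cost to roughly O(seq_len log seq_len). This is what lets Reformers train the bigger models that, as the summary notes, cause memory failures with Transformers.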