Experimental Comparison of Transformers and Reformers for Text Classification

In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance and use attention scores to capture the relationships between words in a sentence, which can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering their time and memory complexity. We present our evaluation and analysis of the architectures applicable to such improved performance. The experiments in this paper are performed in Trax on Mind in a Box with four different datasets and under different hyperparameter tunings. We observe that Transformers achieve better performance than Reformers in terms of accuracy and training speed for text classification. However, Reformers allow the training of bigger models, which would otherwise cause memory failures with Transformers.
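The abstract describes attention scores that capture pairwise relationships between words and can be computed in parallel. A minimal sketch of standard scaled dot-product attention, as used in the Transformer architecture the paper evaluates; this is illustrative only (not the authors' code), and all names and dimensions here are assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise word-to-word scores
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: a "sentence" of 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, weights = scaled_dot_product_attention(Q, K, V)
```

The `scores` matrix is quadratic in sequence length, which is the cost the Reformer reduces (via locality-sensitive-hashing attention and reversible layers), trading some accuracy for the ability to fit longer inputs and bigger models in memory, consistent with the paper's findings.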


Bibliographic Details
Main Authors: Roghayeh Soleymani, Julien Beaulieu, Jérémie Farret
Format: Article
Language: English
Published: IFSA Publishing, S.L. 2021-02-01
Series: Sensors & Transducers
Subjects: natural language processing, text classification, transformers, reformers, trax, mind in a box
Online Access: https://sensorsportal.com/HTML/DIGEST/february_2021/Vol_249/P_3212.pdf
_version_ 1797752217383469056
author Roghayeh Soleymani
Julien Beaulieu
Jérémie Farret
author_facet Roghayeh Soleymani
Julien Beaulieu
Jérémie Farret
author_sort Roghayeh Soleymani
collection DOAJ
description In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance and use attention scores to capture the relationships between words in a sentence, which can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering their time and memory complexity. We present our evaluation and analysis of the architectures applicable to such improved performance. The experiments in this paper are performed in Trax on Mind in a Box with four different datasets and under different hyperparameter tunings. We observe that Transformers achieve better performance than Reformers in terms of accuracy and training speed for text classification. However, Reformers allow the training of bigger models, which would otherwise cause memory failures with Transformers.
first_indexed 2024-03-12T17:00:03Z
format Article
id doaj.art-b0042ff9800b44a8943f782e54bb1378
institution Directory Open Access Journal
issn 2306-8515
1726-5479
language English
last_indexed 2024-03-12T17:00:03Z
publishDate 2021-02-01
publisher IFSA Publishing, S.L.
record_format Article
series Sensors & Transducers
spelling doaj.art-b0042ff9800b44a8943f782e54bb1378
Sensors & Transducers, Vol. 249, No. 2 (2021-02-01), pp. 110-118
Authors: Roghayeh Soleymani, Julien Beaulieu, Jérémie Farret (Inmind Technologies Inc.)
spellingShingle Roghayeh Soleymani
Julien Beaulieu
Jérémie Farret
Experimental Comparison of Transformers and Reformers for Text Classification
Sensors & Transducers
natural language processing
text classification
transformers
reformers
trax
mind in a box
title Experimental Comparison of Transformers and Reformers for Text Classification
title_full Experimental Comparison of Transformers and Reformers for Text Classification
title_fullStr Experimental Comparison of Transformers and Reformers for Text Classification
title_full_unstemmed Experimental Comparison of Transformers and Reformers for Text Classification
title_short Experimental Comparison of Transformers and Reformers for Text Classification
title_sort experimental comparison of transformers and reformers for text classification
topic natural language processing
text classification
transformers
reformers
trax
mind in a box
url https://sensorsportal.com/HTML/DIGEST/february_2021/Vol_249/P_3212.pdf
work_keys_str_mv AT roghayehsoleymani experimentalcomparisonoftransformersandreformersfortextclassification
AT julienbeaulieu experimentalcomparisonoftransformersandreformersfortextclassification
AT jeremiefarret experimentalcomparisonoftransformersandreformersfortextclassification