Read-All-in-Once (RAiO): Multi-Layer Contextual Architecture for Long-Text Machine Reading Comprehension
Main Authors: | Tuan-Anh Phan, Jason J. Jung, Khac-Hoai Nam Bui |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Access |
Subjects: | Natural language processing; long-text machine reading comprehension; question-answering system |
Online Access: | https://ieeexplore.ieee.org/document/10190566/ |
author | Tuan-Anh Phan; Jason J. Jung; Khac-Hoai Nam Bui |
collection | DOAJ |
description | Machine reading comprehension (MRC) is a cutting-edge technology in natural language processing (NLP) that focuses on teaching machines to read and understand texts, building on the emergence of large-scale datasets and neural network models. Recently, with the successful development of pre-trained transformer models (e.g., BERT), MRC has advanced significantly, surpassing human parity on several public datasets and being applied in various NLP tasks (e.g., QA systems). Nevertheless, long-document MRC remains a challenge, since transformer-based models are limited by input length; for instance, several well-known pre-trained language models such as BERT and RoBERTa are limited to 512 tokens. This study provides a simple new approach for long-document MRC. Recent state-of-the-art models follow a two-stage architecture for reading long texts in order to build local and global context representations. In this study, we present a new architecture that enriches the global information of the context in a single stage by exploiting the interaction between different levels of semantic units in the context (i.e., the sentence and word levels). We therefore name the proposed model RAiO (Read-All-in-Once). For the experiments, we evaluate RAiO on two benchmark long-document MRC datasets, NewsQA and NLQuAD. The experiments show promising results for the proposed approach compared with strong baselines in this research field. |
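The one-stage idea described in the abstract can be sketched roughly as follows. This is a toy numpy illustration under our own assumptions, not the authors' implementation: the random "embeddings" stand in for a transformer encoder, mean pooling stands in for the sentence-level encoder, and dot-product attention stands in for the cross-sentence interaction.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_tokens(tokens, dim=8):
    """Toy word-level embeddings (stand-in for a transformer encoder)."""
    return rng.normal(size=(len(tokens), dim))

def read_all_in_once(sentences, dim=8):
    """One-stage sketch: build word- and sentence-level representations
    for the whole context at once, then let the two levels interact."""
    # Word level: encode every sentence's tokens.
    word_reps = [embed_tokens(s.split(), dim) for s in sentences]
    # Sentence level: mean-pool word vectors into one vector per sentence.
    sent_reps = np.stack([w.mean(axis=0) for w in word_reps])   # (S, dim)
    # Global context: each sentence attends over all sentences.
    scores = sent_reps @ sent_reps.T                            # (S, S)
    attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    global_sent = attn @ sent_reps                              # (S, dim)
    # Interaction: enrich every word with its sentence's global vector.
    enriched = [w + global_sent[i] for i, w in enumerate(word_reps)]
    return enriched, global_sent

sentences = ["the model reads long documents",
             "it fuses word and sentence context",
             "answers are extracted as spans"]
enriched, global_sent = read_all_in_once(sentences)
print(global_sent.shape)  # (3, 8)
```

A span-extraction head for question answering would then score start/end positions over the enriched word-level vectors; that head is omitted here.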
format | Article |
id | doaj.art-c56b06d1d9a74b87aa5f0f9084a65f5a |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
publishDate | 2023-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
doi | 10.1109/ACCESS.2023.3298100 |
volume | 11 |
pages | 77873-77879 |
article_number | 10190566 |
author_details | Tuan-Anh Phan (https://orcid.org/0009-0003-7936-1385), Viettel Cyberspace Center, Viettel Group, Hanoi, Vietnam; Jason J. Jung (https://orcid.org/0000-0003-0050-7445), Department of Computer Engineering, Chung-Ang University, Seoul, South Korea; Khac-Hoai Nam Bui (https://orcid.org/0000-0002-3427-8460), Viettel Cyberspace Center, Viettel Group, Hanoi, Vietnam |
title | Read-All-in-Once (RAiO): Multi-Layer Contextual Architecture for Long-Text Machine Reading Comprehension |
topic | Natural language processing long-text machine reading comprehension question-answering system |
url | https://ieeexplore.ieee.org/document/10190566/ |