A More Robust Model to Answer Noisy Questions in KBQA

In practical applications, the raw input to a Knowledge Based Question Answering (KBQA) system may vary in form, expression, and source, so the actual input to the system can contain errors introduced by noise in the raw data and by processes such as transmission, transformation, and translation. It is therefore important to evaluate and enhance the robustness of a KBQA model to noisy questions. In this paper, we generate 29 datasets of noisy questions from the original SimpleQuestions dataset to evaluate and enhance the robustness of KBQA models, and we propose a model that is more robust to such questions. Compared with traditional methods, our main contributions are a method for generating datasets of different kinds of noisy questions to evaluate the robustness of a KBQA model, and a KBQA model that incorporates incremental learning and a Masked Language Model (MLM) into the question answering process, so that it is less affected by different kinds of noise in questions and achieves higher accuracy on the noisy datasets, demonstrating its robustness. Experimental results show that our model achieves an average accuracy of 78.1% on these datasets and outperforms the baseline BERT-based model by an average margin of 5.0% at a similar training cost. Further experiments show that our model is also compatible with other pre-trained models such as ALBERT and ELECTRA.
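
The abstract names its two key ingredients, noisy-question generation and MLM-based robustness, only at a high level. As a minimal illustrative sketch, not the authors' implementation, the Python snippet below shows one plausible way to inject character-level noise into a SimpleQuestions-style question and to repair a corrupted token with an off-the-shelf masked language model through the Hugging Face Transformers fill-mask pipeline; the noise scheme, the bert-base-uncased model choice, and the function names are assumptions made for illustration.

```python
# Illustrative sketch only; the paper's actual noise taxonomy and MLM usage
# are not reproduced here, and all names below are assumptions.
import random

from transformers import pipeline  # Hugging Face Transformers


def add_char_noise(question: str, p: float = 0.1, seed: int = 0) -> str:
    """Randomly drop or swap adjacent characters with probability p per character."""
    rng = random.Random(seed)
    chars = list(question)
    out, i = [], 0
    while i < len(chars):
        if chars[i].isalpha() and rng.random() < p:
            if i + 1 < len(chars) and rng.random() < 0.5:
                out.extend([chars[i + 1], chars[i]])  # swap with the next character
                i += 2
            else:
                i += 1  # drop the character
            continue
        out.append(chars[i])
        i += 1
    return "".join(out)


# Repair a corrupted token by masking it and letting a pre-trained MLM fill it in.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

clean = "which city was the author born in"
noisy = add_char_noise(clean, p=0.2)

tokens = noisy.split()
tokens[1] = fill_mask.tokenizer.mask_token  # suppose the second token was corrupted
candidates = fill_mask(" ".join(tokens))    # list of {score, token_str, sequence, ...}

print("noisy question:    ", noisy)
print("MLM reconstruction:", candidates[0]["sequence"])
```

In this sketch the MLM only repairs a single masked token; according to the abstract, the proposed model instead integrates incremental learning and the MLM directly into the question answering process.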

Bibliographic Details
Main Authors: Ziming Wang, Xirong Xu, Xinzi Li, Haochen Li, Li Zhu, Xiaopeng Wei
Author Affiliations: Ziming Wang, Xirong Xu (ORCID: 0000-0002-7558-3031), Xinzi Li, Haochen Li, and Xiaopeng Wei (ORCID: 0000-0002-8497-611X), School of Computer Science and Technology, Dalian University of Technology, Dalian, China; Li Zhu, School of Control Science and Engineering, Dalian University of Technology, Dalian, China
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Access, Vol. 11 (2023), pp. 22756-22766
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3252608
Subjects: Incremental learning; Knowledge base question answering; Machine learning; Natural language processing; Relation prediction; Robustness
Online Access: https://ieeexplore.ieee.org/document/10058914/
DOAJ Record: doaj.art-a63f73bae3b24562bf33feaed1403d9b