Embedding Logic Rules Into Recurrent Neural Networks

Incorporating prior knowledge into recurrent neural networks (RNNs) is of great importance for many natural language processing tasks. However, most prior knowledge comes in the form of structured knowledge and is difficult to exploit within the existing RNN framework. By extracting logic rules from the structured knowledge and embedding the extracted rules into the RNN, this paper proposes an effective framework for incorporating prior information into RNN models. First, we demonstrate that commonly used prior knowledge, including knowledge graphs, social graphs, and syntactic dependencies, can be decomposed into a set of logic rules. Second, we present a technique for embedding a set of logic rules into the RNN by way of feedback masks. Finally, we apply the proposed approach to sentiment classification and named entity recognition tasks. Extensive experimental results verify the effectiveness of the embedding approach, and the encouraging results suggest that it has potential for application to other NLP tasks.
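
As a reading aid, the sketch below shows one plausible interpretation of the feedback-mask idea described in the abstract: a logic rule extracted from a small knowledge graph determines a binary mask that gates which hidden units feed back into the recurrence at each step. The paper's actual formulation is not reproduced here; all names, shapes, and the specific gating rule are hypothetical.

```python
import numpy as np

# Step 1 (assumed): decompose structured knowledge into a logic rule.
# Here the toy "rule" is simply: tokens that appear as knowledge-graph
# entities keep full hidden-state feedback; other tokens have half of
# their feedback units zeroed out. This rule is invented for illustration.
knowledge_graph = {("Paris", "capital_of", "France")}
entities = {h for h, _, t in knowledge_graph} | {t for _, _, t in knowledge_graph}

def feedback_mask(token, hidden_size):
    mask = np.ones(hidden_size)
    if token not in entities:
        mask[hidden_size // 2:] = 0.0  # hypothetical rule: damp non-entities
    return mask

# Step 2 (assumed): a vanilla RNN step whose recurrence is gated by the mask,
# i.e. the mask decides which parts of h_prev influence the next state.
def rnn_step(x_t, h_prev, W_xh, W_hh, b, mask):
    return np.tanh(W_xh @ x_t + W_hh @ (mask * h_prev) + b)

# Toy run over a three-token sentence with random embeddings and weights.
rng = np.random.default_rng(0)
d_in, d_h = 4, 6
W_xh, W_hh = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h))
b, h = np.zeros(d_h), np.zeros(d_h)
for token in ["Paris", "loves", "France"]:
    x_t = rng.normal(size=d_in)  # stand-in word embedding
    h = rnn_step(x_t, h, W_xh, W_hh, b, feedback_mask(token, d_h))
print(h.round(3))
```

In this sketch the rule only distinguishes entities from non-entities; the rules the paper extracts from knowledge graphs, social graphs, and syntactic dependencies would induce richer, task-specific masks.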

Bibliographic Details
Main Authors: Bingfeng Chen (ORCID: 0000-0002-9449-5424), Zhifeng Hao (ORCID: 0000-0002-9257-2895), Xiaofeng Cai, Ruichu Cai (ORCID: 0000-0001-8972-167X), Wen Wen, Jian Zhu, Guangqiang Xie (all: Department of Computer Science, Guangdong University of Technology, Guangzhou, China)
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access, vol. 7, pp. 14938-14946 (article 8610074)
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2892140
Subjects: RNN; logic rules; sentiment classification; named entity recognition
Online Access: https://ieeexplore.ieee.org/document/8610074/