Capsule Network Improved Multi-Head Attention for Word Sense Disambiguation
Word sense disambiguation (WSD) is one of the core problems in natural language processing (NLP): the task of mapping an ambiguous word to its correct meaning in a specific context. Recent studies have shown lively interest in incorporating the sense definition (gloss) into neural networks, which m...
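To make the task concrete, below is a minimal, hypothetical sketch of gloss-based WSD using a simple Lesk-style word-overlap heuristic. It is not the capsule-network multi-head attention model described in the article; the `disambiguate` function, sense keys, and example glosses are illustrative assumptions only.

```python
# Minimal sketch of gloss-based word sense disambiguation (Lesk-style overlap).
# NOT the article's capsule-network multi-head attention method; senses and
# glosses below are hypothetical examples.

def disambiguate(context_words, sense_glosses):
    """Pick the sense whose gloss shares the most words with the context."""
    context = {w.lower() for w in context_words}
    best_sense, best_overlap = None, -1
    for sense, gloss in sense_glosses.items():
        overlap = len(context & set(gloss.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

if __name__ == "__main__":
    glosses = {
        "bank%finance": "a financial institution that accepts deposits and lends money",
        "bank%river": "sloping land beside a body of water such as a river",
    }
    sentence = "She deposited the money at the bank near her office".split()
    print(disambiguate(sentence, glosses))  # -> bank%finance
```

Neural gloss-based approaches replace this surface word overlap with learned representations of the context and each gloss, but the underlying idea of scoring candidate senses against their definitions is the same.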
Main Authors: Jinfeng Cheng, Weiqin Tong, Weian Yan
Format: Article
Language: English
Published: MDPI AG, 2021-03-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/11/6/2488
Similar Items
- Multi-Head Self-Attention Gated-Dilated Convolutional Neural Network for Word Sense Disambiguation
  by: Chun-Xiang Zhang, et al.
  Published: (2023-01-01)
- Word Sense Disambiguation Based on RegNet With Efficient Channel Attention and Dilated Convolution
  by: Chun-Xiang Zhang, et al.
  Published: (2023-01-01)
- Biomedical word sense disambiguation with bidirectional long short-term memory and attention-based neural networks
  by: Canlin Zhang, et al.
  Published: (2019-12-01)
- Unsupervised Word Sense Disambiguation Using Word Embeddings
  by: Behzad Moradi, et al.
  Published: (2019-11-01)
- Biomedical Word Sense Disambiguation Based on Graph Attention Networks
  by: Chun-Xiang Zhang, et al.
  Published: (2022-01-01)