Dependency-based Siamese long short-term memory network for learning sentence representations.
Textual representations play an important role in the field of natural language processing (NLP). The efficiency of NLP tasks, such as text comprehension and information extraction, can be significantly improved with proper textual representations. As neural networks are gradually applied to learn t...
| Main Authors: | Wenhao Zhu, Tengjun Yao, Jianyue Ni, Baogang Wei, Zhiguo Lu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2018-01-01 |
| Series: | PLoS ONE |
| Online Access: | http://europepmc.org/articles/PMC5841810?pdf=render |
Similar Items
- Improving Sentence Representations via Component Focusing
  by: Xiaoya Yin, et al.
  Published: (2020-02-01)
- Improve word embedding using both writing and pronunciation.
  by: Wenhao Zhu, et al.
  Published: (2018-01-01)
- Short-term memory and sentence comprehension in Catalan aphasia
  by: Io Salmons, et al.
  Published: (2022-10-01)
- Short-term recall of sentences: conceptual representation or plausible reconstruction
  by: Johnson, Susan Carol
  Published: (2005)
- An Enhanced Focused Web Crawler for Biomedical Topics Using Attention Enhanced Siamese Long Short Term Memory Networks
  by: Joe Dhanith Pal Nesamony Rose Mary, et al.
  Published: (2022-01-01)