Symbolic, Distributed, and Distributional Representations for Natural Language Processing in the Era of Deep Learning: A Survey
Natural language is inherently a discrete symbolic representation of human knowledge. Recent advances in machine learning (ML) and in natural language processing (NLP) seem to contradict the above intuition: discrete symbols are fading away, erased by vectors or tensors called distributed and distributional representations. […]
| Field | Value |
|---|---|
| Main Authors | Lorenzo Ferrone, Fabio Massimo Zanzotto |
| Format | Article |
| Language | English |
| Published | Frontiers Media S.A., 2020-01-01 |
| Series | Frontiers in Robotics and AI |
| Online Access | https://www.frontiersin.org/article/10.3389/frobt.2019.00153/full |
Similar Items
- A Gloss Composition and Context Clustering Based Distributed Word Sense Representation Model
  by: Tao Chen, et al.
  Published: 2015-08-01
- Graded hyponymy for compositional distributional semantics
  by: Dea Bankova, et al.
  Published: 2019-03-01
- CYK Parsing over Distributed Representations
  by: Fabio Massimo Zanzotto, et al.
  Published: 2020-10-01
- Compositional Distributional Semantics with Syntactic Dependencies and Selectional Preferences
  by: Pablo Gamallo
  Published: 2021-06-01
- Memory Model for Morphological Semantics of Visual Stimuli Using Sparse Distributed Representation
  by: Kyuchang Kang, et al.
  Published: 2021-11-01