High-order low-rank tensors for semantic role labeling
Main Authors:
Other Authors:
Format: Article
Language: en_US
Published: Association for Computational Linguistics, 2017
Online Access: http://hdl.handle.net/1721.1/110804 https://orcid.org/0000-0003-4644-3088 https://orcid.org/0000-0003-3121-0185 https://orcid.org/0000-0002-2921-8201
Summary: This paper introduces a tensor-based approach to semantic role labeling (SRL). The motivation behind the approach is to automatically induce a compact feature representation for words and their relations, tailoring them to the task. In this sense, our dimensionality reduction method provides a clear alternative to the traditional feature engineering approach used in SRL. To capture meaningful interactions between the argument, predicate, their syntactic path, and the corresponding role label, we first compress each feature representation to a lower-dimensional space prior to assessing their interactions. This corresponds to using an overall cross-product feature representation and maintaining the associated parameters as a four-way low-rank tensor. The tensor parameters are optimized for SRL performance using standard online algorithms. Our tensor-based approach rivals the best-performing system on the CoNLL-2009 shared task. In addition, we demonstrate that adding the representation tensor to a competitive tensor-free model yields a 2% absolute increase in F-score.
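The scoring scheme the summary describes can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the dimensions, rank, and random initialization are all assumptions. A rank-r four-way tensor over (argument, predicate, path, role) features is equivalent, in CP (CANDECOMP/PARAFAC) form, to projecting each feature vector to an r-dimensional space and summing the componentwise products of the four projections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed feature dimensions and tensor rank (hypothetical values).
d_arg, d_pred, d_path, d_role = 50, 50, 30, 10
rank = 8

# One projection matrix per mode of the four-way parameter tensor.
U_arg = rng.standard_normal((rank, d_arg)) * 0.1
U_pred = rng.standard_normal((rank, d_pred)) * 0.1
U_path = rng.standard_normal((rank, d_path)) * 0.1
U_role = rng.standard_normal((rank, d_role)) * 0.1

def score(arg, pred, path, role):
    """Low-rank tensor contraction: project each feature vector to the
    rank-dimensional space, then sum the componentwise products."""
    return float(np.sum((U_arg @ arg) * (U_pred @ pred)
                        * (U_path @ path) * (U_role @ role)))

# Score one random candidate (argument, predicate, path, role) tuple.
x_arg = rng.standard_normal(d_arg)
x_pred = rng.standard_normal(d_pred)
x_path = rng.standard_normal(d_path)
x_role = rng.standard_normal(d_role)
print(score(x_arg, x_pred, x_path, x_role))
```

The point of the low-rank form is that the full cross-product parameter tensor (here 50 x 50 x 30 x 10 entries) is never materialized; only the four small projection matrices are stored and trained, which is what makes online optimization of the tensor parameters practical.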