Systematic comparison of neural architectures and training approaches for open information extraction

Bibliographic Details
Main Authors: Hohenecker, P, Mtumbuka, F, Kocijan, V, Lukasiewicz, T
Format: Conference item
Language: English
Published: Association for Computational Linguistics 2020
Description: The goal of open information extraction (OIE) is to extract facts from natural language text, and to represent them as structured triples of the form (subject, predicate, object). For example, given the sentence »Beethoven composed the Ode to Joy.«, we are expected to extract the triple (Beethoven, composed, Ode to Joy). In this work, we systematically compare different neural network architectures and training approaches, and improve the performance of the currently best models on the OIE16 benchmark (Stanovsky and Dagan, 2016) by 0.421 F1 score and 0.420 AUCPR, respectively, in our experiments (i.e., by more than 200% in both cases). Furthermore, we show that appropriate problem and loss formulations often affect the performance more than the network architecture.
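The triple representation described in the abstract can be illustrated with a minimal sketch. This is not the paper's neural approach: the `Triple` type and the naive `extract_triple` function below are hypothetical names, and splitting a sentence on a known predicate word is only a toy stand-in for the neural sequence-tagging models the work actually compares.

```python
from typing import NamedTuple, Optional

# An OIE fact as described in the abstract: (subject, predicate, object).
class Triple(NamedTuple):
    subject: str
    predicate: str
    object: str

# Naive illustrative extractor (NOT the paper's method): splits a sentence
# on a known predicate word. Real OIE systems predict the predicate and
# argument spans with learned models instead of a fixed predicate list.
def extract_triple(sentence: str, predicate: str) -> Optional[Triple]:
    text = sentence.rstrip(".").strip()
    if f" {predicate} " not in text:
        return None
    subject, obj = text.split(f" {predicate} ", 1)
    # Drop a leading determiner from the object, as in "the Ode to Joy".
    if obj.lower().startswith("the "):
        obj = obj[4:]
    return Triple(subject.strip(), predicate, obj.strip())

print(extract_triple("Beethoven composed the Ode to Joy.", "composed"))
# -> Triple(subject='Beethoven', predicate='composed', object='Ode to Joy')
```

This reproduces the abstract's example output (Beethoven, composed, Ode to Joy) for that one sentence, but fails on anything syntactically richer, which is precisely why the paper studies neural architectures and training objectives for the task.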
Identifier: oxford-uuid:e7231759-fcd8-404e-9a8e-e79c789846a4
Institution: University of Oxford