Sequence to better sequence: Continuous revision of combinatorial structures

© 2017 by the author(s). We present a model that, after learning on observations of (sequence, outcome) pairs, can be efficiently used to revise a new sequence in order to improve its associated outcome. Our framework requires neither example improvements, nor additional evaluation of outcomes for proposed revisions. To avoid combinatorial-search over sequence elements, we specify a generative model with continuous latent factors, which is learned via joint approximate inference using a recurrent variational autoencoder (VAE) and an outcome-predicting neural network module. Under this model, gradient methods can be used to efficiently optimize the continuous latent factors with respect to inferred outcomes. By appropriately constraining this optimization and using the VAE decoder to generate a revised sequence, we ensure the revision is fundamentally similar to the original sequence, is associated with better outcomes, and looks natural. These desiderata are proven to hold with high probability under our approach, which is empirically demonstrated for revising natural language sentences.
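The revision procedure the abstract describes — gradient ascent on the continuous latent code with respect to a learned outcome predictor, constrained to stay close to the original code before decoding — can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the linear outcome model and the L2-ball constraint radius below are illustrative stand-ins for the paper's neural outcome module and its similarity constraint.

```python
import numpy as np

# Toy stand-in for the learned outcome-predicting module F(z).
# A linear model keeps the gradient trivial (it is just w), so the
# constrained-ascent loop can be shown end to end.
def predict_outcome(z, w):
    return float(w @ z)

def revise_latent(z0, w, radius=0.5, step=0.1, iters=100):
    """Gradient ascent on the inferred outcome F(z), projected back onto
    an L2 ball around the original code z0 so that the decoded revision
    stays fundamentally similar to the input sequence."""
    z = z0.copy()
    for _ in range(iters):
        z = z + step * w                  # gradient of w @ z w.r.t. z is w
        delta = z - z0
        norm = np.linalg.norm(delta)
        if norm > radius:                 # enforce the similarity constraint
            z = z0 + delta * (radius / norm)
    return z

w = np.array([1.0, -2.0, 0.5])            # hypothetical outcome weights
z0 = np.zeros(3)                          # latent code of the input sequence
z_rev = revise_latent(z0, w)

# The revised code respects the constraint and improves the outcome;
# in the full model z_rev would then be passed through the VAE decoder.
assert np.linalg.norm(z_rev - z0) <= 0.5 + 1e-9
assert predict_outcome(z_rev, w) > predict_outcome(z0, w)
```

In the paper's setting the predictor is a neural network trained jointly with the recurrent VAE, and the revised latent code is decoded back into a sequence; the projection step here is one simple way to realize the "appropriately constrained" optimization the abstract refers to.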

Bibliographic Details
Main Authors: Jaakkola, Tommi, Gifford, David, Mueller, Jonas
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: 2021
Online Access: https://hdl.handle.net/1721.1/137633
Citation: Jaakkola, Tommi, Gifford, David and Mueller, Jonas. 2017. "Sequence to better sequence: Continuous revision of combinatorial structures."
Conference Paper: http://proceedings.mlr.press/v70/mueller17a.html
License: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)