On Decoding Strategies for Neural Text Generators

Abstract: When generating text from probabilistic models, the chosen decoding strategy has a profound effect on the resulting text. Yet the properties elicited by various decoding strategies do not always transfer across natural language generation tasks. For example, while mode-seeking...
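The abstract contrasts mode-seeking decoding (e.g., beam search) with alternative strategies whose behavior varies by task. The sketch below illustrates that distinction in practice; it assumes the Hugging Face transformers library and GPT-2 as stand-ins (neither named in this record) and is not the paper's experimental setup.

```python
# Minimal sketch contrasting a mode-seeking decoder (beam search) with a
# stochastic decoder (nucleus sampling). Assumes Hugging Face `transformers`
# and GPT-2 purely for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Once upon a time"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    # Mode-seeking: beam search approximates the highest-probability continuation.
    beam_ids = model.generate(
        input_ids,
        max_new_tokens=40,
        num_beams=5,
        do_sample=False,
        pad_token_id=tokenizer.eos_token_id,
    )

    # Stochastic: nucleus (top-p) sampling draws from a truncated distribution,
    # trading some likelihood for diversity.
    sample_ids = model.generate(
        input_ids,
        max_new_tokens=40,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )

print("Beam search:", tokenizer.decode(beam_ids[0], skip_special_tokens=True))
print("Nucleus sampling:", tokenizer.decode(sample_ids[0], skip_special_tokens=True))
```

On open-ended prompts like the one above, the beam-search output tends to be more repetitive while the sampled output is more varied, which is the kind of task-dependent behavior the abstract alludes to.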


Bibliographic Details
Main Authors: Gian Wiher, Clara Meister, Ryan Cotterell
Format: Article
Language: English
Published: The MIT Press, 2022-01-01
Series: Transactions of the Association for Computational Linguistics
Online Access: https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00502/113024/On-Decoding-Strategies-for-Neural-Text-Generators
