Revisiting Dropout: Escaping Pressure for Training Neural Networks with Multiple Costs

A common approach to jointly learning multiple tasks with a shared structure is to optimize the model with a combined landscape of multiple sub-costs. However, the gradients derived from each sub-cost often conflict in cost plateaus, resulting in a subpar optimum. In this work, we shed light on such gradi...
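The gradient conflict the abstract alludes to can be made concrete with a minimal sketch (not taken from the paper; the toy parameters theta and the sub-costs cost_a and cost_b are purely illustrative). When two sub-costs pull a shared parameter in opposing directions, their gradients have a negative dot product and the combined gradient can cancel to zero, leaving optimization stuck on a plateau:

    import torch

    # Toy shared parameter optimized against two hypothetical sub-costs
    # whose minima lie in opposite directions.
    theta = torch.zeros(2, requires_grad=True)

    def cost_a(t):
        return ((t - torch.tensor([1.0, 0.0])) ** 2).sum()

    def cost_b(t):
        return ((t - torch.tensor([-1.0, 0.0])) ** 2).sum()

    # Per-sub-cost gradients at the current point.
    g_a = torch.autograd.grad(cost_a(theta), theta)[0]  # [-2., 0.]
    g_b = torch.autograd.grad(cost_b(theta), theta)[0]  # [ 2., 0.]

    # A negative cosine between the sub-cost gradients indicates conflict;
    # here the gradients are fully opposed and their sum vanishes.
    cos = torch.dot(g_a, g_b) / (g_a.norm() * g_b.norm())
    print(f"cosine(g_a, g_b) = {cos.item():.2f}")        # -1.00
    print(f"combined gradient = {(g_a + g_b).tolist()}")  # [0.0, 0.0]

At such a point the summed cost surface is locally flat even though each individual sub-cost still has a nonzero gradient, which is the failure mode the paper's dropout-based "escaping pressure" is proposed to address.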


Bibliographic Details
Main Authors: Sangmin Woo, Kangil Kim, Junhyug Noh, Jong-Hun Shin, Seung-Hoon Na
Format: Article
Language: English
Published: MDPI AG 2021-04-01
Series: Electronics
Online Access: https://www.mdpi.com/2079-9292/10/9/989