Revisiting Dropout: Escaping Pressure for Training Neural Networks with Multiple Costs
A common approach to jointly learning multiple tasks with a shared structure is to optimize the model over a combined landscape of multiple sub-costs. However, gradients derived from each sub-cost often conflict in cost plateaus, resulting in a subpar optimum. In this work, we shed light on such gradi...
Main Authors: | Sangmin Woo, Kangil Kim, Junhyug Noh, Jong-Hun Shin, Seung-Hoon Na |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-04-01 |
Series: | Electronics |
Online Access: | https://www.mdpi.com/2079-9292/10/9/989 |
Similar Items
- Less Is More: Adaptive Trainable Gradient Dropout for Deep Neural Networks
  by: Christos Avgerinos, et al.
  Published: (2023-01-01)
- Explanatory Factors of University Dropout Explored Through Artificial Intelligence
  by: Juan Sebastián Parra-Sánchez, et al.
  Published: (2023-06-01)
- Scoping review on interventions, actions, and policies affecting return to school and preventing school dropout in primary school
  by: Ayoub Eslamian, et al.
  Published: (2023-01-01)
- Keeping students in school: a guide to effective dropout prevention programs and services
  by: Orr, Margaret Terry
  Published: (1987)
- Psychometric properties of a dropout prediction tool for students in Andalusia
  by: Antonio Hernández-Fernández, et al.
  Published: (2023-07-01)