Learning to Balance Local Losses via Meta-Learning
Standard training of deep neural networks relies on a single, fixed global loss function. Dynamic loss functions have recently been proposed for more effective training. However, a dynamic global loss function is not flexible enough to train the layers of a complex deep neural network differently. In thi...
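As a rough illustration of the idea in the abstract (this is a hypothetical sketch, not the algorithm from the paper), per-layer losses can be balanced by learnable weights that are normalized with a softmax and adapted by a meta-gradient computed on held-out ("meta") losses:

```python
import math

# Hypothetical sketch: combine per-layer losses with learnable weights
# (softmax over logits), then adapt the logits so the combined loss on a
# held-out "meta" batch decreases. All names here are illustrative.

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def combined_loss(layer_losses, logits):
    """Weighted sum of per-layer losses; weights sum to 1 via softmax."""
    ws = softmax(logits)
    return sum(w * l for w, l in zip(ws, layer_losses))

def meta_update(logits, meta_losses, lr=0.1, eps=1e-5):
    """One meta step: finite-difference gradient of the combined loss
    on the meta losses, followed by a gradient-descent step on the logits."""
    base = combined_loss(meta_losses, logits)
    new_logits = []
    for i in range(len(logits)):
        bumped = list(logits)
        bumped[i] += eps
        grad = (combined_loss(meta_losses, bumped) - base) / eps
        new_logits.append(logits[i] - lr * grad)
    return new_logits

# Toy run: three layers, layer 2 has the smallest meta loss, so the
# meta updates shift weight toward it.
logits = [0.0, 0.0, 0.0]
meta_losses = [2.0, 1.0, 0.5]
for _ in range(50):
    logits = meta_update(logits, meta_losses)
weights = softmax(logits)
```

In this toy setting the update rule simply concentrates weight on whichever layer loss is smallest on the meta batch; a real method would differentiate through the model's training step rather than through the losses directly.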
Main Authors: Seungdong Yoa, Minkyu Jeon, Youngjin Oh, Hyunwoo J. Kim
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9541196/
Similar Items
- Learning Non-Parametric Surrogate Losses With Correlated Gradients
  by: Seungdong Yoa, et al.
  Published: (2021-01-01)
- Self-Supervised Learning for Anomaly Detection With Dynamic Local Augmentation
  by: Seungdong Yoa, et al.
  Published: (2021-01-01)
- The Balanced Loss Curriculum Learning
  by: Wei Qin, et al.
  Published: (2020-01-01)
- Enhancing Model Agnostic Meta-Learning via Gradient Similarity Loss
  by: Jae-Ho Tak, et al.
  Published: (2024-01-01)
- MetaSeg: A survey of meta-learning for image segmentation
  by: Jiaxing Sun, et al.
  Published: (2021-01-01)