Learning to Balance Local Losses via Meta-Learning
The standard training of deep neural networks relies on a global and fixed loss function. For more effective training, dynamic loss functions have recently been proposed. However, a dynamic global loss function is not flexible enough to differentially train the layers of complex deep neural networks. In this paper, we propose a general framework that learns to adaptively train each layer of deep neural networks via meta-learning. Our framework leverages the local error signals from layers and identifies which layer needs to be trained more at every iteration. The proposed method also improves the local loss function with our minibatch-wise dropout and cross-validation loop to alleviate meta-overfitting. Experiments show that our method achieves competitive performance compared with state-of-the-art methods on popular benchmark datasets for image classification: CIFAR-10 and CIFAR-100. Surprisingly, our method enables training deep neural networks without skip-connections using dynamically weighted local loss functions.
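The abstract describes dynamically weighted local (layer-wise) losses whose weights are adjusted by meta-learning. Below is a minimal sketch of what such a setup can look like in PyTorch; it is not the authors' implementation, and the module names, layer sizes, detach-based local training, and softmax-normalized loss weights are illustrative assumptions based only on the abstract.

```python
# A minimal, hedged sketch of dynamically weighted local (layer-wise) losses.
# NOT the authors' implementation; names, sizes, and the weighting scheme are
# illustrative assumptions based on the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalLossNet(nn.Module):
    """Each block gets its own auxiliary classifier, i.e. a local error signal."""
    def __init__(self, in_dim=32 * 32 * 3, hidden=256, num_classes=10, num_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim if i == 0 else hidden, hidden), nn.ReLU())
             for i in range(num_blocks)]
        )
        self.aux_heads = nn.ModuleList(
            [nn.Linear(hidden, num_classes) for _ in range(num_blocks)]
        )
        # Learnable logits: their softmax decides how much each layer's
        # local loss contributes at the current iteration.
        self.loss_logits = nn.Parameter(torch.zeros(num_blocks))

    def forward(self, x, y):
        weights = F.softmax(self.loss_logits, dim=0)
        total_loss, h = 0.0, x
        for block, head, w in zip(self.blocks, self.aux_heads, weights):
            h = block(h)
            local_loss = F.cross_entropy(head(h), y)   # local error signal
            total_loss = total_loss + w * local_loss   # dynamically weighted
            h = h.detach()  # local training: no gradient flows to earlier blocks
        return total_loss

# Usage sketch with random data in place of CIFAR images.
model = LocalLossNet()
x = torch.randn(8, 32 * 32 * 3)
y = torch.randint(0, 10, (8,))
loss = model(x, y)
loss.backward()
```

In the paper's framework, the loss weights would be updated against a separate meta objective (e.g., the cross-validation loop mentioned in the abstract) rather than jointly with the blocks; here they are plain learnable parameters for brevity.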
Main Authors: | Seungdong Yoa, Minkyu Jeon, Youngjin Oh, Hyunwoo J. Kim |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2021-01-01 |
Series: | IEEE Access |
Subjects: | Deep learning, image classification, machine learning, meta-learning |
Online Access: | https://ieeexplore.ieee.org/document/9541196/ |
author | Seungdong Yoa, Minkyu Jeon, Youngjin Oh, Hyunwoo J. Kim |
collection | DOAJ |
description | The standard training of deep neural networks relies on a global and fixed loss function. For more effective training, dynamic loss functions have recently been proposed. However, a dynamic global loss function is not flexible enough to differentially train the layers of complex deep neural networks. In this paper, we propose a general framework that learns to adaptively train each layer of deep neural networks via meta-learning. Our framework leverages the local error signals from layers and identifies which layer needs to be trained more at every iteration. The proposed method also improves the local loss function with our minibatch-wise dropout and cross-validation loop to alleviate meta-overfitting. Experiments show that our method achieves competitive performance compared with state-of-the-art methods on popular benchmark datasets for image classification: CIFAR-10 and CIFAR-100. Surprisingly, our method enables training deep neural networks without skip-connections using dynamically weighted local loss functions. |
format | Article |
id | doaj.art-17727a9947094a4184d6a5dc3b1b7549 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
publishDate | 2021-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
doi | 10.1109/ACCESS.2021.3113934 |
citation | IEEE Access, vol. 9, pp. 130834-130844, 2021 (IEEE document 9541196) |
authors | Seungdong Yoa (ORCID: 0000-0002-2982-0884), Minkyu Jeon (ORCID: 0000-0003-0572-6065), Youngjin Oh (ORCID: 0000-0003-1546-8469), Hyunwoo J. Kim (ORCID: 0000-0002-2181-9264); all with the Department of Computer Science, Korea University, Seoul, Republic of Korea |
title | Learning to Balance Local Losses via Meta-Learning |
topic | Deep learning, image classification, machine learning, meta-learning |
url | https://ieeexplore.ieee.org/document/9541196/ |