HyAdamC: A New Adam-Based Hybrid Optimization Algorithm for Convolution Neural Networks

Bibliographic Details
Main Authors: Kyung-Soo Kim, Yong-Suk Choi
Format: Article
Language:English
Published: MDPI AG 2021-06-01
Series:Sensors
Subjects: deep learning; optimization; first-order optimization; gradient descent; adam optimization; convolution neural networks
Online Access:https://www.mdpi.com/1424-8220/21/12/4054
collection DOAJ
description As the performance of devices that conduct large-scale computations has rapidly improved, deep learning models have been successfully applied in a wide range of applications. In particular, convolutional neural networks (CNNs) have shown remarkable performance in image processing tasks such as image classification and segmentation. Accordingly, more stable and robust optimization methods are required to train them effectively. However, the traditional optimizers used in deep learning still show unsatisfactory training performance for models with many layers and weights. In this paper, we therefore propose a new Adam-based hybrid optimization method, called HyAdamC, for training CNNs effectively. HyAdamC uses three new velocity control functions to carefully adjust its search strength in terms of initial, short-term, and long-term velocities. Moreover, HyAdamC employs an adaptive coefficient computation method to prevent the search direction determined by the first momentum from being distorted by outlier gradients. These components are then combined into one hybrid method. In our experiments, HyAdamC showed not only notable test accuracies but also significantly stable and robust optimization behavior when training various CNN models. Furthermore, we found that HyAdamC can be applied not only to image classification but also to image segmentation tasks.
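The description above characterizes HyAdamC only at a high level; its three velocity control functions and adaptive coefficient computation are defined in the linked paper. For orientation, the following is a minimal sketch of the standard Adam update that Adam-based optimizers such as HyAdamC extend. This is plain Adam, not HyAdamC itself, and the names (`adam_step`, `beta1`, `beta2`) are the conventional Adam parameters rather than anything taken from this record:

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the standard Adam update. Adam-based methods modify
    how the moments m (search direction) and v (per-weight scale) are
    formed or weighted; the base update looks like this."""
    m = beta1 * m + (1 - beta1) * grad          # first moment: momentum-smoothed gradient
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment: gradient magnitude estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction for zero initialization
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 1.0
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)
```

Because the effective per-step displacement is bounded by roughly `lr`, the iterate drifts steadily toward the minimum; the abstract's point is that when individual gradients are outliers, the first moment `m` (and hence the search direction) can be distorted, which is what HyAdamC's adaptive coefficient computation is designed to suppress.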
id doaj.art-efb91a3ea17940dd8733fbaa6098bd91
institution Directory Open Access Journal
issn 1424-8220
spelling Kyung-Soo Kim (Center for Computational Social Science, Hanyang University, Seoul 04763, Korea); Yong-Suk Choi (Department of Computer Science and Engineering, Hanyang University, Seoul 04763, Korea). Sensors 21(12):4054, 2021-06-01. doi:10.3390/s21124054
topic deep learning
optimization
first-order optimization
gradient descent
adam optimization
convolution neural networks