Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks
As the performance of computing devices such as graphics processing units (GPUs) has improved dramatically, many deep neural network models, especially convolutional neural networks (CNNs), have been widely applied to various applications such as image classification, semantic segmentation, and object recognition.
Main Authors: | Kyung Soo Kim, Yong Suk Choi |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Access |
Subjects: | Machine learning; deep learning; convolutional neural networks; optimization methods; gradient methods; image classification |
Online Access: | https://ieeexplore.ieee.org/document/10197412/ |
_version_ | 1797752031090311168 |
---|---|
author | Kyung Soo Kim; Yong Suk Choi
author_facet | Kyung Soo Kim; Yong Suk Choi
author_sort | Kyung Soo Kim |
collection | DOAJ |
description | As the performance of computing devices such as graphics processing units (GPUs) has improved dramatically, many deep neural network models, especially convolutional neural networks (CNNs), have been widely applied to various applications such as image classification, semantic segmentation, and object recognition. However, effective first-order optimization methods for CNNs have rarely been studied, although many CNN models have been successfully developed. Accordingly, this paper investigates various advanced adaptive solution search methods and proposes a new first-order optimization algorithm for CNNs called Adam-ASC. Our approach uses four sophisticated adaptive solution search methods to adjust its search strength in the complicated large-dimensional weight solution space spanned by a loss function. At the same time, we explain how they can be combined compensatively to form a complete optimizer with a detailed implementation. From the experiments, we found that our Adam-ASC can significantly improve the image recognition performance of practical CNNs in both the image classification and segmentation tasks. These experimental results show that the four fundamental methods of Adam-ASC and their compensative combination strategy play a crucial role in training CNNs by effectively finding their optimal weights. |
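The abstract describes Adam-ASC as a first-order optimizer that extends Adam with four adaptive solution-search methods. Those four methods and their compensative combination are not detailed in this record, so the sketch below shows only the standard Adam update rule that Adam-ASC builds on; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the base Adam update rule (bias-corrected moment estimates).

    w: current weights; grad: gradient of the loss at w;
    m, v: running first/second moment estimates; t: 1-based step count.
    """
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2       # second-moment (uncentered variance)
    m_hat = m / (1 - beta1**t)                  # bias correction for m
    v_hat = v / (1 - beta2**t)                  # bias correction for v
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps) # adaptive per-parameter step
    return w, m, v

# Toy usage: minimize f(w) = w^2 from w = 1.
w, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    grad = 2 * w
    w, m, v = adam_step(w, grad, m, v, t, lr=0.05)
```

Adam-ASC, per the abstract, would additionally modulate the search strength of this update in the high-dimensional weight space; that control logic is beyond what this record specifies.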
first_indexed | 2024-03-12T16:57:24Z |
format | Article |
id | doaj.art-230b47d8c8e5434aa8471aac54bdeb89 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-03-12T16:57:24Z |
publishDate | 2023-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-230b47d8c8e5434aa8471aac54bdeb89; 2023-08-07T23:00:21Z; eng; IEEE; IEEE Access; ISSN 2169-3536; 2023-01-01; Vol. 11, pp. 80656-80679; DOI 10.1109/ACCESS.2023.3300034; article 10197412; Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks; Kyung Soo Kim (https://orcid.org/0000-0002-1044-3089), Department of Computer Engineering, Kumoh National Institute of Technology, Gumi, Republic of Korea; Yong Suk Choi (https://orcid.org/0000-0002-9042-0599), Department of Computer Science and Engineering, Hanyang University, Seoul, Republic of Korea; https://ieeexplore.ieee.org/document/10197412/; Machine learning; deep learning; convolutional neural networks; optimization methods; gradient methods; image classification |
spellingShingle | Kyung Soo Kim; Yong Suk Choi; Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks; IEEE Access; Machine learning; deep learning; convolutional neural networks; optimization methods; gradient methods; image classification
title | Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks |
title_full | Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks |
title_fullStr | Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks |
title_full_unstemmed | Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks |
title_short | Advanced First-Order Optimization Algorithm With Sophisticated Search Control for Convolutional Neural Networks |
title_sort | advanced first order optimization algorithm with sophisticated search control for convolutional neural networks |
topic | Machine learning; deep learning; convolutional neural networks; optimization methods; gradient methods; image classification
url | https://ieeexplore.ieee.org/document/10197412/ |
work_keys_str_mv | AT kyungsookim advancedfirstorderoptimizationalgorithmwithsophisticatedsearchcontrolforconvolutionalneuralnetworks AT yongsukchoi advancedfirstorderoptimizationalgorithmwithsophisticatedsearchcontrolforconvolutionalneuralnetworks |