Control the number of skip‐connects to improve robustness of the NAS algorithm
Abstract Recently, gradient‐based neural architecture search has made remarkable progress, offering high efficiency and fast convergence. However, gradient‐based NAS algorithms exhibit two common problems. First, as training time increases, the NAS algorithm...
Main Authors: | Bao Feng Zhang, Guo Qiang Zhou |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley 2021-08-01 |
Series: | IET Computer Vision |
Subjects: | gradient methods, image classification, learning (artificial intelligence), neural nets, unsupervised learning |
Online Access: | https://doi.org/10.1049/cvi2.12036 |
_version_ | 1811319966776950784 |
---|---|
author | Bao Feng Zhang Guo Qiang Zhou |
author_facet | Bao Feng Zhang Guo Qiang Zhou |
author_sort | Bao Feng Zhang |
collection | DOAJ |
description | Abstract Recently, gradient‐based neural architecture search has made remarkable progress, offering high efficiency and fast convergence. However, gradient‐based NAS algorithms exhibit two common problems. First, as training time increases, the NAS algorithm tends to favour the skip‐connect operation, leading to performance degradation and unstable results. Second, computing resources are not allocated reasonably to valuable candidate network models. These two points make it difficult to search for the optimal sub‐network and lead to poor stability. To address them, the super‐net is pre‐trained so that each operation has an equal opportunity to develop its strength, providing a fair competition condition for the convergence of the architecture parameters. In addition, a skip‐controller is proposed to ensure that each sampled sub‐network has an appropriate number of skip‐connects. Experiments were performed on three mainstream datasets, CIFAR‐10, CIFAR‐100 and ImageNet, on which the improved method achieves comparable results with higher accuracy and stronger robustness. |
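The skip‐controller described in the abstract can be pictured as a rejection‐sampling loop that discards candidate sub‐networks dominated by skip‐connect operations. The sketch below is a hypothetical illustration of that idea only; the operation names, edge count, and skip budget are assumptions, not the paper's actual implementation.

```python
import random

# Candidate operations in a DARTS-style search space (names are illustrative).
OPS = ["skip_connect", "sep_conv_3x3", "sep_conv_5x5", "dil_conv_3x3", "max_pool_3x3"]

def sample_subnet(num_edges, rng):
    """Sample one operation per edge of the cell, uniformly at random."""
    return [rng.choice(OPS) for _ in range(num_edges)]

def skip_controller(num_edges, max_skips, rng=None, max_tries=100):
    """Resample until the sub-network has at most `max_skips` skip-connects.

    Hypothetical sketch of a skip-controller: reject sampled architectures
    whose number of skip-connect operations exceeds the budget.
    """
    rng = rng or random.Random(0)
    for _ in range(max_tries):
        arch = sample_subnet(num_edges, rng)
        if arch.count("skip_connect") <= max_skips:
            return arch
    raise RuntimeError("could not sample a sub-network within the skip budget")

arch = skip_controller(num_edges=8, max_skips=2)
print(arch.count("skip_connect"))  # at most 2
```

In this framing the controller never modifies a sampled architecture; it only gates which samples are allowed to compete, which is one simple way to cap the number of skip‐connects per sub‐network.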
first_indexed | 2024-04-13T12:51:29Z |
format | Article |
id | doaj.art-b056490a2bc543feba47037007e819e0 |
institution | Directory Open Access Journal |
issn | 1751-9632 1751-9640 |
language | English |
last_indexed | 2024-04-13T12:51:29Z |
publishDate | 2021-08-01 |
publisher | Wiley |
record_format | Article |
series | IET Computer Vision |
spelling | doaj.art-b056490a2bc543feba47037007e819e02022-12-22T02:46:12ZengWileyIET Computer Vision1751-96321751-96402021-08-0115535636510.1049/cvi2.12036Control the number of skip‐connects to improve robustness of the NAS algorithmBao Feng Zhang0Guo Qiang Zhou1School of Computer Science Nanjing University of Posts and Telecommunications Nanjing ChinaSchool of Computer Science Nanjing University of Posts and Telecommunications Nanjing ChinaAbstract Recently, gradient‐based neural architecture search has made remarkable progress, offering high efficiency and fast convergence. However, gradient‐based NAS algorithms exhibit two common problems. First, as training time increases, the NAS algorithm tends to favour the skip‐connect operation, leading to performance degradation and unstable results. Second, computing resources are not allocated reasonably to valuable candidate network models. These two points make it difficult to search for the optimal sub‐network and lead to poor stability. To address them, the super‐net is pre‐trained so that each operation has an equal opportunity to develop its strength, providing a fair competition condition for the convergence of the architecture parameters. In addition, a skip‐controller is proposed to ensure that each sampled sub‐network has an appropriate number of skip‐connects. Experiments were performed on three mainstream datasets, CIFAR‐10, CIFAR‐100 and ImageNet, on which the improved method achieves comparable results with higher accuracy and stronger robustness.https://doi.org/10.1049/cvi2.12036gradient methodsimage classificationlearning (artificial intelligence)neural netsunsupervised learning |
spellingShingle | Bao Feng Zhang Guo Qiang Zhou Control the number of skip‐connects to improve robustness of the NAS algorithm IET Computer Vision gradient methods image classification learning (artificial intelligence) neural nets unsupervised learning |
title | Control the number of skip‐connects to improve robustness of the NAS algorithm |
title_full | Control the number of skip‐connects to improve robustness of the NAS algorithm |
title_fullStr | Control the number of skip‐connects to improve robustness of the NAS algorithm |
title_full_unstemmed | Control the number of skip‐connects to improve robustness of the NAS algorithm |
title_short | Control the number of skip‐connects to improve robustness of the NAS algorithm |
title_sort | control the number of skip connects to improve robustness of the nas algorithm |
topic | gradient methods image classification learning (artificial intelligence) neural nets unsupervised learning |
url | https://doi.org/10.1049/cvi2.12036 |
work_keys_str_mv | AT baofengzhang controlthenumberofskipconnectstoimproverobustnessofthenasalgorithm AT guoqiangzhou controlthenumberofskipconnectstoimproverobustnessofthenasalgorithm |