Training-free neural architecture search: A review
The goal of neural architecture search (NAS) is to downsize the architecture and model of a deep neural network (DNN), to adjust an architecture to improve its results, or even to speed up the whole training process. Such improvements make it possible to generate or install the mode...
| Main Authors: | Meng-Ting Wu, Chun-Wei Tsai |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2024-02-01 |
| Series: | ICT Express |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S2405959523001443 |
Similar Items
- A Feature Fusion Based Indicator for Training-Free Neural Architecture Search
  by: Linh-Tam Tran, et al.
  Published: (2021-01-01)
- Low Cost Evolutionary Neural Architecture Search (LENAS) Applied to Traffic Forecasting
  by: Daniel Klosa, et al.
  Published: (2023-07-01)
- Efficient and Lightweight Visual Tracking with Differentiable Neural Architecture Search
  by: Peng Gao, et al.
  Published: (2023-08-01)
- Neural Architecture Search for Lightweight Neural Network in Food Recognition
  by: Ren Zhang Tan, et al.
  Published: (2021-05-01)
- TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search
  by: Heechul Lim, et al.
  Published: (2022-01-01)