Performance prediction based on neural architecture features
Neural Architecture Search (NAS) usually requires training large numbers of candidate neural networks on a dataset in order to choose a high-performance network architecture and optimise hyperparameters, which is very time-consuming and computationally expensive. To resolve this issue, the authors...
Main Authors: Duo Long, Shizhou Zhang, Yanning Zhang
Format: Article
Language: English
Published: Wiley, 2020-04-01
Series: Cognitive Computation and Systems
Online Access: https://digital-library.theiet.org/content/journals/10.1049/ccs.2019.0024
Similar Items
- Intelligent fault diagnosis of rotating machinery using lightweight network with modified tree‐structured parzen estimators
  by: Jingkang Liang, et al.
  Published: (2022-09-01)
- Neural Architecture Search Benchmarks: Insights and Survey
  by: Krishna Teja Chitty-Venkata, et al.
  Published: (2023-01-01)
- GreenNAS: A Green Approach to the Hyperparameters Tuning in Deep Learning
  by: Giorgia Franchini
  Published: (2024-03-01)
- RARTS: An Efficient First-Order Relaxed Architecture Search Method
  by: Fanghui Xue, et al.
  Published: (2022-01-01)
- TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search
  by: Heechul Lim, et al.
  Published: (2022-01-01)