Summary: | Neural Architecture Search Without Training (NASWOT) has recently been proposed as a replacement for conventional Neural Architecture Search (NAS). Pioneering works deploy only one or two indicators for the search; however, the quantitative assessment of these indicators has not been fully studied. In this paper, we first review several indicators used to evaluate a network in a training-free manner, including the correlation of the Jacobian, the output sensitivity, the number of linear regions, and the condition number of the neural tangent kernel. We observe that each indicator characterizes a network in one specific aspect, and that no single indicator performs well in all cases, e.g., is highly correlated with test accuracy. This motivates us to develop a novel indicator that takes multiple properties of a network into account. To obtain an indicator that considers the various characteristics of networks in a harmonized form, we propose a Fusion Indicator (FI). Specifically, the proposed FI combines multiple indicators as a weighted sum, where the weights are obtained by minimizing the mean squared error between the predicted and actual accuracy of networks. Moreover, as prior training-free NAS studies used limited metrics to evaluate indicator quality, we introduce more suitable metrics that assess a training-free NAS indicator in terms of the fidelity, correlation, and rank-order similarity between its predicted quality values and the actual accuracy of networks: the Pearson Linear Correlation Coefficient (PLCC), the Root Mean Square Error (RMSE), the Spearman Rank-Order Correlation Coefficient (SROCC), and the Kendall Rank-Order Correlation Coefficient (KROCC). Extensive experiments on NAS-Bench-101 and NAS-Bench-201 demonstrate the effectiveness of our FI, which outperforms existing methods by a large margin.
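As a rough illustration of the weighted-sum formulation described above, the sketch below fits FI weights by ordinary least squares, which minimizes the mean squared error for a linear combination of indicator scores. The synthetic indicator scores, accuracies, and variable names are placeholder assumptions for illustration, not the paper's released implementation.

```python
import numpy as np

# Hypothetical setup: rows are candidate networks, columns are the four
# training-free indicator scores mentioned in the abstract (Jacobian
# correlation, output sensitivity, number of linear regions, NTK
# condition number). Real scores would come from evaluating each network.
rng = np.random.default_rng(0)
indicators = rng.normal(size=(500, 4))          # placeholder indicator scores
actual_acc = rng.uniform(0.6, 0.95, size=500)   # placeholder test accuracies

# FI is a weighted sum of the indicators; the weights (plus a bias term)
# are chosen to minimize the mean squared error between predicted and
# actual accuracy, which for a linear model is ordinary least squares.
X = np.hstack([indicators, np.ones((indicators.shape[0], 1))])  # bias column
w, *_ = np.linalg.lstsq(X, actual_acc, rcond=None)

fi_scores = X @ w  # Fusion Indicator value for each candidate network
```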
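The four evaluation metrics are standard statistics; a minimal sketch of computing them with scipy.stats is given below. The function name indicator_quality and its inputs are hypothetical, chosen only to connect with the sketch above.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr, kendalltau

def indicator_quality(predicted, actual):
    """Fidelity, correlation, and rank-order metrics for a NAS indicator."""
    plcc, _ = pearsonr(predicted, actual)               # linear correlation
    rmse = np.sqrt(np.mean((predicted - actual) ** 2))  # prediction error
    srocc, _ = spearmanr(predicted, actual)             # rank-order correlation
    krocc, _ = kendalltau(predicted, actual)            # pairwise rank agreement
    return plcc, rmse, srocc, krocc
```

For example, indicator_quality(fi_scores, actual_acc) scores the fitted FI from the previous sketch; higher PLCC, SROCC, and KROCC and lower RMSE indicate a better indicator.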