TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search

Bibliographic Details
Main Authors: Heechul Lim, Min-Soo Kim
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9845403/
Description
Summary: There is growing interest in automating the design of good neural network architectures. Recently proposed neural architecture search (NAS) methods have significantly reduced the architecture search cost by sharing parameters, but designing the search space remains a challenging problem. Existing operation-level architecture search methods require either a large amount of computing power or a very carefully designed search space of operations. For example, in the existing methods the types of operations in the search space (e.g., convolutions with $3\times 3$ filters) need to be carefully selected. In this paper, we investigate the possibility of achieving performance competitive with these methods while using only a small amount of computing power and without carefully designing the search space. We propose TENAS, which uses Taylor expansion and only a fixed type of operation. The resulting architecture is sparse in terms of channels and has a different topology at each cell. Experimental results on CIFAR-10 and ImageNet show that the fine-grained, sparse model searched by TENAS achieves performance very competitive with the dense models searched by existing methods.
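The abstract does not spell out how the Taylor expansion is applied, but first-order Taylor expansion is commonly used to estimate how much the training loss would change if a channel's activation were removed, which can guide channel-level sparsification. The sketch below illustrates that general idea in PyTorch; the function `taylor_channel_scores`, the toy convolution, and the stand-in loss are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch, not the paper's exact algorithm:
# score each channel by the first-order Taylor term |a * dL/da|,
# averaged over the batch and spatial positions.
import torch
import torch.nn as nn


def taylor_channel_scores(feature_map: torch.Tensor) -> torch.Tensor:
    """Approximate per-channel importance from a feature map of shape
    (N, C, H, W) whose gradient has been retained."""
    contribution = feature_map * feature_map.grad      # a * dL/da
    return contribution.abs().mean(dim=(0, 2, 3))      # one score per channel


# Toy example: score the output channels of a single 3x3 convolution.
torch.manual_seed(0)
conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
x = torch.randn(4, 3, 32, 32)

feats = conv(x)
feats.retain_grad()              # keep the gradient of this intermediate tensor
loss = feats.pow(2).mean()       # stand-in for a real training loss
loss.backward()

scores = taylor_channel_scores(feats)
print(scores)                    # low-scoring channels are candidates to skip
```

In a channel-level search, scores of this kind would be computed during training and low-importance channels replaced by skip connections or removed, yielding the kind of channel-sparse, per-cell topology the abstract describes.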
ISSN: 2169-3536