TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search

There is growing interest in automating the design of good neural network architectures. Recently proposed neural architecture search (NAS) methods have significantly reduced the architecture search cost through parameter sharing, but designing the search space remains challenging. The e...


Bibliographic Details
Main Authors: Heechul Lim, Min-Soo Kim
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9845403/