TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search
There is growing interest in automating the design of good neural network architectures. Recently proposed neural architecture search (NAS) methods have significantly reduced the architecture search cost by sharing parameters, but designing the search space remains a challenging problem...
Main Authors: | Heechul Lim, Min-Soo Kim |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2022-01-01 |
Series: | IEEE Access |
Subjects: | Neural architecture search; convolutional neural network; deep learning |
Online Access: | https://ieeexplore.ieee.org/document/9845403/ |
_version_ | 1818486442251255808 |
---|---|
author | Heechul Lim; Min-Soo Kim |
author_facet | Heechul Lim; Min-Soo Kim |
author_sort | Heechul Lim |
collection | DOAJ |
description | There is growing interest in automating the design of good neural network architectures. Recently proposed neural architecture search (NAS) methods have significantly reduced the architecture search cost by sharing parameters, but designing the search space remains a challenging problem. The existing operation-level architecture search methods require either a large amount of computing power or a very carefully designed search space of operations. For example, the types of operations in the search space (e.g., convolutions with $3\times 3$ filters) need to be selected carefully in the existing methods. In this paper, we investigate whether competitive performance can be achieved using only a small amount of computing power and without carefully designing the search space. We propose TENAS, which uses Taylor expansion and only a fixed type of operation. The resulting architecture is sparse in terms of channels and has a different topology in each cell. The experimental results on CIFAR-10 and ImageNet show that the fine-granular, sparse models searched by TENAS achieve performance that is very competitive with the dense models searched by existing methods. |
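The abstract above is the only technical description in this record: TENAS scores architectures at the channel level using a Taylor expansion while keeping a single fixed operation type. The paper's exact criterion is not given here, so the sketch below is only a minimal illustration of the general idea, using the common first-order Taylor channel-importance estimate from the pruning literature (|activation x gradient| per channel). The convolution, the placeholder loss, and the number of channels kept are illustrative assumptions, not TENAS's actual design.

```python
# Minimal sketch (assumption-laden): first-order Taylor channel importance,
# as commonly used in channel pruning. Not the authors' implementation.
import torch
import torch.nn as nn

def taylor_channel_importance(activation: torch.Tensor,
                              gradient: torch.Tensor) -> torch.Tensor:
    # First-order Taylor term: |a * dL/da|, summed over batch and spatial
    # dimensions, giving one importance score per channel (shape: [C]).
    return (activation * gradient).abs().sum(dim=(0, 2, 3))

# Score the output channels of a single fixed 3x3 convolution on a mini-batch.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
x = torch.randn(8, 3, 32, 32)

out = conv(x)
out.retain_grad()              # keep the gradient of this non-leaf activation
loss = out.pow(2).mean()       # placeholder loss, purely for illustration
loss.backward()

scores = taylor_channel_importance(out, out.grad)   # shape: (16,)
keep = scores.topk(k=8).indices                      # e.g. keep 8 channels
print(keep)
```

In a channel-level search of this kind, low-scoring channels would be dropped (or bypassed via skip connections) while high-scoring ones are kept, so the surviving sparse topology can differ from cell to cell, which matches the abstract's description of the resulting architecture.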
first_indexed | 2024-12-10T16:23:04Z |
format | Article |
id | doaj.art-b64dcdf29779403bafd5c5b508869914 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-10T16:23:04Z |
publishDate | 2022-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-b64dcdf29779403bafd5c5b508869914 (indexed 2022-12-22T01:41:46Z). TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search. Heechul Lim (https://orcid.org/0000-0002-3281-3191), Department of Information and Communication Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu, South Korea; Min-Soo Kim (https://orcid.org/0000-0002-5065-0226), School of Computing, Korea Advanced Institute of Science and Technology, Daejeon, South Korea. IEEE Access (ISSN 2169-3536), IEEE, vol. 10, pp. 84790-84798, 2022-01-01. DOI: 10.1109/ACCESS.2022.3195208; IEEE document 9845403. Language: English. Online access: https://ieeexplore.ieee.org/document/9845403/. Keywords: Neural architecture search; convolutional neural network; deep learning. |
spellingShingle | Heechul Lim; Min-Soo Kim; TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search; IEEE Access; Neural architecture search; convolutional neural network; deep learning |
title | TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search |
title_full | TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search |
title_fullStr | TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search |
title_full_unstemmed | TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search |
title_short | TENAS: Using Taylor Expansion and Channel-Level Skip Connection for Neural Architecture Search |
title_sort | tenas using taylor expansion and channel level skip connection for neural architecture search |
topic | Neural architecture search; convolutional neural network; deep learning |
url | https://ieeexplore.ieee.org/document/9845403/ |
work_keys_str_mv | AT heechullim tenasusingtaylorexpansionandchannellevelskipconnectionforneuralarchitecturesearch AT minsookim tenasusingtaylorexpansionandchannellevelskipconnectionforneuralarchitecturesearch |