GhostNeXt: Rethinking Module Configurations for Efficient Model Design
Despite the continuous development of convolutional neural networks, it remains challenging for a lightweight model to improve performance with fewer parameters and floating-point operations (FLOPs). In particular, excessive expressive power in a single module is a major driver of the entire network's computational cost. We argue that the network as a whole should be optimized by optimizing its individual modules or blocks. We therefore propose GhostNeXt, a promising alternative to GhostNet, which adjusts the module configuration inside the Ghost block. We introduce a controller that dynamically selects the channel operations of each module; it is a plug-and-play component and more flexible than the existing approach. Experiments on several classification tasks demonstrate that the proposed method is a better alternative to the convolution layers of baseline models: GhostNeXt achieves recognition performance competitive with GhostNet and other popular models while reducing computational cost on the benchmark datasets.
Main Authors: | Kiseong Hong, Gyeong-hyeon Kim, Eunwoo Kim |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-03-01 |
Series: | Applied Sciences |
Subjects: | module configuration; resource-efficient network; network design |
Online Access: | https://www.mdpi.com/2076-3417/13/5/3301 |
author | Kiseong Hong; Gyeong-hyeon Kim; Eunwoo Kim |
collection | DOAJ |
description | Despite the continuous development of convolutional neural networks, it remains challenging for a lightweight model to improve performance with fewer parameters and floating-point operations (FLOPs). In particular, excessive expressive power in a single module is a major driver of the entire network's computational cost. We argue that the network as a whole should be optimized by optimizing its individual modules or blocks. We therefore propose GhostNeXt, a promising alternative to GhostNet, which adjusts the module configuration inside the Ghost block. We introduce a controller that dynamically selects the channel operations of each module; it is a plug-and-play component and more flexible than the existing approach. Experiments on several classification tasks demonstrate that the proposed method is a better alternative to the convolution layers of baseline models: GhostNeXt achieves recognition performance competitive with GhostNet and other popular models while reducing computational cost on the benchmark datasets. |
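The abstract's parameter/FLOP argument rests on the Ghost-block idea that GhostNeXt reconfigures: in the original GhostNet design, a dense convolution is replaced by a slim "primary" convolution producing a fraction 1/s of the output channels, plus cheap depthwise operations that generate the remaining "ghost" channels. The sketch below is a minimal cost model of that standard Ghost-module accounting, not the paper's actual GhostNeXt configuration or controller; the function names and the ratio/kernel defaults (`s=2`, `d=3`) are illustrative assumptions.

```python
# Illustrative cost model for a Ghost-style block (hypothetical helpers).
# A Ghost module replaces a dense k x k convolution with a slim primary
# convolution (c_out // s channels) plus cheap d x d depthwise operations
# that produce the remaining "ghost" channels.

def conv_params(c_in, c_out, k):
    """Weight count of a standard k x k convolution (bias ignored)."""
    return c_in * c_out * k * k

def ghost_params(c_in, c_out, k, s=2, d=3):
    """Weight count of a Ghost module with ratio s and cheap-op kernel d."""
    primary = c_in * (c_out // s) * k * k   # dense primary convolution
    cheap = (c_out // s) * (s - 1) * d * d  # depthwise "ghost" operations
    return primary + cheap

if __name__ == "__main__":
    dense = conv_params(64, 128, 3)   # 73,728 weights
    ghost = ghost_params(64, 128, 3)  # 37,440 weights
    print(dense, ghost, round(dense / ghost, 2))  # roughly s = 2x reduction
```

With `s = 2` the module uses about half the weights of the dense convolution (the cheap depthwise term is negligible), which is the kind of per-module saving the abstract argues should drive whole-network optimization.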
format | Article |
id | doaj.art-b02ba293372d40ceb62e585d101aed04 |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
publishDate | 2023-03-01 |
publisher | MDPI AG |
series | Applied Sciences |
doi | 10.3390/app13053301 |
affiliations | Kiseong Hong: Department of Artificial Intelligence, Chung-Ang University, Seoul 06974, Republic of Korea; Gyeong-hyeon Kim: School of Computer Science and Engineering, Chung-Ang University, Seoul 06974, Republic of Korea; Eunwoo Kim: Department of Artificial Intelligence, Chung-Ang University, Seoul 06974, Republic of Korea |
title | GhostNeXt: Rethinking Module Configurations for Efficient Model Design |
topic | module configuration; resource-efficient network; network design |
url | https://www.mdpi.com/2076-3417/13/5/3301 |