GhostNeXt: Rethinking Module Configurations for Efficient Model Design
Despite the continuous development of convolutional neural networks, it remains a challenge for a lightweight model to improve performance with fewer parameters and floating-point operations (FLOPs). In particular, excessive expressive power in a module is a crucial cause of skyrocketing...
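The abstract is truncated above, but the title points to a GhostNet-style efficient module design. As background only, a minimal PyTorch sketch of the standard Ghost module (Han et al., GhostNet, 2020), which GhostNeXt's name alludes to, is shown below; the class name, channel split, and hyperparameters are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class GhostModule(nn.Module):
    """Ghost-style module sketch: a few 'expensive' feature maps from a
    regular convolution, plus cheap depthwise 'ghost' copies of them,
    concatenated to reach the full output width."""
    def __init__(self, in_ch, out_ch, ratio=2, kernel_size=1, dw_size=3):
        super().__init__()
        primary_ch = out_ch // ratio       # expensive maps
        ghost_ch = out_ch - primary_ch     # cheap maps (ratio=2 keeps counts compatible)
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, primary_ch, kernel_size,
                      padding=kernel_size // 2, bias=False),
            nn.BatchNorm2d(primary_ch),
            nn.ReLU(inplace=True),
        )
        # Depthwise convolution: one cheap filter per primary feature map.
        self.cheap = nn.Sequential(
            nn.Conv2d(primary_ch, ghost_ch, dw_size,
                      padding=dw_size // 2, groups=primary_ch, bias=False),
            nn.BatchNorm2d(ghost_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)

# Usage: 16 input channels expanded to 64 output channels.
x = torch.randn(1, 16, 32, 32)
print(GhostModule(16, 64)(x).shape)  # torch.Size([1, 64, 32, 32])
```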
Main Authors: Kiseong Hong, Gyeong-hyeon Kim, Eunwoo Kim
Format: Article
Language: English
Published: MDPI AG, 2023-03-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/13/5/3301
Similar Items
- A Resources-Efficient Configurable Accelerator for Deep Convolutional Neural Networks
  by: Xianghong Hu, et al.
  Published: (2019-01-01)
- Network security situation assessment based on dual attention mechanism and HHO-ResNeXt
  by: Dongmei Zhao, et al.
  Published: (2023-12-01)
- Auto-Configuration in Wireless Sensor Networks: A Review
  by: Ngoc-Thanh Dinh, et al.
  Published: (2019-10-01)
- Reducing Complexity of Server Configuration through Public Cloud Storage
  by: Sihyung Lee
  Published: (2021-05-01)
- Resource-Efficient Multi-Task Deep Learning Using a Multi-Path Network
  by: Soyeon Park, et al.
  Published: (2022-01-01)