VLDNet: An Ultra-Lightweight Crop Disease Identification Network

Detailed Bibliography
Main authors: Xiaopeng Li, Yichi Zhang, Yuhan Peng, Shuqin Li
Format: Article
Language: English
Published: MDPI AG, 2023-07-01
Journal: Agriculture
Subjects:
Online access: https://www.mdpi.com/2077-0472/13/8/1482
Description
Summary: Existing deep learning methods usually adopt deeper and wider network structures to achieve better performance. However, we found that this rule does not transfer well to crop disease identification, which led us to rethink the design paradigm of disease identification models. Crop diseases present fine-grained features and lack obvious patterns; deeper and wider network structures cause feature information loss, which hurts identification performance. Motivated by this, this paper designs a very lightweight disease identification network called VLDNet. The basic module of VLDNet, VLDBlock, extracts intrinsic features through 1 × 1 convolution and uses cheap linear operations to supplement redundant features, improving feature extraction efficiency. At inference time, reparameterization is used to further reduce model size and improve inference speed. VLDNet achieves state-of-the-art (SOTA) latency-accuracy trade-offs on self-built and public datasets: for example, it matches the performance of Swin-Tiny with a parameter size of 0.097 MB and 0.04 G floating-point operations (FLOPs), reducing parameter size and FLOPs by 297× and 111×, respectively. In actual testing, VLDNet recognizes 221 images per second, far outperforming models of similar accuracy. This work is expected to further promote the application of deep learning-based crop disease identification methods in practical production.
ISSN:2077-0472
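The two ideas the abstract names — generating part of the feature maps with cheap linear operations instead of full convolutions, and folding parallel branches into one kernel at inference time (reparameterization) — can be illustrated with a toy NumPy sketch. All shapes, weights, and operations below are illustrative assumptions, not the published VLDBlock:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "feature map": 8 input channels over a 4x4 grid, flattened to (C_in, H*W).
x = rng.standard_normal((8, 16))

# --- Cheap-operation feature generation (hypothetical sketch) ---
# A 1x1 convolution over channels is just a matrix multiply; it produces
# a small set of "intrinsic" feature maps.
W_primary = rng.standard_normal((4, 8))   # 4 intrinsic channels from 8 inputs
intrinsic = W_primary @ x                 # shape (4, 16)

# "Cheap linear operations" then derive extra (redundant) features from the
# intrinsic ones; here modeled as a per-channel scale.
scales = rng.standard_normal((4, 1))
cheap = scales * intrinsic                # shape (4, 16), far fewer multiply-adds

out = np.concatenate([intrinsic, cheap])  # 8 output channels total,
                                          # at roughly half the 1x1-conv cost

# --- Structural reparameterization (hypothetical sketch) ---
# Two parallel linear branches used at training time...
W_a = rng.standard_normal((8, 8))
W_b = rng.standard_normal((8, 8))
y_train = W_a @ x + W_b @ x

# ...can be folded into a single kernel for inference, since
# W_a @ x + W_b @ x == (W_a + W_b) @ x.
W_merged = W_a + W_b
y_infer = W_merged @ x
assert np.allclose(y_train, y_infer)
```

The sketch shows why both tricks cut cost without changing the computed function class: the cheap branch replaces dense channel mixing with a per-channel operation, and the merged kernel removes one of the two branch multiplications at inference.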