Detection of Commodities Based on Multi-Feature Fusion and Attention Screening by Entropy Function Guidance


Bibliographic Details
Main Authors: An Xie, Kai Xie, Hao-Nan Dong, Jian-Biao He
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Online Access:https://ieeexplore.ieee.org/document/10225039/
Description
Summary: Although traditional convolutional neural networks (CNNs) have improved significantly at target detection, they cannot be applied directly to occluded objects in commodity detection. We therefore propose a target detection method based on an improved YOLOv5 model and an improved attention mechanism to address the commodity occlusion problem. The method improves the traditional YOLO deep convolutional network with a more detailed BiFPN layer that performs lightweight bidirectional feature fusion, convolving and fusing the multidimensional features of the commodities and thereby improving the overall detection speed and accuracy of the YOLO-R algorithm. Feature entropy is introduced into the attention channel to constrain the threshold value and obtain global information about the occluded target; this global information is fused with the bidirectional feature pyramid layer to enhance the robustness of the features. The method detects occluded commodities quickly and accurately, with greatly improved detection accuracy. Experiments show that the improved YOLO-R model increases both the accuracy and the speed of commodity detection and achieves good results in objective evaluation: average detection accuracy on a self-made commodity dataset reaches 97.80%, at a detection rate of 22.72 frames/s.
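The entropy-guided attention screening described in the summary can be sketched roughly as follows. This is an illustrative reading only, not the authors' implementation: the function name, the softmax channel weighting, and the rule of suppressing reweighting when the attention distribution's entropy is near its maximum (i.e. near-uniform, carrying little discriminative information) are all our assumptions.

```python
import numpy as np

def entropy_guided_attention(features, threshold=0.9):
    """Illustrative sketch of entropy-guided channel attention (assumed form).

    features: array of shape (C, H, W).
    If the entropy of the channel-attention distribution exceeds a
    fraction `threshold` of the maximum possible entropy, the attention
    is near-uniform and uninformative, so the features pass through
    unweighted; otherwise each channel is scaled by its attention weight.
    """
    C, H, W = features.shape
    # Squeeze: global average pooling per channel -> vector of length C
    pooled = features.mean(axis=(1, 2))
    # Channel attention weights via a numerically stable softmax
    exp = np.exp(pooled - pooled.max())
    weights = exp / exp.sum()
    # Shannon entropy of the channel-weight distribution
    entropy = -np.sum(weights * np.log(weights + 1e-12))
    max_entropy = np.log(C)  # entropy of a uniform distribution over C channels
    if entropy > threshold * max_entropy:
        # Near-uniform attention: keep the original features
        return features
    # Excite: reweight each channel by its attention weight
    return features * weights[:, None, None]
```

Under this reading, the entropy acts as the screening criterion: a peaked attention distribution (low entropy) indicates a few informative channels worth emphasizing, while a flat one (high entropy) is left untouched before fusion with the BiFPN features.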
ISSN:2169-3536