Study of a QueryPNet Model for Accurate Detection and Segmentation of Goose Body Edge Contours

Bibliographic Details
Main Authors: Jiao Li, Houcheng Su, Xingze Zheng, Yixin Liu, Ruoran Zhou, Linghui Xu, Qinli Liu, Daixian Liu, Zhiling Wang, Xuliang Duan
Format: Article
Language: English
Published: MDPI AG 2022-10-01
Series: Animals
Online Access: https://www.mdpi.com/2076-2615/12/19/2653
Summary: With the rapid development of computer vision, applying it to precision farming in animal husbandry has become a hot research topic. As the scale of goose breeding continues to expand, the efficiency demands on goose farming rise accordingly. To achieve precision animal husbandry and to avoid human interference with the flock, real-time automated monitoring methods have been adopted in this area. Specifically, on the basis of instance segmentation, the activities of individual geese are accurately detected, counted, and analyzed, which helps trace the condition of the flock and reduce breeding costs. We trained QueryPNet, an advanced model that can effectively segment and extract individual geese from a flock. We also proposed a novel neck module that improves the feature pyramid structure, making feature fusion more effective for both object detection and instance segmentation, while a rational design reduces the number of model parameters. This solution was tested on a dataset of 639 images collected and labeled on specially created free-range goose farms. Despite occlusion by vegetation and litter, the accuracies of object detection and instance segmentation reached 0.963 (mAP@0.5) and 0.963 (mAP@0.5), respectively.
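The abstract describes a neck module that improves on the standard feature pyramid while cutting parameters, but gives no implementation details. Below is a minimal PyTorch sketch of a generic FPN-style neck for context only: the class name LightweightFPN, the channel widths, and the use of depthwise-separable convolutions as the parameter-saving device are all assumptions for illustration, not the published QueryPNet design.

```python
# A minimal FPN-style neck sketch, assuming PyTorch. Hypothetical names
# and channel sizes; NOT the paper's actual QueryPNet architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LightweightFPN(nn.Module):
    """Top-down feature pyramid fusing backbone levels C3-C5.

    Depthwise-separable 3x3 convolutions stand in for the abstract's
    parameter-reduction idea; this is an assumption for illustration.
    """
    def __init__(self, in_channels=(512, 1024, 2048), out_channels=256):
        super().__init__()
        # 1x1 lateral convs project each backbone level to a common width.
        self.lateral = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels
        )
        # Depthwise + pointwise pair smooths each fused map cheaply.
        self.smooth = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(out_channels, out_channels, 3, padding=1,
                          groups=out_channels),      # depthwise
                nn.Conv2d(out_channels, out_channels, 1),  # pointwise
            )
            for _ in in_channels
        )

    def forward(self, feats):
        # feats: [C3, C4, C5], ordered from high to low resolution.
        laterals = [lat(f) for lat, f in zip(self.lateral, feats)]
        # Top-down pathway: upsample each coarser map and add it to the
        # next finer level before smoothing.
        for i in range(len(laterals) - 1, 0, -1):
            laterals[i - 1] = laterals[i - 1] + F.interpolate(
                laterals[i], size=laterals[i - 1].shape[-2:], mode="nearest"
            )
        return [s(x) for s, x in zip(self.smooth, laterals)]

if __name__ == "__main__":
    neck = LightweightFPN()
    c3 = torch.randn(1, 512, 64, 64)
    c4 = torch.randn(1, 1024, 32, 32)
    c5 = torch.randn(1, 2048, 16, 16)
    p3, p4, p5 = neck([c3, c4, c5])
    print(p3.shape, p4.shape, p5.shape)  # all 256-channel pyramid maps
```

Replacing a dense 3x3 convolution with a depthwise-pointwise pair reduces that layer's parameter count by roughly an order of magnitude, which is one common way a "rational design" like the one the abstract mentions can shrink a neck without changing its fusion behavior.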
ISSN: 2076-2615