Field Patch Extraction Based on High-Resolution Imaging and U²-Net++ Convolutional Neural Networks


Bibliographic Details
Main Authors: Chen Long, Song Wenlong, Sun Tao, Lu Yizhu, Jiang Wei, Liu Jun, Liu Hongjie, Feng Tianshi, Gui Rongjie, Haider Abbas, Meng Lingwei, Lin Shengjie, He Qian
Format: Article
Language: English
Published: MDPI AG 2023-10-01
Series: Remote Sensing
Subjects:
Online Access: https://www.mdpi.com/2072-4292/15/20/4900
Description
Summary: Accurate extraction of farmland boundaries is crucial for improving the efficiency of farmland surveys, achieving precise agricultural management, enhancing farmers’ production conditions, protecting the ecological environment, and promoting local economic development. Remote sensing and deep learning are feasible methods for creating large-scale farmland boundary maps. However, existing neural network models have limitations that restrict the accuracy and reliability of agricultural parcel extraction from remote sensing imagery. In this study, we used high-resolution satellite images (2 m, 1 m, and 0.8 m) and a U²-Net++ model built on the RSU module, depthwise separable convolution, and a channel-spatial attention module to extract different types of fields. Our model showed significant improvements in farmland parcel extraction compared with the other models. It achieved an F1-score of 97.13%, a 7.36% to 17.63% improvement over older models such as U-Net and FCN and a more than 2% improvement over advanced models such as DeepLabv3+ and U²-Net. These results indicate that U²-Net++ holds potential for widespread application in the production of large-scale farmland boundary maps.
ISSN: 2072-4292
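
Note: the abstract names three building blocks of the modified network: the RSU module, depthwise separable convolution, and a channel-spatial attention module. The sketch below is a minimal, illustrative PyTorch rendering of the latter two, assuming a CBAM-style attention design (channel attention followed by spatial attention). Layer widths, kernel sizes, and the reduction ratio are assumptions for demonstration only, not the authors' implementation, and the RSU module itself is omitted.

# Illustrative sketch only: a depthwise separable convolution block combined
# with an assumed CBAM-style channel-spatial attention module. Not the
# authors' implementation; all hyperparameters are placeholders.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """3x3 depthwise convolution followed by a 1x1 pointwise convolution."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1,
                                   groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


class ChannelSpatialAttention(nn.Module):
    """Channel attention followed by spatial attention (CBAM-style)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel attention: squeeze spatial dims with avg- and max-pooling.
        avg = self.channel_mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.channel_mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention: pool across channels, then a 7x7 convolution.
        avg_map = torch.mean(x, dim=1, keepdim=True)
        max_map = torch.amax(x, dim=1, keepdim=True)
        attn = torch.sigmoid(self.spatial_conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn


if __name__ == "__main__":
    block = nn.Sequential(DepthwiseSeparableConv(3, 32), ChannelSpatialAttention(32))
    print(block(torch.randn(1, 3, 256, 256)).shape)  # torch.Size([1, 32, 256, 256])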