Fusion of Multispectral and Radar Images to Enhance Classification Accuracy and Estimate the Area under Various Crops Cultivation


Bibliographic Details
Main Authors: M. Saadikhani, M. Maharlooei, M. A. Rostami, M. Edalat
Format: Article
Language: English
Published: Ferdowsi University of Mashhad, 2023-12-01
Series: Journal of Agricultural Machinery
Subjects: confusion matrix; normalized difference vegetation index (NDVI); radar image; Sentinel 2 satellite; support vector machine
Online Access: https://jame.um.ac.ir/article_43118_a57b7aaba03d768dcf49e1713c9ff891.pdf
collection DOAJ
description
Introduction
Remote sensing is defined as the acquisition of data about an object or a phenomenon at a geographic location without physical contact. The use of remote sensing data is expanding rapidly, and researchers have long been interested in accurately classifying land-cover phenomena using multispectral images. One factor that reduces the accuracy of a classification map is the presence of uneven surfaces and high-altitude areas: high-altitude points make it difficult for the sensors to obtain accurate reflectance information from the surface of the phenomena. Radar imagery, used together with a digital elevation model (DEM), is effective for identifying and delineating elevated terrain. Image fusion is a technique that combines two sensors with completely different specifications and takes advantage of both sensors' capabilities. In this study, the feasibility of employing the fusion technique to improve the overall accuracy of land-cover classification was investigated using time-series NDVI images from Sentinel-2 and PALSAR radar imagery from the ALOS satellite. Additionally, the predicted and measured areas of fields under cultivation of wheat, barley, and canola were compared.

Materials and Methods
Thirteen Sentinel-2 multispectral satellite images with 10-meter spatial resolution, covering the Bajgah region in Fars province, Iran, from November 2018 to June 2019, were downloaded at the Level-1C processing level to classify cultivated lands and other phenomena. Ground-truth data were collected through several field visits, using a handheld GPS to pinpoint the different phenomena in the study region. The seven land-cover classes were (1) wheat, (2) barley, (3) canola, (4) trees, (5) residential regions, (6) soil, and (7) others.
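The NDVI referred to throughout is a simple band ratio; a minimal numpy sketch of the per-date computation, assuming atmospherically corrected red and near-infrared reflectance arrays (Sentinel-2 bands 4 and 8) as inputs:

```python
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index for one acquisition date.

    red, nir: reflectance arrays of identical shape (e.g. Sentinel-2
    bands 4 and 8 after radiometric/atmospheric correction).
    eps guards against division by zero over water or no-data pixels.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 scene: dense vegetation reflects strongly in NIR.
red = np.array([[0.05, 0.10], [0.20, 0.30]])
nir = np.array([[0.45, 0.40], [0.25, 0.30]])
print(ndvi(red, nir))
```

In the study this computation was performed per image in ENVI 5.3; the sketch only illustrates the index itself, not the study's processing chain.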
After preprocessing operations such as radiometric and atmospheric corrections, performed in ENVI 5.3 using predefined built-in algorithms recommended by other researchers, and cropping the region of interest (ROI) from the original image, the Normalized Difference Vegetation Index (NDVI) was calculated for each image. The DEM was derived from the ALOS satellite's PALSAR radar imagery with 12.5-meter spatial resolution. After preprocessing and cropping the ROI, a binary mask of the radar images was created in ENVI 5.3 using elevation thresholds of 1764 to 1799 meters above sea level. The NDVI time series was then composed of all 13 images and integrated with the radar images using pixel-level fusion. The purpose of this step was to remove the high-altitude points in the study area that would otherwise reduce the accuracy of the classification map; the fusion itself was also performed in ENVI 5.3. The Support Vector Machine (SVM) method was employed to train the classifier for both fused and unfused images, as suggested by other researchers. To evaluate the effectiveness of image fusion, commission and omission errors and overall accuracy were calculated from a confusion matrix. To assess the accuracy of the estimated areas under cultivation of the main crops against the actual measured areas, a regression equation and the percentage difference were calculated.

Results and Discussion
Visual inspection of the classified output maps shows the difference between the fused and unfused images in separating similar classes, such as buildings and structures versus regions covered with bare soil, and lands under cultivation versus natural vegetation at high-altitude points. Statistical metrics verified these visual evaluations.
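The elevation masking and pixel-level integration described in the methods can be sketched as follows, assuming the PALSAR-derived DEM has already been resampled to the 10 m Sentinel-2 grid (the study performed these steps in ENVI 5.3; array names and the NaN fill value are illustrative assumptions):

```python
import numpy as np

def elevation_mask(dem, lo=1764.0, hi=1799.0):
    """Binary mask from a DEM: 1 where elevation lies inside [lo, hi],
    0 elsewhere (the high-altitude points to be discarded)."""
    return ((dem >= lo) & (dem <= hi)).astype(np.uint8)

def fuse(ndvi_stack, mask, fill=np.nan):
    """Pixel-level fusion: apply the radar-derived mask to every
    date in the (dates, rows, cols) NDVI stack."""
    out = ndvi_stack.astype(np.float64).copy()
    out[:, mask == 0] = fill
    return out

dem = np.array([[1770.0, 1850.0],
                [1790.0, 1760.0]])
ndvi_stack = np.ones((13, 2, 2)) * 0.5   # toy 13-date series
mask = elevation_mask(dem)
fused = fuse(ndvi_stack, mask)
print(mask)   # [[1 0]
              #  [1 0]]  (1850 m above hi, 1760 m below lo)
```

Masked pixels carry no valid NDVI in any date, so they drop out of the classification rather than being confused with croplands or buildings.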
The SVM algorithm in fusion mode achieved 98.06% overall accuracy and a kappa coefficient of 0.97, a 7.5% higher accuracy than with the unfused images. As stated earlier, the similarity between the soil class (stones and rocks in the mountains) and man-made buildings and infrastructure increases the omission error and misclassification in unfused-image classification. The same misclassification occurred for the visually similar croplands and sparse natural vegetation at high-altitude points. These results are consistent with previous literature reporting the same misclassification of analogous classes. The predicted areas under cultivation of wheat and barley were overestimated by 3 and 1.5 percent, respectively, while the area of canola was underestimated by 3.5 percent.

Conclusion
The main focus of this study was employing the image fusion technique to improve the classification accuracy of satellite imagery. Integrating PALSAR radar data from the ALOS satellite with Sentinel-2 multispectral imagery enhanced the classification accuracy of the output maps by eliminating high-altitude points and the biases due to rocks and natural vegetation on hills and mountains. Statistical metrics such as overall accuracy, kappa coefficient, and commission and omission errors confirmed the visual findings from the fused vs. unfused classification maps.
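All of the accuracy figures above derive from the confusion matrix; a minimal sketch of overall accuracy, kappa, and per-class omission and commission errors (the matrix shown is a toy two-class example, not the study's data):

```python
import numpy as np

def accuracy_metrics(cm):
    """cm[i, j]: pixels of true class i assigned to class j.
    Returns overall accuracy, kappa coefficient, and per-class
    omission and commission error rates."""
    cm = cm.astype(np.float64)
    total = cm.sum()
    diag = np.diag(cm)
    overall = diag.sum() / total
    # Kappa: agreement beyond chance, from row/column marginals.
    expected = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total**2
    kappa = (overall - expected) / (1.0 - expected)
    omission = 1.0 - diag / cm.sum(axis=1)    # missed from true class
    commission = 1.0 - diag / cm.sum(axis=0)  # wrongly assigned to class
    return overall, kappa, omission, commission

cm = np.array([[90, 10],
               [ 5, 95]])
oa, kappa, om, com = accuracy_metrics(cm)
print(round(oa, 3), round(kappa, 3))
```

For this toy matrix the overall accuracy is 0.925 and kappa 0.85; the study reports the same metrics for its seven-class fused and unfused maps.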
id doaj.art-f7fb8e8dfc8a43ec96dcb9747251c404
issn 2228-6829
2423-3943
doi 10.22067/jam.2022.78446.1123
citation Journal of Agricultural Machinery, Vol. 13, No. 4 (2023-12-01), pp. 493-508
affiliations M. Saadikhani: MSc Student in Biosystems Mechanical Engineering, Department of Biosystems Engineering, Shahid Bahonar University of Kerman, Kerman, Iran
M. Maharlooei: Department of Biosystems Engineering, Shahid Bahonar University of Kerman, Kerman, Iran
M. A. Rostami: Agricultural Engineering Research Department, Fars Agricultural and Resource Research and Education Center, AREEO, Shiraz, Iran
M. Edalat: Department of Agronomy and Plant Breeding, Shiraz University, Shiraz, Iran
topic confusion matrix
normalized difference vegetation index (ndvi)
radar image
sentinel 2 satellite
support vector machine