Land use/land cover change detection combining automatic processing and visual interpretation

Bibliographic Details
Main Authors: Jean-François Mas, Richard Lemoine-Rodríguez, Rafael González-López, Jairo López-Sánchez, Andrés Piña-Garduño, Evelyn Herrera-Flores
Format: Article
Language: English
Published: Taylor & Francis Group 2017-01-01
Series: European Journal of Remote Sensing
Online Access: http://dx.doi.org/10.1080/22797254.2017.1387505
Description
Summary: This article presents a hybrid classification method combining image segmentation, GIS analysis, and visual interpretation, and its application to build a multi-date cartographic database with 23 land use/cover (LUC) classes using SPOT 5 imagery for the Mexican state of Michoacán (~60,000 km²). First, the resolution of an existing 1:100,000 LUC map, produced through visual interpretation of 2007 SPOT images, was improved. The 2007 SPOT images were segmented, and each segment received the majority LUC category from the 1:100,000 map. Segments were characterized from the images (spectral indices) and from the map (LUC class). A multivariate trimming was applied to detect “uncertain” segments whose spectral response was inconsistent with the LUC class assigned from the map. For these uncertain segments, a candidate category was determined by digital classification, and a definitive category was assigned through visual interpretation. Finally, the accuracy of the resulting LUC map was assessed. The same procedure was applied to backdate (2004) and to update (2014) the map. The implemented method enabled the authors to improve the scale of the existing 2007 LUC map and to detect land use/cover changes at an earlier (backdating) and a later (updating) date with an overall accuracy of 83.3% ± 3.1%.
ISSN: 2279-7254
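
Note: The multivariate trimming step described in the summary can be illustrated with a short Python sketch. This is a hypothetical, single-pass rendering rather than the authors' implementation: the function name, the non-iterative trimming, and the 0.975 chi-square cutoff are all illustrative assumptions. It presumes per-segment spectral indices have already been computed, and it flags as "uncertain" any segment whose squared Mahalanobis distance from its map-assigned class centroid exceeds the cutoff.

import numpy as np
from scipy.stats import chi2

def flag_uncertain_segments(features, luc_class, alpha=0.975):
    """Flag segments whose spectral indices disagree with the LUC class
    assigned from the 1:100,000 map.

    features  : (n_segments, n_indices) array of per-segment spectral indices
    luc_class : (n_segments,) array of LUC labels taken from the map
    Returns a boolean array, True where the segment is "uncertain".
    """
    uncertain = np.zeros(len(luc_class), dtype=bool)
    cutoff = chi2.ppf(alpha, df=features.shape[1])   # squared-distance threshold
    for cls in np.unique(luc_class):
        idx = np.where(luc_class == cls)[0]
        x = features[idx]
        if len(idx) <= features.shape[1] + 1:        # too few segments to model
            continue
        mu = x.mean(axis=0)
        cov = np.atleast_2d(np.cov(x, rowvar=False))
        inv = np.linalg.pinv(cov)                    # tolerant of singular covariance
        d = x - mu
        d2 = np.einsum('ij,jk,ik->i', d, inv, d)     # squared Mahalanobis distances
        uncertain[idx] = d2 > cutoff                 # spectrally atypical for the class
    return uncertain

In the workflow the summary describes, segments flagged this way would then receive a candidate label from digital classification and a definitive one from visual interpretation.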