NASA NeMO-Net's Convolutional Neural Network: Mapping Marine Habitats with Spectrally Heterogeneous Remote Sensing Imagery
Recent advances in machine learning and computer vision have enabled increased automation in benthic habitat mapping through airborne and satellite remote sensing. Here, we applied deep learning and neural network architectures in NASA NeMO-Net, a novel neural multimodal observation and training network for global habitat mapping of shallow benthic tropical marine systems. These ecosystems, particularly coral reefs, are undergoing rapid changes as a result of increasing ocean temperatures, acidification, and pollution, among other stressors. Remote sensing from air and space has been the primary method by which changes are assessed within these important, often remote, ecosystems at a global scale. However, such global datasets often suffer from large spectral variances due to the time of observation, atmospheric effects, water column properties, and heterogeneous instruments and calibrations. To address these challenges, we developed an object-based fully convolutional network (FCN) to improve upon the spatial-spectral classification problem inherent in multimodal datasets. We showed that with training upon augmented data in conjunction with classical methods, such as K-nearest neighbors, we were able to achieve better overall classification and segmentation results. This suggests FCNs are able to effectively identify the relative applicable spectral and spatial spaces within an image, whereas pixel-based classical methods excel at classification within those identified spaces. Our spectrally invariant results, based on minimally preprocessed WorldView-2 and Planet satellite imagery, show a total accuracy of approximately 85% and 80%, respectively, over nine classes when trained and tested upon a chain of Fijian islands imaged under highly variable day-to-day spectral inputs.
Main Authors: | Alan S. Li, Ved Chirayath, Michal Segal-Rozenhaimer, Juan L. Torres-Perez, Jarrett van den Bergh |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing |
Subjects: | Convolutional neural network (CNN); deep learning; image segmentation; multispectral imaging |
Online Access: | https://ieeexplore.ieee.org/document/9174766/ |
author | Alan S. Li Ved Chirayath Michal Segal-Rozenhaimer Juan L. Torres-Perez Jarrett van den Bergh |
collection | DOAJ |
description | Recent advances in machine learning and computer vision have enabled increased automation in benthic habitat mapping through airborne and satellite remote sensing. Here, we applied deep learning and neural network architectures in NASA NeMO-Net, a novel neural multimodal observation and training network for global habitat mapping of shallow benthic tropical marine systems. These ecosystems, particularly coral reefs, are undergoing rapid changes as a result of increasing ocean temperatures, acidification, and pollution, among other stressors. Remote sensing from air and space has been the primary method by which changes are assessed within these important, often remote, ecosystems at a global scale. However, such global datasets often suffer from large spectral variances due to the time of observation, atmospheric effects, water column properties, and heterogeneous instruments and calibrations. To address these challenges, we developed an object-based fully convolutional network (FCN) to improve upon the spatial-spectral classification problem inherent in multimodal datasets. We showed that with training upon augmented data in conjunction with classical methods, such as K-nearest neighbors, we were able to achieve better overall classification and segmentation results. This suggests FCNs are able to effectively identify the relative applicable spectral and spatial spaces within an image, whereas pixel-based classical methods excel at classification within those identified spaces. Our spectrally invariant results, based on minimally preprocessed WorldView-2 and Planet satellite imagery, show a total accuracy of approximately 85% and 80%, respectively, over nine classes when trained and tested upon a chain of Fijian islands imaged under highly variable day-to-day spectral inputs. |
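The hybrid approach summarized in the abstract (an FCN proposing the applicable spectral-spatial regions of an image, with a pixel-based K-nearest-neighbors classifier assigning classes within those regions) can be illustrated with a minimal sketch of the KNN stage. The FCN stage is not reproduced here; the synthetic 4-band pixels, class values, and the plain-NumPy majority-vote KNN below are illustrative assumptions, not NeMO-Net's actual network, bands, or data.

```python
# Illustrative sketch of the pixel-based K-nearest-neighbors stage described in
# the abstract. Pixels and classes are synthetic stand-ins, not NeMO-Net data.
import numpy as np

def knn_classify(train_px, train_labels, query_px, k=3):
    """Classify each query pixel by majority vote of its k spectrally nearest training pixels."""
    preds = []
    for q in query_px:
        dists = np.linalg.norm(train_px - q, axis=1)   # Euclidean spectral distance
        nearest = train_labels[np.argsort(dists)[:k]]  # labels of the k closest pixels
        preds.append(np.bincount(nearest).argmax())    # majority vote
    return np.array(preds)

rng = np.random.default_rng(0)
# Two synthetic benthic classes in a 4-band spectral space.
coral = rng.normal(loc=0.2, scale=0.02, size=(50, 4))
sand = rng.normal(loc=0.6, scale=0.02, size=(50, 4))
train_px = np.vstack([coral, sand])
train_labels = np.array([0] * 50 + [1] * 50)

query = np.array([[0.21, 0.19, 0.20, 0.22],   # coral-like pixel
                  [0.61, 0.58, 0.60, 0.59]])  # sand-like pixel
print(knn_classify(train_px, train_labels, query))  # -> [0 1]
```

In the paper's pipeline, a classifier like this would run only inside the spectral-spatial regions the FCN has already identified, rather than over the whole image.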
format | Article |
id | doaj.art-af3aae93efe844f69eec1a05b1d2f857 |
institution | Directory Open Access Journal |
issn | 2151-1535 |
language | English |
publishDate | 2020-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing |
spelling | Record doaj.art-af3aae93efe844f69eec1a05b1d2f857 (updated 2022-12-22T04:04:52Z). Language: English. Publisher: IEEE. Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, ISSN 2151-1535, vol. 13, pp. 5115-5133, published 2020-01-01. DOI: 10.1109/JSTARS.2020.3018719 (IEEE document 9174766). Title: NASA NeMO-Net's Convolutional Neural Network: Mapping Marine Habitats with Spectrally Heterogeneous Remote Sensing Imagery. Authors: Alan S. Li (https://orcid.org/0000-0001-9097-0788), Ved Chirayath, Michal Segal-Rozenhaimer, Juan L. Torres-Perez, and Jarrett van den Bergh, all of the Earth Science Division (Code SG), NASA Ames Research Center, Mountain View, CA, USA. Abstract as given in the description field above. Online access: https://ieeexplore.ieee.org/document/9174766/. Keywords: Convolutional neural network (CNN); deep learning; image segmentation; multispectral imaging. |
title | NASA NeMO-Net's Convolutional Neural Network: Mapping Marine Habitats with Spectrally Heterogeneous Remote Sensing Imagery |
topic | Convolutional neural network (CNN) deep learning image segmentation multispectral imaging |
url | https://ieeexplore.ieee.org/document/9174766/ |