Example-based explainable AI and its application for remote sensing image classification

We present an explainable artificial intelligence (XAI) method, “What I Know (WIK)”, which provides additional information for verifying the reliability of a deep learning model by showing an instance from the training dataset that is similar to the input data being inferred, and we demonstrate it on a remote sensing image classification task. One expected role of XAI methods is to verify whether the inferences of a trained machine learning model are valid for an application, and the datasets used to train the model are as important a factor in this as the model architecture. Our data-centric approach helps determine whether the training dataset is sufficient for each inference by inspecting the selected example: if the example looks similar to the input data, we can confirm that the model was not trained on data whose feature distribution is far from that of the input. With this method, the criterion for selecting an example is not merely similarity to the input data but similarity in the context of the model's task. Using a remote sensing image dataset from the Sentinel-2 satellite, the concept was successfully demonstrated with reasonably selected examples. The method can be applied to various machine learning tasks, including classification and regression.
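
The selection criterion described in the abstract, similarity “in the context of the model task” rather than raw data similarity, is typically realized by comparing representations produced by the trained network itself. The sketch below illustrates that general idea only; it is not the paper's exact procedure. It assumes a PyTorch classifier, treats the penultimate-layer activations as the task-aware feature space, and uses cosine similarity as the metric, all of which are illustrative assumptions.

    # Minimal sketch of an example-based explanation in the spirit of WIK.
    # Assumptions (not taken from the paper): a PyTorch classifier, its
    # penultimate-layer activations as the feature space, cosine similarity.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def build_feature_extractor(classifier: nn.Sequential) -> nn.Module:
        # Drop the final classification layer so the remaining layers map an
        # image into the feature space the model learned for its task.
        return nn.Sequential(*list(classifier.children())[:-1])

    @torch.no_grad()
    def most_similar_training_example(extractor: nn.Module,
                                      train_images: torch.Tensor,
                                      query_image: torch.Tensor) -> int:
        # Embed the training set and the query, then return the index of the
        # training instance closest to the query in feature space.
        train_feats = F.normalize(extractor(train_images).flatten(1), dim=1)
        query_feat = F.normalize(extractor(query_image.unsqueeze(0)).flatten(1), dim=1)
        similarity = train_feats @ query_feat.T      # cosine similarity, shape (N, 1)
        return int(similarity.argmax())

    if __name__ == "__main__":
        # Toy stand-in for a trained Sentinel-2 patch classifier (hypothetical shapes).
        classifier = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(8, 10),
        )
        extractor = build_feature_extractor(classifier)
        train_images = torch.randn(100, 3, 64, 64)   # placeholder training patches
        query = torch.randn(3, 64, 64)               # placeholder input patch
        idx = most_similar_training_example(extractor, train_images, query)
        print(f"Training example to show alongside the prediction: index {idx}")

In a deployment, the training-set features would be precomputed once (in batches for a large training set), and the retrieved instance would be shown alongside the prediction so a user can judge whether the model has seen sufficiently similar data.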

Bibliographic Details
Main Authors: Shin-nosuke Ishikawa, Masato Todo, Masato Taki, Yasunobu Uchiyama, Kazunari Matsunaga, Peihsuan Lin, Taiki Ogihara, Masao Yasui
Format: Article
Language: English
Published: Elsevier, 2023-04-01
Series: International Journal of Applied Earth Observations and Geoinformation, Volume 118, Article 103215
ISSN: 1569-8432
DOAJ Record ID: doaj.art-1cbd639d34184560af5d3b5421b0b10e
Subjects: Machine learning; Deep learning; Explainable artificial intelligence; Remote sensing imagery
Online Access: http://www.sciencedirect.com/science/article/pii/S1569843223000377
Author Affiliations:
Shin-nosuke Ishikawa: Graduate School of Artificial Intelligence and Science, Rikkyo University, Tokyo 171-8501, Japan; Strategic Digital Business Unit, Mamezou Co., Ltd., Tokyo 163-0434, Japan (corresponding author)
Masato Todo: Strategic Digital Business Unit, Mamezou Co., Ltd., Tokyo 163-0434, Japan
Masato Taki: Graduate School of Artificial Intelligence and Science, Rikkyo University, Tokyo 171-8501, Japan
Yasunobu Uchiyama: Graduate School of Artificial Intelligence and Science, Rikkyo University, Tokyo 171-8501, Japan
Kazunari Matsunaga: Strategic Digital Business Unit, Mamezou Co., Ltd., Tokyo 163-0434, Japan
Peihsuan Lin: Strategic Digital Business Unit, Mamezou Co., Ltd., Tokyo 163-0434, Japan
Taiki Ogihara: Strategic Digital Business Unit, Mamezou Co., Ltd., Tokyo 163-0434, Japan
Masao Yasui: Mamezou Co., Ltd., Tokyo 163-0434, Japan