FireDetXplainer: Decoding Wildfire Detection With Transparency and Explainable AI Insights

Recent analyses by leading national wildfire and emergency monitoring agencies have highlighted an alarming trend: the impact of wildfire devastation has escalated to nearly three times that of a decade ago. To address this challenge, we propose FireDetXplainer (FDX), a robust deep-learning model that enhances the interpretability often lacking in current solutions. FDX employs an innovative approach, combining transfer learning and fine-tuning methodologies with the Learning without Forgetting (LwF) framework. A key aspect of our methodology is the utilization of the pre-trained MobileNetV3 model, renowned for its efficiency in image classification tasks. Through strategic adaptation and augmentation, we have achieved an exceptional classification accuracy of 99.91%. The model is further refined with convolutional blocks and advanced image pre-processing techniques, contributing to this high level of precision. Leveraging diverse datasets from Kaggle and Mendeley, FireDetXplainer incorporates Explainable AI (XAI) tools such as Gradient Weighted Class Activation Map (Grad-CAM) and Local Interpretable Model-Agnostic Explanations (LIME) for comprehensive result interpretation. Our extensive experimental results demonstrate that FireDetXplainer not only outperforms existing state-of-the-art models but does so with remarkable accuracy, making it a highly effective solution for interpretable image classification in wildfire management.
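
As a concrete illustration of the training recipe described in the abstract, the PyTorch sketch below combines transfer learning from a pre-trained MobileNetV3 backbone with a Learning without Forgetting (LwF) style distillation loss. It is a minimal sketch, not the authors' released code: the two-head layout, the 256-unit fire/no-fire head, the temperature T, the loss weight, and the optimizer settings are all assumptions.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pre-trained MobileNetV3 backbone with ImageNet weights.
base = models.mobilenet_v3_large(weights=models.MobileNet_V3_Large_Weights.DEFAULT)

# Frozen copy of the original network; it supplies the "old task" targets
# that the LwF distillation term tries to preserve.
teacher = copy.deepcopy(base).to(device).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

class LwFMobileNetV3(nn.Module):
    """Shared MobileNetV3 backbone with two heads: the original 1000-way
    ImageNet head (kept for LwF) and a new 2-way fire / no-fire head."""
    def __init__(self, base):
        super().__init__()
        self.features, self.avgpool = base.features, base.avgpool
        self.old_head = base.classifier                   # original ImageNet classifier
        in_feats = base.classifier[0].in_features         # 960 for mobilenet_v3_large
        self.new_head = nn.Sequential(                    # assumed head size
            nn.Linear(in_feats, 256), nn.Hardswish(), nn.Dropout(0.2),
            nn.Linear(256, 2))
    def forward(self, x):
        x = self.avgpool(self.features(x)).flatten(1)
        return self.old_head(x), self.new_head(x)

model = LwFMobileNetV3(base).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
T, lwf_weight = 2.0, 0.5                                  # assumed temperature / loss weight

def train_step(images, labels):
    """One step: cross-entropy on the fire task plus an LwF distillation term
    that keeps the old-task outputs close to the frozen teacher's."""
    images, labels = images.to(device), labels.to(device)
    old_logits, new_logits = model(images)
    ce = F.cross_entropy(new_logits, labels)
    with torch.no_grad():
        soft_targets = F.softmax(teacher(images) / T, dim=1)
    lwf = F.kl_div(F.log_softmax(old_logits / T, dim=1),
                   soft_targets, reduction="batchmean") * (T * T)
    loss = ce + lwf_weight * lwf
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In practice train_step would run inside an ordinary DataLoader loop over the augmented Kaggle/Mendeley wildfire images; the split, augmentations, and schedule shown here are not taken from the paper.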

Bibliographic Details
Main Authors: Syeda Fiza Rubab (ORCID: 0009-0001-3632-7932), Arslan Abdul Ghaffar (ORCID: 0009-0004-4744-8221), Gyu Sang Choi (ORCID: 0000-0002-0854-768X)
Affiliation: Department of Information and Communication Engineering, Yeungnam University, Gyeongsan, Republic of Korea
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access, Vol. 12, pp. 52378-52389
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3383653
Subjects: Deep learning; Explainable AI (XAI); Transfer learning; Wildfire detection
Online Access: https://ieeexplore.ieee.org/document/10486908/
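
The abstract names Grad-CAM as one of the two XAI tools used to interpret FDX's predictions. The sketch below implements plain Grad-CAM by hand for a MobileNetV3 fire classifier: it pools the gradients of the "fire" logit over the last convolutional block to weight its activation maps, yielding a heat map over the input image. The checkpoint path, image path, class ordering, and target layer are assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Binary fire / no-fire MobileNetV3; the fine-tuned checkpoint path is hypothetical.
model = models.mobilenet_v3_large(weights=models.MobileNet_V3_Large_Weights.DEFAULT)
model.classifier[-1] = torch.nn.Linear(model.classifier[-1].in_features, 2)
# model.load_state_dict(torch.load("fdx_mobilenetv3.pt"))   # hypothetical weights file
model = model.to(device).eval()

# Capture the activations of the last convolutional block via a forward hook.
target_layer = model.features[-1]
store = {}
target_layer.register_forward_hook(lambda module, inputs, output: store.update(act=output))

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def grad_cam(image_path, class_idx=1):            # class index 1 = "fire" is an assumption
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0).to(device)
    logits = model(x)
    act = store["act"]                            # (1, 960, 7, 7) feature maps
    grads = torch.autograd.grad(logits[0, class_idx], act)[0]
    # Global-average-pool the gradients into per-channel weights, then take a
    # ReLU of the weighted sum of activation maps (the Grad-CAM definition).
    weights = grads.mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * act).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    return cam.squeeze().detach().cpu().numpy()   # heat map in [0, 1], same size as input

heatmap = grad_cam("wildfire_sample.jpg")         # hypothetical input image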
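
LIME, the second XAI tool named in the abstract, explains a single prediction by perturbing superpixels of the input and fitting a local surrogate model. Below is a minimal sketch using the lime package, again assuming a hypothetical fine-tuned MobileNetV3 checkpoint and image path rather than the authors' actual configuration.

import numpy as np
import torch
import torch.nn.functional as F
from lime import lime_image
from PIL import Image
from skimage.segmentation import mark_boundaries
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Binary fire / no-fire MobileNetV3 as in the sketches above (checkpoint path hypothetical).
model = models.mobilenet_v3_large(weights=models.MobileNet_V3_Large_Weights.DEFAULT)
model.classifier[-1] = torch.nn.Linear(model.classifier[-1].in_features, 2)
# model.load_state_dict(torch.load("fdx_mobilenetv3.pt"))
model = model.to(device).eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def predict_fn(images):
    """LIME supplies perturbed copies as a (N, H, W, 3) numpy array; return
    per-class probabilities as a (N, 2) numpy array."""
    batch = torch.stack([preprocess(Image.fromarray(img.astype(np.uint8))) for img in images])
    with torch.no_grad():
        return F.softmax(model(batch.to(device)), dim=1).cpu().numpy()

image = np.array(Image.open("wildfire_sample.jpg").convert("RGB").resize((224, 224)))

explainer = lime_image.LimeImageExplainer()
explanation = explainer.explain_instance(image, predict_fn,
                                         top_labels=2, hide_color=0, num_samples=1000)

# Keep only the superpixels that push the prediction toward the top class
# and draw their boundaries on the image.
overlay, mask = explanation.get_image_and_mask(explanation.top_labels[0],
                                               positive_only=True, num_features=5,
                                               hide_rest=False)
highlighted = mark_boundaries(overlay / 255.0, mask)

The resulting overlay highlights the image regions the local surrogate associates with the predicted class, which can be compared against the Grad-CAM heat map for a consistency check.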