An Explainable Attention Zone Estimation for Level 3 Autonomous Driving

Bibliographic Details
Main Authors: Roksana Yahyaabadi, Soodeh Nikan
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Subjects: Level 3 autonomy; gaze estimation; GazeMobileNet; driver’s attention zone; explainable clustering
Online Access: https://ieeexplore.ieee.org/document/10233871/
_version_ 1797692861238476800
author Roksana Yahyaabadi
Soodeh Nikan
author_facet Roksana Yahyaabadi
Soodeh Nikan
author_sort Roksana Yahyaabadi
collection DOAJ
description Accurately assessing the driver’s situational awareness is crucial in level 3 ($L_{3}$) autonomous driving, where the driver is in the loop. Estimating the attention zone provides essential information about the drivers’ on/off-road visual attention and determines their readiness to take over control from the autonomous agent in complicated situations. This paper proposes a double-phase pipeline that improves the explainability and accuracy of attention zone estimation by using an intermediate gaze regression layer, where the true relationships between the input images and the output zone labels are interpretable. In the first phase, the proposed GazeMobileNet, a lightweight deep neural network, achieved state-of-the-art gaze vector estimation on the MPIIGaze dataset, with a mean absolute error (MAE) of 2.37 degrees. The model was then used to extract the corresponding gaze vectors from LISA V2, a driving dataset with in-cabin attention zone labels. As LISA V2 does not contain gaze vector labels, an unsupervised clustering approach was proposed in the second phase to categorize the driver’s gaze vectors and map them to the corresponding attention zones. The proposed method demonstrated improved accuracy and robustness in the zone classification task, achieving accuracies of 75.67% and 83.08% for attention zone estimation under “daytime without eyeglasses” and “nighttime without eyeglasses” capture conditions, respectively. Furthermore, the proposed model surpassed recent research on that dataset with accuracies of 73.11% and 74.02% under the “daytime with eyeglasses” and “nighttime with eyeglasses” capture conditions, respectively.
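As a hedged illustration only (not the authors' implementation), the Python sketch below shows how the two-phase pipeline described above could be wired together: any gaze regressor standing in for GazeMobileNet produces (yaw, pitch) gaze vectors, and KMeans groups them and maps each cluster to an attention zone by majority vote over the LISA V2 zone labels. KMeans is an assumed stand-in, since the abstract does not name the clustering algorithm, and the function names, the model interface, and the integer-coded zone labels are all hypothetical.

import numpy as np
from sklearn.cluster import KMeans

def regress_gaze(model, frames):
    # Phase 1: predict a (yaw, pitch) gaze vector for each in-cabin frame.
    # `model` is any trained gaze regressor exposing a predict() method.
    return np.stack([model.predict(f) for f in frames])  # shape (N, 2)

def fit_zone_clusters(gaze_vectors, zone_labels, n_zones):
    # Phase 2: cluster the gaze vectors without gaze-vector supervision, then
    # assign each cluster the majority attention-zone label so the
    # gaze -> zone mapping stays interpretable.
    # zone_labels: integer-coded NumPy array of per-frame zone annotations.
    km = KMeans(n_clusters=n_zones, n_init=10, random_state=0).fit(gaze_vectors)
    cluster_to_zone = {
        c: int(np.bincount(zone_labels[km.labels_ == c]).argmax())
        for c in range(n_zones)
    }
    return km, cluster_to_zone

def predict_zone(km, cluster_to_zone, gaze_vector):
    # Map a new gaze vector to an attention zone via its nearest cluster.
    c = int(km.predict(np.asarray(gaze_vector).reshape(1, -1))[0])
    return cluster_to_zone[c]

Because every zone prediction passes through an explicit gaze vector and a cluster-to-zone lookup, each decision can be traced back to an interpretable intermediate quantity, which is the explainability argument the abstract makes.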
first_indexed 2024-03-12T02:34:48Z
format Article
id doaj.art-53aa7c3a6b654a609534284772dd1f26
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-03-12T02:34:48Z
publishDate 2023-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj.art-53aa7c3a6b654a609534284772dd1f26 (2023-09-04T23:01:59Z); English; IEEE; IEEE Access, ISSN 2169-3536, 2023-01-01, vol. 11, pp. 93098-93110; DOI: 10.1109/ACCESS.2023.3309810; IEEE document 10233871. An Explainable Attention Zone Estimation for Level 3 Autonomous Driving. Roksana Yahyaabadi (https://orcid.org/0009-0004-1674-7536) and Soodeh Nikan, Department of Electrical and Computer Engineering, Western University, London, Canada. https://ieeexplore.ieee.org/document/10233871/
spellingShingle Roksana Yahyaabadi
Soodeh Nikan
An Explainable Attention Zone Estimation for Level 3 Autonomous Driving
IEEE Access
Level 3 autonomy
gaze estimation
GazeMobileNet
driver’s attention zone
explainable clustering
title An Explainable Attention Zone Estimation for Level 3 Autonomous Driving
title_full An Explainable Attention Zone Estimation for Level 3 Autonomous Driving
title_fullStr An Explainable Attention Zone Estimation for Level 3 Autonomous Driving
title_full_unstemmed An Explainable Attention Zone Estimation for Level 3 Autonomous Driving
title_short An Explainable Attention Zone Estimation for Level 3 Autonomous Driving
title_sort explainable attention zone estimation for level 3 autonomous driving
topic Level 3 autonomy
gaze estimation
GazeMobileNet
driver’s attention zone
explainable clustering
url https://ieeexplore.ieee.org/document/10233871/
work_keys_str_mv AT roksanayahyaabadi anexplainableattentionzoneestimationforlevel3autonomousdriving
AT soodehnikan anexplainableattentionzoneestimationforlevel3autonomousdriving
AT roksanayahyaabadi explainableattentionzoneestimationforlevel3autonomousdriving
AT soodehnikan explainableattentionzoneestimationforlevel3autonomousdriving