Eye movement feature extraction for driver vigilance classification


Bibliographic Details
Main Author: Fu, Zhuxuan
Other Authors: Huang Guangbin
Format: Final Year Project (FYP)
Language:English
Published: 2015
Subjects:
Online Access:http://hdl.handle.net/10356/64350
_version_ 1811697608187445248
author Fu, Zhuxuan
author2 Huang Guangbin
author_facet Huang Guangbin
Fu, Zhuxuan
author_sort Fu, Zhuxuan
collection NTU
description With the mature development of eye tracking technology, eye movement tracking shows great potential for driver assistance systems, which could assist drivers at an early stage of low vigilance. However, the raw data from an eye tracker are time dependent and cannot be used directly for machine learning. In addition, the existing feature extraction software, Tobii Studio, cannot be customized to specific needs and does not run in real time. Hence, a new eye movement feature extraction tool needs to be developed for the convenience of future research. This report describes the development of an eye movement feature extraction tool based on the Velocity and Dispersion Threshold Identification (I-VDT) algorithm. The tool can extract the three fundamental eye movement types, namely fixations, saccades and smooth pursuits, with satisfactory accuracy; for fixations and saccades in particular, accuracy was generally above 95%. Based on experiments, recommendations were given for parameter settings: 30º/s (º/s denotes deg/sec in this report) is suggested as the velocity threshold that separates fixations from saccades, and the performance of the algorithm is not sensitive to the minimum fixation duration threshold in the range of 60 ms to 100 ms. In the second part of the project, statistical analysis was conducted on twelve eye movement feature metrics using data collected from the Vigilance Decrement Experiment run by the NTU HMI Lab. An analysis of variance (ANOVA) test was employed to check whether significant differences exist among the three stages within one driving period. For more than half of the subjects, both the mean and the variance (standard deviation) of fixation duration, fixation centroid location, saccade peak velocity, saccade velocity and saccade duration differ among stages at the 95% confidence level. Hence, these metrics are promising candidates for use in machine learning.
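The core of the velocity-threshold step described above can be illustrated with a minimal sketch. This is not the report's implementation: it shows only how a 30 deg/s threshold splits raw gaze samples into fixation and saccade samples; the full I-VDT algorithm additionally applies a dispersion threshold to separate fixations from smooth pursuits. The sample data below are hypothetical.

```python
import math

def classify_samples(gaze, velocity_threshold=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade'.

    gaze: list of (timestamp_s, x_deg, y_deg) tuples, with gaze position
    in degrees of visual angle. Intervals whose point-to-point angular
    velocity exceeds the threshold (deg/s) are labelled saccades; the
    rest are labelled fixations.
    """
    labels = []
    for (t0, x0, y0), (t1, x1, y1) in zip(gaze, gaze[1:]):
        dt = t1 - t0
        velocity = math.hypot(x1 - x0, y1 - y0) / dt  # deg/s
        labels.append('saccade' if velocity > velocity_threshold else 'fixation')
    return labels

# Hypothetical 125 Hz samples: slow drift (~6 deg/s), then a fast jump (~250 deg/s)
samples = [(0.000, 0.00, 0.0), (0.008, 0.05, 0.0),
           (0.016, 0.10, 0.0), (0.024, 2.10, 0.0)]
print(classify_samples(samples))  # ['fixation', 'fixation', 'saccade']
```

Consecutive same-label samples would then be merged into events, and events shorter than the minimum fixation duration threshold discarded.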
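The one-way ANOVA used in the second part of the project reduces to an F ratio of between-stage to within-stage variance. The sketch below is a generic pure-Python version of that statistic, not the report's analysis code, and the three groups of fixation durations are invented for illustration.

```python
def anova_f(groups):
    """One-way ANOVA F statistic for k groups of measurements.

    groups: list of lists holding one feature metric (e.g. fixation
    durations in ms) gathered in each driving stage. Returns
    F = (between-group mean square) / (within-group mean square).
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (degrees of freedom: k - 1)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (degrees of freedom: n - k)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical fixation durations (ms) for the three stages of one drive
stage1, stage2, stage3 = [230, 250, 240], [260, 280, 270], [310, 330, 320]
f = anova_f([stage1, stage2, stage3])
print(f)  # F is approximately 49, well above the 5% critical value F(2, 6) ≈ 5.14
```

When F exceeds the critical value for the chosen significance level (5% here, matching the report's 95% confidence level), the metric differs significantly among the stages.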
first_indexed 2024-10-01T07:57:58Z
format Final Year Project (FYP)
id ntu-10356/64350
institution Nanyang Technological University
language English
last_indexed 2024-10-01T07:57:58Z
publishDate 2015
record_format dspace
spelling ntu-10356/643502023-07-07T17:23:14Z Eye movement feature extraction for driver vigilance classification Fu, Zhuxuan Huang Guangbin Lin Zhiping School of Electrical and Electronic Engineering DRNTU::Engineering::Electrical and electronic engineering::Computer hardware, software and systems Bachelor of Engineering 2015-05-26T03:42:24Z 2015-05-26T03:42:24Z 2015 Final Year Project (FYP) http://hdl.handle.net/10356/64350 en Nanyang Technological University 55 p. application/pdf
spellingShingle DRNTU::Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Fu, Zhuxuan
Eye movement feature extraction for driver vigilance classification
title Eye movement feature extraction for driver vigilance classification
title_full Eye movement feature extraction for driver vigilance classification
title_fullStr Eye movement feature extraction for driver vigilance classification
title_full_unstemmed Eye movement feature extraction for driver vigilance classification
title_short Eye movement feature extraction for driver vigilance classification
title_sort eye movement feature extraction for driver vigilance classification
topic DRNTU::Engineering::Electrical and electronic engineering::Computer hardware, software and systems
url http://hdl.handle.net/10356/64350
work_keys_str_mv AT fuzhuxuan eyemovementfeatureextractionfordrivervigilanceclassification