2D Gaze Estimation Based on Pupil-Glint Vector Using an Artificial Neural Network


Bibliographic Details
Main Authors: Jianzhong Wang, Guangyue Zhang, Jiadong Shi
Format: Article
Language: English
Published: MDPI AG 2016-06-01
Series: Applied Sciences
Online Access: http://www.mdpi.com/2076-3417/6/6/174
Description
Summary: Gaze estimation methods play an important role in a gaze tracking system. A novel 2D gaze estimation method based on the pupil-glint vector is proposed in this paper. First, the circular ring rays location (CRRL) method and Gaussian fitting are used for pupil and glint detection, respectively, and the pupil-glint vector is then obtained by subtracting the fitted glint center from the fitted pupil center. Second, a mapping function is established from the correspondence between pupil-glint vectors and the actual gaze calibration points. To solve the mapping function, an improved artificial neural network based on direct least squares regression (DLSR-ANN) is proposed. Once the mapping function is determined, gaze estimation is performed by calculating the gaze point coordinates. Finally, error compensation is applied to further enhance the accuracy of gaze estimation. The proposed method achieves an accuracy of 1.29°, 0.89°, 0.52°, or 0.39° when a model with four, six, nine, or sixteen calibration markers, respectively, is used for calibration; with error compensation, the accuracy reaches 0.36°. The experimental results show that the proposed method is more accurate than both linear regression (direct least squares regression) and nonlinear regression (a generic artificial neural network), and it contributes to improving the overall accuracy of a gaze tracking system.
ISSN: 2076-3417
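
As an illustration of the mapping step described in the summary, the Python sketch below fits a second-order polynomial from pupil-glint vectors to screen coordinates by direct least squares regression, i.e., the linear baseline the summary compares against; it is not the paper's DLSR-ANN. All calibration numbers (pupil centers, glint centers, marker positions) are hypothetical.

import numpy as np

# Hypothetical calibration data: fitted pupil and glint centers (pixels)
# for six fixations, and the known screen coordinates of the six markers.
glint = np.array([[322.0, 270.1], [321.8, 269.9], [322.3, 270.2],
                  [321.9, 270.0], [322.1, 270.1], [322.2, 269.8]])
pupil = np.array([[313.9, 264.0], [322.0, 264.0], [330.2, 264.2],
                  [314.1, 276.1], [322.0, 276.1], [330.4, 275.9]])
marker = np.array([[160.0, 120.0], [320.0, 120.0], [480.0, 120.0],
                   [160.0, 360.0], [320.0, 360.0], [480.0, 360.0]])

# Pupil-glint vector: subtract the fitted glint center from the pupil center.
v = pupil - glint

def design_matrix(v):
    # Second-order polynomial terms of the pupil-glint vector (vx, vy).
    vx, vy = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

# Direct least squares regression: solve A @ w ~= marker for both screen axes.
A = design_matrix(v)
w, *_ = np.linalg.lstsq(A, marker, rcond=None)

def estimate_gaze(pupil_center, glint_center):
    # Map a new pupil-glint vector to 2D screen gaze coordinates.
    v_new = np.atleast_2d(np.asarray(pupil_center) - np.asarray(glint_center))
    return (design_matrix(v_new) @ w)[0]

print(estimate_gaze([330.0, 270.0], [322.0, 270.5]))  # lands near the right edge of the marker grid

With only four markers one would typically drop the quadratic terms; more markers overdetermine the fit, which is consistent with the accuracy trend the summary reports. Per the summary, the paper's DLSR-ANN replaces this closed-form solve with an improved neural network built on direct least squares regression, plus a final error-compensation step.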