Mobile 3D Gaze Tracking Calibration

Abstract

We present a new calibration method that combines a mobile eye tracker with an external tracking system to obtain a 3D gaze vector. Our method captures calibration points at varying distances, pupil positions, and head positions/orientations. From these data we determine the eye position relative to the user’s head position without separate manual eye-position measurements. For this approach, the orientation of the eye coordinate system need not be known in advance. In addition to calibrating the external tracking system, we calibrate the head-tracked eye tracker in a one-step process that only requires the user to look at the calibration points. No extra calibration of the eye tracker is necessary if the raw pupil position in the eye camera is available from the eye tracker. The calibrated system allows us to estimate the 3D gaze vector for a user who can move freely within the range of the external tracking system. Our evaluation shows that the average visual-angle accuracy is better than one degree in a self-evaluation and approximately two degrees under unrestrained head movement.

Publication
2015 12th Conference on Computer and Robot Vision (CRV)