Classification of visual attention by microsaccades using machine learning
by Soichiro Yokoo; Nobuyuki Nishiuchi; Kimihiro Yamanaka
International Journal of Biometrics (IJBM), Vol. 16, No. 3/4, 2024

Abstract: This paper proposes machine learning methods for classifying visual attention. Eye-tracking data contains a range of useful information about human visual behaviour; in particular, many recent studies have shown a relationship between visual attention and microsaccades, a type of fixational eye movement. In this study, eye movement and pupil diameter were measured under three controlled experimental conditions requiring different levels of visual attention. Microsaccades were extracted from eye-tracking data that also contained rapid saccades, and several machine learning methods were then applied to parameters derived from the extracted microsaccades to classify the level of visual attention. Using participant-wise cross-validation, in which each participant's data served in turn as the test set and the data from the remaining participants as the training set, the support vector machine achieved the highest correct discrimination rate (77.1%). These results suggest that visual attention can be classified on the basis of microsaccades.

Online publication date: Tue, 30-Apr-2024
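
The full text is not open access, and the abstract does not specify the microsaccade detection algorithm, the feature set, or the classifier settings. The Python sketch below is therefore only an illustration of the general pipeline the abstract describes: a velocity-threshold microsaccade detector in the spirit of Engbert and Kliegl (2003), followed by a scikit-learn support vector machine evaluated with leave-one-participant-out cross-validation. The function name detect_microsaccades, the threshold parameters, the four-feature layout, and the synthetic data are assumptions introduced here for illustration, not details taken from the paper.

import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def detect_microsaccades(x, y, fs, lambda_thresh=6.0, min_samples=3, max_amplitude=1.0):
    """Detect microsaccade candidates in fixational eye-position traces.

    x, y : gaze position in degrees of visual angle (1-D arrays of equal length)
    fs   : sampling rate in Hz
    Velocity-threshold detection in the style of Engbert & Kliegl (2003);
    the default parameters are illustrative, not the paper's settings.
    """
    # Smoothed velocity over a 5-sample window (Engbert & Kliegl filter)
    vx = (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) * fs / 6.0
    vy = (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) * fs / 6.0

    # Median-based velocity threshold, estimated separately per axis
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    above = (vx / (lambda_thresh * sx)) ** 2 + (vy / (lambda_thresh * sy)) ** 2 > 1.0

    # Group supra-threshold samples into events; keep microsaccade-sized ones
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_samples:
                amp = np.hypot(x[i + 1] - x[start + 2], y[i + 1] - y[start + 2])
                if amp <= max_amplitude:
                    events.append((start + 2, i + 1, amp))  # (onset, offset, amplitude)
            start = None
    return events


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in features: e.g. microsaccade rate, mean amplitude,
    # peak velocity and pupil diameter per trial (placeholders, not real data).
    n_participants, n_conditions, n_trials, n_features = 10, 3, 20, 4
    n_rows = n_participants * n_conditions * n_trials
    features = rng.normal(size=(n_rows, n_features))
    labels = np.tile(np.repeat(np.arange(n_conditions), n_trials), n_participants)
    groups = np.repeat(np.arange(n_participants), n_conditions * n_trials)

    # Each participant is held out once as the test set, mirroring the
    # participant-wise cross-validation described in the abstract.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, features, labels, groups=groups, cv=LeaveOneGroupOut())
    print(f"Mean correct classification rate: {scores.mean():.3f}")

In practice, the per-trial feature rows would be computed from the output of detect_microsaccades (and from the pupil-diameter signal) rather than generated randomly; the score printed here only checks that the leave-one-participant-out pipeline runs.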
