Emotional semantic image retrieval systems enable users to retrieve images from a database according to emotional concepts. This has led to the task of affective image classification, which has recently attracted researchers' attention. However, different users may experience different emotions depending on where in an image they are gazing. This paper presents an improved prediction method that takes users' eye movements, recorded while they view an image, into account as implicit feedback. Our experimental results show that combining eye movement information with image features to determine a user's emotion gives more accurate predictions than using image features alone.
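The combination of image features and eye movement information described above can be sketched as feature-level fusion: the two descriptor vectors for each image are concatenated before classification. The abstract does not specify the actual features or classifier used in the paper, so the descriptor names, dimensions, and the nearest-centroid model below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
# Hypothetical descriptors; the paper's actual features differ.
image_feats = rng.normal(size=(n, 8))   # e.g. colour/texture features per image
gaze_feats = rng.normal(size=(n, 4))    # e.g. fixation counts/durations per viewer
labels = rng.integers(0, 2, size=n)     # emotion label (e.g. positive/negative)

# Feature-level fusion: concatenate image and gaze descriptors per sample.
fused = np.concatenate([image_feats, gaze_feats], axis=1)

# Minimal nearest-centroid classifier as a stand-in for the real model.
centroids = np.stack([fused[labels == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    """Assign the emotion class whose centroid is nearest to x."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

preds = np.array([predict(x) for x in fused])
```

Any real pipeline would replace the synthetic arrays with extracted image descriptors and per-user gaze statistics, but the fusion step itself stays a simple concatenation.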
Pasupa, K., Chatkamjuncharoen, P., Wuttilertdeshar, C., & Sugimoto, M. (2016). Using image features and eye tracking device to predict human emotions towards abstract images. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9431, pp. 419–430). Springer Verlag. https://doi.org/10.1007/978-3-319-29451-3_34