DOI: 10.1145/958432.958464

Eyetracking in cognitive state detection for HCI

Published: 05 November 2003

ABSTRACT

Past research in a number of fields confirms the existence of a link between cognition and eye movement control, beyond a simple pointing relationship. This being the case, it should be possible to use eye movement recording as a basis for detecting users' cognitive states in real time. Several examples of such cognitive state detectors have been reported in the literature.

A multidisciplinary project is described whose goal is to provide the computer with as much real-time information about the human state (cognitive, affective, and motivational) as possible, and to base computer actions on this information. The application area in which this is being implemented is science education: learning about gears through exploration. Two studies are reported in which participants solve simple problems about pictured gear trains while their eye movements are recorded. The first study indicates that most eye movement sequences are compatible with the predictions of a simple sequential cognitive model; it is suggested that sequences that do not fit the model may be of particular interest in the HCI context, as they may indicate problems or alternative mental strategies. The mental rotation of gears sometimes produces sequences of short eye movements in the direction of motion, so such sequences may be useful as cognitive state detectors.

The second study tested the hypothesis that participants are thinking about the object to which their eyes are directed. In this study, the display was turned off partway through the process of solving a problem, and the participants reported what they were thinking about at that time. While in most cases the participants reported cognitive activities involving the fixated object, this was not the case on a sizeable number of trials.
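The abstract's last observation about the first study, that mental rotation can produce runs of short eye movements in the direction of motion, hints at a simple detector. The sketch below is a minimal illustration of that idea, not the authors' implementation: the fixation format, the thresholds, and the function name detect_motion_aligned_runs are assumptions chosen for clarity.

```python
import math

# Hedged sketch: the fixation format, thresholds, and function below are
# illustrative assumptions, not the detector actually used in the paper.
MAX_SACCADE_LEN = 60.0        # max amplitude (pixels) of a "short" eye movement (assumed)
MIN_RUN_MOVEMENTS = 3         # consecutive aligned movements needed to count as a run (assumed)
MAX_ANGLE_DIFF = math.pi / 6  # 30-degree tolerance around the expected motion direction (assumed)


def detect_motion_aligned_runs(fixations, motion_angle):
    """Return (start, end) fixation-index pairs bounding runs of short,
    consecutive eye movements roughly aligned with motion_angle (radians),
    e.g. the expected direction of a gear's rotation.

    fixations: list of (x, y) gaze positions in temporal order.
    """
    runs = []
    run = []  # fixation indices belonging to the current candidate run
    for i in range(1, len(fixations)):
        dx = fixations[i][0] - fixations[i - 1][0]
        dy = fixations[i][1] - fixations[i - 1][1]
        length = math.hypot(dx, dy)
        diff = abs(math.atan2(dy, dx) - motion_angle)
        diff = min(diff, 2 * math.pi - diff)  # wrap angles around +/- pi
        if length <= MAX_SACCADE_LEN and diff <= MAX_ANGLE_DIFF:
            if not run:
                run = [i - 1]
            run.append(i)
        else:
            if len(run) - 1 >= MIN_RUN_MOVEMENTS:
                runs.append((run[0], run[-1]))
            run = []
    if len(run) - 1 >= MIN_RUN_MOVEMENTS:
        runs.append((run[0], run[-1]))
    return runs


if __name__ == "__main__":
    # Fabricated gaze samples: a rightward drift of small steps, then a large jump away.
    gaze = [(100, 200), (130, 205), (165, 210), (200, 212), (600, 400)]
    print(detect_motion_aligned_runs(gaze, motion_angle=0.0))  # -> [(0, 3)]
```

A detector used in practice would operate on a streaming gaze signal and would likely need per-participant and per-display calibration of these thresholds.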

References

  1. Yang, H. M., & McConkie, G. W. (1999). Reading Chinese: Some basic eye movement characteristics. In J. Wang, A. Inhoff, & H. C. Chen (Eds.) Reading Chinese Script: A Cognitive Analysis (pp. 207--222). Hillsdale, NJ: Erlbaum.Google ScholarGoogle Scholar
  2. Yarbus, A. L. (1967). Eye Movements and Vision. New York: Plenum Press.Google ScholarGoogle Scholar
  3. Findlay, J. M., & Gilchrist, I. D. (1998). Eye guidance and visual search. In G. Underwood (Ed.), Eye Guidance in Reading and Scene Perception (pp. 295-312). Amsterdam: Elsevier.Google ScholarGoogle Scholar
  4. Salvucci, D. D., & Anderson, J. R. (2001). Automated eye-movement protocol analysis. Human-Computer Interaction, 16,39--86. Google ScholarGoogle ScholarDigital LibraryDigital Library
  5. Althoff, R., Cohen, N. J., McConkie, G. W., Wasserman, S.,Maciukenas, M., Azen, R., & Romine, L. (1998). Eye movement-based memory assessment. In W. Becker, H. Deubel, & T. Mergner (Eds.) Current Oculomotor Research: Physiological and Psychological Aspects. New York: Plenum.Google ScholarGoogle Scholar
  6. Jacob, R. J. K. (1991). The use of eye movements in human-computer interaction techniques: What you look at is what you get. ACM Transactions on Information Systems, 9(2), 152--169. Google ScholarGoogle ScholarDigital LibraryDigital Library
  7. Edwards, G. (1998). A tool for creating eye-aware applications that adapt to changes in user behavior. Proceedings of ASSETS 98, 67--74. Google ScholarGoogle ScholarDigital LibraryDigital Library
  8. Campbell, C. S. & Maglio, P. P. (2001). A robust algorithm for reading detection. In Proceedings of the ACM Workshop on Perceptual User Interfaces. Google ScholarGoogle ScholarDigital LibraryDigital Library
  9. Reichle, E. D., Rayner, K., & Pollatsek, A. (in press). The E-Z Reader model of eye movement control in reading: Comparisons to other models. Behavioral and Brain Sciences.Google ScholarGoogle Scholar
  10. Yang, S.-N., & McConkie, G. M. (2001). Eye movements during reading: A theory of saccade initiation times. Vision Research, 41(25-26), 3567--3585.Google ScholarGoogle ScholarCross RefCross Ref

Published in

ICMI '03: Proceedings of the 5th international conference on Multimodal interfaces
November 2003
318 pages
ISBN: 1581136218
DOI: 10.1145/958432
Copyright © 2003 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

Acceptance Rates

ICMI '03 Paper Acceptance Rate: 45 of 130 submissions, 35%
Overall Acceptance Rate: 453 of 1,080 submissions, 42%
