Human gaze control during real-world scene perception

https://doi.org/10.1016/j.tics.2003.09.006

Abstract

In human vision, acuity and color sensitivity are best at the point of fixation, and the visual-cognitive system exploits this fact by actively controlling gaze to direct fixation towards important and informative scene regions in real time as needed. How gaze control operates over complex real-world scenes has recently become of central concern in several core cognitive science disciplines including cognitive psychology, visual neuroscience, and machine vision. This article reviews current approaches and empirical findings in human gaze control during real-world scene perception.

Section snippets

Stimulus-based gaze control

Three general approaches have been adopted to investigate the image properties that influence where a viewer will fixate in a scene. First, scene patches centered at each fixation position are analyzed to determine whether they differ in some image property from unselected patches. Using this ‘scene statistics’ approach, investigators have found that high spatial frequency content and edge density are somewhat greater at fixation sites 14, 15, and that local contrast (the standard deviation of …
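As a concrete illustration of the ‘scene statistics’ approach described above, the sketch below compares one local image property, luminance contrast (the standard deviation of luminance within a patch), at fixated locations against randomly sampled control patches. The image array, fixation coordinates, patch size, and sampling scheme are illustrative assumptions, not the procedure of any particular study cited in this section.

```python
# Minimal sketch of the 'scene statistics' approach: compare a local image
# property (here, luminance contrast, i.e. the standard deviation of
# luminance) in patches centred on fixations with the same property in
# randomly chosen control patches. All inputs are illustrative.
import numpy as np

def patch_contrast(image, x, y, half=16):
    """Standard deviation of luminance in a (2*half)^2 patch centred at (x, y)."""
    patch = image[max(y - half, 0):y + half, max(x - half, 0):x + half]
    return patch.std()

def fixated_vs_control_contrast(image, fixations, n_controls=1000, half=16, seed=0):
    """Mean contrast at fixated locations vs. at randomly sampled control locations."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    fixated = [patch_contrast(image, x, y, half) for x, y in fixations]
    xs = rng.integers(half, w - half, n_controls)
    ys = rng.integers(half, h - half, n_controls)
    control = [patch_contrast(image, x, y, half) for x, y in zip(xs, ys)]
    return np.mean(fixated), np.mean(control)

# Example with synthetic data: a random 'scene' and a few fixation points.
scene = np.random.rand(480, 640)
fixations = [(120, 80), (320, 240), (500, 400)]
print(fixated_vs_control_contrast(scene, fixations))
```

In studies of this kind the fixated and control distributions would then be compared statistically; the sketch simply returns the two means.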

Knowledge-driven gaze control

Human eye movement control is ‘smart’ in the sense that it draws not only on currently available visual input, but also on several cognitive systems, including short-term memory for previously attended information in the current scene, stored long-term visual, spatial and semantic information about other similar scenes, and the goals and plans of the viewer. In fact, fixation sites are less strongly tied to visual saliency when meaningful scenes are viewed during active tasks 23, 35, 36, 37.
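One common way to express how strongly fixations are tied to visual saliency is to sample a saliency map at fixated pixels and compare the result with a chance baseline such as the map's overall mean; a smaller advantage over chance under active, meaningful-task viewing would reflect the weaker stimulus-based coupling noted above. The sketch below is a hedged illustration of that idea; the saliency map, fixation list, and the simple ratio score are placeholders rather than the measures used in the studies cited.

```python
# Hedged sketch of quantifying the saliency-fixation link: compare
# saliency-map values at fixated pixels with the map's overall mean
# (chance). A score near 1 means fixations are no more salient than
# chance; higher values indicate a tighter stimulus-based link.
import numpy as np

def saliency_at_fixations(saliency_map, fixations):
    """Mean saliency sampled at (x, y) fixation coordinates."""
    return np.mean([saliency_map[y, x] for x, y in fixations])

def saliency_ratio(saliency_map, fixations):
    """Fixated saliency divided by the map mean (chance level)."""
    return saliency_at_fixations(saliency_map, fixations) / saliency_map.mean()

# Example with synthetic data standing in for a real saliency map.
smap = np.random.rand(480, 640)
fixations = [(120, 80), (320, 240), (500, 400)]
print(saliency_ratio(smap, fixations))
```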

Acknowledgements

Preparation of this article was supported by grants from the National Science Foundation (BCS-0094433) and the Army Research Office (DAAD19-00-1-0519; the opinions expressed in this article are those of the authors and do not necessarily represent the views of the Department of the Army or any other governmental organization). I thank Monica Castelhano, Daniel Gajewski, Aaron Pearson, Aude Oliva, and Fernanda Ferreira for their contributions to the ideas presented here and for comments on the …

References (74)

  • R. Groner, Looking at faces: local and global aspects of scanpaths

  • P.M.J. van Diepen, Chronometry of foveal information extraction during scene perception

  • M.M. Hayhoe, Task constraints in visual working memory, Vision Res. (1998)

  • E. Matin, Saccadic suppression: a review and an analysis, Psychol. Bull. (1974)

  • A. Thiele, Neural mechanisms of saccadic suppression, Science (2002)

  • M.F. Land, Motion and vision: why animals move their eyes, J. Comp. Physiol. Ser. A (1999)

  • D.H. Ballard, Deictic codes for the embodiment of cognition, Behav. Brain Sci. (1997)

  • P.S. Churchland, A critique of pure vision

  • S.J. Luck et al., Attention

  • J.M. Findlay, Eye scanning and visual search

  • K. Rayner, Eye movements in reading and information processing: 20 years of research, Psychol. Bull. (1998)

  • M.K. Tanenhaus, Integration of visual and linguistic information in spoken language comprehension, Science (1995)

  • J.M. Henderson et al., Scene perception for psycholinguists

  • S.K. Mannan, The relationship between the locations of spatial features and those of fixations made during visual examination of briefly presented images, Spat. Vis. (1996)

  • S.K. Mannan, Fixation patterns made during brief examination of two-dimensional images, Perception (1997)

  • G. Krieger, Object and scene analysis by saccadic eye-movements: an investigation with higher-order statistics, Spat. Vis. (2000)

  • D.J. Parkhurst et al., Scene content selected by active vision, Spat. Vis. (2003)

  • P. Reinagel et al., Natural scene statistics at the centre of gaze, Network (1999)

  • L. Itti et al., Computational modeling of visual attention, Nat. Rev. Neurosci. (2001)

  • C. Koch et al., Shifts in selective visual attention: towards the underlying neural circuitry, Hum. Neurobiol. (1985)

  • A. Torralba, Modeling global scene factors in attention, J. Opt. Soc. Am. A Opt. Image Sci. Vis. (2003)

  • A. Oliva, Top-down control of visual attention in object detection (2003)

  • J.M. Henderson et al., Global transsaccadic change blindness during scene perception, Psychol. Sci. (2003)

  • D.E. Irwin, Visual memory within and across fixations

  • M.I. Posner, Inhibition of return: neural basis and function, Cogn. Neuropsychol. (1985)

  • E.C. Leek, Inhibition of return for objects and locations in static displays, Percept. Psychophys. (2003)

  • M. Corbetta et al., Control of goal-directed and stimulus-driven attention in the brain, Nat. Rev. Neurosci. (2002)