Review
Visual search in scenes involves selective and nonselective pathways

https://doi.org/10.1016/j.tics.2010.12.001

How does one find objects in scenes? For decades, visual search models have been built on experiments in which observers search for targets, presented among distractor items, isolated and randomly arranged on blank backgrounds. Are these models relevant to search in continuous scenes? This article argues that the mechanisms that govern artificial, laboratory search tasks do play a role in visual search in scenes. However, scene-based information is used to guide search in ways that had no place in earlier models. Search in scenes might be best explained by a dual-path model: a ‘selective’ path in which candidate objects must be individually selected for recognition and a ‘nonselective’ path in which information can be extracted from the global and/or statistical properties of the scene.
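To make the dual-path proposal concrete in computational terms, the brief Python sketch below pairs a ‘nonselective’ routine that returns only scene-level information with a ‘selective’ routine that submits one candidate at a time to recognition. It is a purely illustrative toy, not the model described in this article; the data structures, the zone-based prior, and all function names are assumptions introduced for exposition.

```python
# Toy skeleton of a dual-path search architecture (illustrative assumption,
# not the authors' implementation): a nonselective pass summarizes the whole
# scene at once, and a selective pass then recognizes one candidate at a time.

def nonselective_pass(scene):
    """Global/statistical information extracted without selecting objects:
    here, just a basic-level category and a prior over plausible target zones."""
    return {"gist": scene["gist"], "prior": scene.get("target_prior", {})}

def selective_pass(candidates, target, prior):
    """Candidates are selected one by one, highest expected relevance first,
    and only the currently selected item is submitted to (simulated) recognition."""
    ranked = sorted(candidates,
                    key=lambda c: prior.get(c["zone"], 0.0),
                    reverse=True)
    for examined, c in enumerate(ranked, start=1):
        if c["label"] == target:   # recognition succeeds on the selected item
            return c, examined
    return None, len(ranked)

scene = {"gist": "kitchen",
         "target_prior": {"counter": 0.8, "wall": 0.1}}   # bread: likely on counters
candidates = [{"label": "clock", "zone": "wall"},
              {"label": "bread", "zone": "counter"},
              {"label": "kettle", "zone": "counter"}]

context = nonselective_pass(scene)
found, n = selective_pass(candidates, "bread", context["prior"])
print(context["gist"], found, f"examined {n} candidate(s)")
```

The point of the sketch is only the division of labor: the nonselective pass never inspects individual objects, yet its output shapes the order in which the selective pass does.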

Section snippets

Searching and experiencing a scene

It is an interesting aspect of visual experience that you can look for an object that is, literally, right in front of your eyes, yet not find it for an appreciable period of time. It is clear that you are seeing something at the location of the object before you find it. What is that something and how do you go about finding that desired object? These questions have occupied visual search researchers for decades. Whereas visual search papers have conventionally described search as an important

Classic guided search

One approach to search, developed from studies of simple stimuli randomly placed on blank backgrounds, can be called ‘classic guided search’ [1]. It has roots in Treisman's Feature Integration Theory [2]. As we briefly review below, it holds that search is necessary because object recognition processes are limited to one or, perhaps, a few objects at one time. The selection of candidate objects for subsequent recognition is guided by preattentively acquired information about a limited set of
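As a minimal illustration of preattentive guidance, the hypothetical sketch below assigns each item an activation equal to a weighted sum of a few feature signals plus internal noise, and items are then selected for ‘recognition’ in decreasing order of activation. The particular features, weights, and noise level are illustrative assumptions, not the parameters of Guided Search.

```python
import random

# Hypothetical display items defined by a few guiding attributes (values in [0, 1]).
items = [
    {"name": "red vertical (target)", "redness": 0.9, "verticality": 0.9},
    {"name": "red horizontal",        "redness": 0.9, "verticality": 0.1},
    {"name": "green vertical",        "redness": 0.1, "verticality": 0.9},
    {"name": "green horizontal",      "redness": 0.1, "verticality": 0.1},
]

# Top-down weights appropriate for a 'red vertical' target.
weights = {"redness": 1.0, "verticality": 1.0}

def activation(item, noise_sd=0.3):
    """Guidance signal: weighted sum of preattentive features plus internal noise."""
    guided = sum(w * item[f] for f, w in weights.items())
    return guided + random.gauss(0.0, noise_sd)

# Candidate items are selected for (simulated) recognition in order of activation;
# with effective guidance the target is usually among the first items examined.
ranked = sorted(items, key=activation, reverse=True)
for examined, item in enumerate(ranked, start=1):
    if "target" in item["name"]:
        print(f"Target selected after examining {examined} item(s)")
        break
```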

The failure of classic guided search

To this point, we have described what could be called ‘classic guided search’ [1,25]. Now, suppose that we wanted to apply this classic guided search theory to the real world. Find the bread in Figure 3a. Guided search, and similar models, would say that the one to two dozen guiding attributes define a high-dimensional space in which objects would be quite sparsely represented. That is, ‘bread’ would be defined by some set of features [21]. If attention were guided to objects lying in the
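As a rough back-of-the-envelope illustration of that sparseness claim (the specific numbers below are assumptions, not taken from the article): even coarse coding of a couple of dozen guiding attributes defines an enormous number of feature-space cells, so any single object category such as ‘bread’ would occupy only a tiny region of the space.

```python
# Coarse back-of-the-envelope arithmetic: 20 guiding attributes, each coded into
# only 4 distinguishable levels, already define 4**20 ≈ 10^12 feature-space cells.
attributes, levels = 20, 4
cells = levels ** attributes
print(f"{cells:.2e} feature-space cells")  # ~1.10e+12
```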

A nonselective pathway to gist processing

Fortunately, there is another route to semantic scene information. Humans are able to categorize a scene as a forest without selecting individual trees for recognition [54]. A single, brief fixation on the kitchen of Figure 3a would be enough to get the ‘gist’ of that scene. ‘Gist’ is an imperfectly defined term but, in this context, it includes the basic-level category of the scene, an estimate of the distributions of basic attributes, such as color and texture [55], and the spatial layout 54,
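To illustrate what such global and/or statistical information could look like computationally, the sketch below computes whole-image color and orientation summaries without segmenting or selecting any object; a simple classifier over such summaries could, in principle, assign a basic-level scene category. The particular statistics (per-channel intensity histograms and a gradient-orientation histogram) are assumed stand-ins, not a claim about the statistics the visual system actually computes.

```python
import numpy as np

def global_statistics(image, n_bins=8):
    """Nonselective 'gist' summary of an H x W x 3 image with values in [0, 1]:
    global statistics computed over the whole image, with no object selection."""
    # Distribution of a basic attribute (per-channel intensity histograms).
    color_hist = np.concatenate([
        np.histogram(image[..., c], bins=n_bins, range=(0, 1), density=True)[0]
        for c in range(3)
    ])
    # Coarse orientation statistics from image gradients (a crude texture summary).
    gray = image.mean(axis=2)
    gy, gx = np.gradient(gray)
    orientations = np.arctan2(gy, gx)
    orient_hist = np.histogram(orientations, bins=n_bins,
                               range=(-np.pi, np.pi), density=True)[0]
    return np.concatenate([color_hist, orient_hist])

# A nearest-mean classifier over such summaries could, in principle, label the
# basic-level category (forest vs. kitchen) without recognizing any object.
rng = np.random.default_rng(0)
fake_forest = rng.random((64, 64, 3)) * np.array([0.3, 0.8, 0.3])   # greenish
fake_kitchen = rng.random((64, 64, 3)) * np.array([0.9, 0.9, 0.8])  # pale
print(global_statistics(fake_forest).shape, global_statistics(fake_kitchen).shape)
```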

Concluding remarks

What is next in the study of search in scenes? It is still not understood how scenes are divided up into searchable objects or proto-objects [76]. There is much work to be done to describe fully the capabilities of nonselective processing and even more to document its impact on selective processes. Finally, we would like to know if there is a neurophysiological reality to the two pathways proposed here. Suppose one ‘lesioned’ the hypothetical selective pathway. The result might be an agnosic

Acknowledgments

This work was supported by NIH EY017001 and ONR MURI N000141010278 to J.M.W. K.K.E. was supported by NIH/NEI 1F32EY019819-01, M.R.G. by NIH/NEI F32EY019815-01 and M.L-H.V. by DFG 1683/1-1.

References (90)

  • I. Biederman, Scene perception: detecting and judging objects undergoing relational violations, Cognit. Psychol. (1982)
  • M.B. Neider et al., Scene context guides eye movements during visual search, Vision Res. (2006)
  • M.R. Greene et al., Recognition of natural scenes from global properties: seeing the forest without representing the trees, Cognit. Psychol. (2009)
  • T. Sanocki, Representation and perception of scenic layout, Cognit. Psychol. (2003)
  • O.R. Joubert, Processing scene context: fast categorization and object interference, Vision Res. (2007)
  • S.C. Chong et al., Representation of statistical properties, Vision Res. (2003)
  • C. Chubb, The three dimensions of human visual sensitivity to first-order contrast statistics, Vision Res. (2007)
  • D.W. Williams et al., Coherent global motion percepts from stochastic local motions, Vision Res. (1984)
  • N. Demeyere, Automatic statistical processing of visual properties in simultanagnosia, Neuropsychologia (2008)
  • D. Melcher et al., Shapes, surfaces and saccades, Vision Res. (1999)
  • J. Haberman et al., Rapid extraction of mean emotion and gender from sets of faces, Curr. Biol. (2007)
  • T.J. Buschman et al., Serial, covert shifts of attention during visual search are reflected by the frontal eye fields and correlated with population oscillations, Neuron (2009)
  • J.M. Wolfe, Guided Search 2.0: a revised model of visual search, Psychon. Bull. Rev. (1994)
  • G. Müller-Plath et al., Space-based and object-based capacity limitations in visual search, Vis. Cogn. (2007)
  • B.A. Dosher, Information-limited parallel processing in difficult heterogeneous covert visual search, J. Exp. Psychol. Hum. Percept. Perform. (2010)
  • D.G. Pelli et al., The uncrowded window of object recognition, Nat. Neurosci. (2008)
  • B. Balas, A summary-statistic representation in peripheral vision explains visual crowding, J. Vis. (2009)
  • M.P. Eckstein, The lower visual search efficiency for conjunctions is due to noise and not serial attentional processing, Psychol. Sci. (1998)
  • J.T. Townsend et al., The serial–parallel dilemma: a case study in a linkage of theory and method, Psychon. Bull. Rev. (2004)
  • T.S. Horowitz, Revisiting the variable memory model of visual search, Vis. Cogn. (2006)
  • J. Theeuwes, A new estimation of the duration of attentional dwell time, Psychon. Bull. Rev. (2004)
  • C.M. Moore et al., Getting beyond the serial/parallel debate in visual search: a hybrid approach
  • T.L. Thornton et al., Parallel and serial processes in visual search, Psychol. Rev. (2007)
  • L.W. Renninger, Where to look next? Eye movements reduce local uncertainty, J. Vis. (2007)
  • W.S. Geisler, Visual search: the role of peripheral information measured using gaze-contingent displays, J. Vis. (2006)
  • G.J. Zelinsky, A theory of eye movements during target acquisition, Psychol. Rev. (2008)
  • G.J. Zelinsky et al., Eye movements during parallel-serial visual search, J. Exp. Psychol. Hum. Percept. Perform. (1997)
  • L. Huang, What is the unit of visual attention? Object for selection, but Boolean map for access, J. Exp. Psychol. Gen. (2010)
  • J.M. Wolfe, Guided Search 4.0: current progress with a model of visual search
  • J.M. Wolfe et al., What attributes guide the deployment of visual attention and how do they do it?, Nat. Rev. Neurosci. (2004)
  • J.M. Wolfe et al., Do intersections serve as basic features in visual search?, Perception (2003)
  • J.M. Wolfe, Limitations on the parallel guidance of visual search: color × color and orientation × orientation conjunctions, J. Exp. Psychol. Hum. Percept. Perform. (1990)
  • A. Treisman et al., Feature analysis in early vision: evidence from search asymmetries, Psychol. Rev. (1988)
  • L. Itti, A model of saliency-based visual attention for rapid scene analysis, IEEE Trans. Pattern Anal. Machine Intell. (2002)
  • D.T. Lindsey, Color channels, not color appearance or color categories, guide visual search for desaturated color targets, Psychol. Sci. (2010)