Published in: Psychological Research 2/2022

22-03-2021 | Original Article

Pictorial low-level features in mental images: evidence from eye fixations

Authors: Corinna S. Martarelli, Fred W. Mast


Abstract

It is known that eye movements during object imagery revisit areas that were fixated during encoding. But do eye movements also reflect pictorial low-level features of imagined stimuli? In this paper, we report three experiments investigating whether low-level properties of mental images elicit specific eye movements. Based on the conceptualization of mental images as depictive representations, we expected low-level visual features to influence eye fixations during mental imagery, in the absence of any visual input. In a first experiment, twenty-five participants performed a visual imagery task with high vs. low spatial frequency and high vs. low contrast gratings. We found that both during visual perception and during mental imagery, first fixations were more often allocated to the low spatial frequency–high contrast grating, showing that eye fixations were influenced not only by the physical properties of visual stimuli but also by their imagined counterparts. In a second experiment, twenty-two participants imagined high contrast and low contrast stimuli that they had not encoded before. Again, participants allocated more fixations to the high contrast mental images than to the low contrast mental images. In a third experiment, we ruled out task difficulty as a confounding variable. Our results reveal that low-level visual features are represented in the mind’s eye and thus contribute to the characterization of mental images in terms of how much perceptual information is re-instantiated during mental imagery.
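The grating stimuli described above can be characterized by two low-level parameters: spatial frequency (cycles per image) and Michelson contrast. As a minimal illustrative sketch — the specific sizes, frequencies, and contrast values below are assumptions for demonstration, not the paper's actual stimulus parameters — the four conditions of Experiment 1 could be generated like this:

```python
import numpy as np

def make_grating(size=256, cycles=8, contrast=1.0):
    """Horizontal sinusoidal luminance grating with values in [0, 1].

    cycles:   spatial frequency, in cycles per image width
    contrast: Michelson contrast, (Lmax - Lmin) / (Lmax + Lmin)
    """
    x = np.linspace(0, 2 * np.pi * cycles, size)
    wave = np.sin(x)                        # ranges over roughly -1..1
    row = 0.5 + 0.5 * contrast * wave       # mean luminance 0.5
    return np.tile(row, (size, 1))          # repeat the row into a 2-D image

# The 2x2 design (spatial frequency x contrast); values are illustrative.
conditions = {
    "low_sf_high_contrast":  make_grating(cycles=2,  contrast=0.9),
    "low_sf_low_contrast":   make_grating(cycles=2,  contrast=0.1),
    "high_sf_high_contrast": make_grating(cycles=16, contrast=0.9),
    "high_sf_low_contrast":  make_grating(cycles=16, contrast=0.1),
}
```

With mean luminance held at 0.5, the Michelson contrast of each image equals (approximately, up to sampling of the sine) the `contrast` argument, so the two factors can be manipulated independently.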
Appendices
Accessible to authorized users only
Literatuur
go back to reference Albers, A. M., Kok, P., Toni, I., Dijkerman, H. C., & de Lange, F. P. (2013). Shared representations for working memory and mental imagery in early visual cortex. Current Biology, 23(15), 1427–1431.PubMed Albers, A. M., Kok, P., Toni, I., Dijkerman, H. C., & de Lange, F. P. (2013). Shared representations for working memory and mental imagery in early visual cortex. Current Biology, 23(15), 1427–1431.PubMed
go back to reference Albright, T. D. (2012). On the perception of probable things: Neural substrates of associative memory, imagery, and perception. Neuron, 74(2), 227–245.PubMedPubMedCentral Albright, T. D. (2012). On the perception of probable things: Neural substrates of associative memory, imagery, and perception. Neuron, 74(2), 227–245.PubMedPubMedCentral
go back to reference Altmann, G. T. (2004). Language-mediated eye movements in the absence of a visual world: The “blank screen paradigm.” Cognition, 93(2), B79-87.PubMed Altmann, G. T. (2004). Language-mediated eye movements in the absence of a visual world: The “blank screen paradigm.” Cognition, 93(2), B79-87.PubMed
go back to reference Baddeley, R. J., & Tatler, B. W. (2006). High frequency edges (but not contrast) predict where we fixate: A Bayesian system identification analysis. Vision Research, 46(18), 2824–2833.PubMed Baddeley, R. J., & Tatler, B. W. (2006). High frequency edges (but not contrast) predict where we fixate: A Bayesian system identification analysis. Vision Research, 46(18), 2824–2833.PubMed
go back to reference Bone, M. B., St-Laurent, M., Dang, C., McQuiggan, D. A., & Ryan, J. D. B. (2019). Eye movement reinstatement and neural reactivation during mental imagery. Cerebral Cortex, 29(3), 1075–1089.PubMed Bone, M. B., St-Laurent, M., Dang, C., McQuiggan, D. A., & Ryan, J. D. B. (2019). Eye movement reinstatement and neural reactivation during mental imagery. Cerebral Cortex, 29(3), 1075–1089.PubMed
go back to reference Borji, A., Sihite, D. N., & Itti, L. (2013). What stands out in a scene? A study of human explicit saliency judgment. Vision Research, 91, 62–67.PubMed Borji, A., Sihite, D. N., & Itti, L. (2013). What stands out in a scene? A study of human explicit saliency judgment. Vision Research, 91, 62–67.PubMed
go back to reference Brandt, S. A., & Stark, L. W. (1997). Spontaneous eye movements during visual imagery reflect the content of the visual scene. Journal of Cognitive Neuroscience, 9(1), 27–38.PubMed Brandt, S. A., & Stark, L. W. (1997). Spontaneous eye movements during visual imagery reflect the content of the visual scene. Journal of Cognitive Neuroscience, 9(1), 27–38.PubMed
go back to reference Broggin, E., Savazzi, S., & Marzi, C. A. (2012). Similar effects of visual perception and imagery on simple reaction time. The Quarterly Journal of Experimental Psychology, 65(1), 151–164.PubMed Broggin, E., Savazzi, S., & Marzi, C. A. (2012). Similar effects of visual perception and imagery on simple reaction time. The Quarterly Journal of Experimental Psychology, 65(1), 151–164.PubMed
go back to reference Chiquet, S., Martarelli, C.S., & Mast, F.W. (2020). Eye movements to absent objects during mental imagery and visual memory in immersive virtual reality. Virtual Reality. Chiquet, S., Martarelli, C.S., & Mast, F.W. (2020). Eye movements to absent objects during mental imagery and visual memory in immersive virtual reality. Virtual Reality.
go back to reference Dijkstra, N., Bosch, S. E., & van Gerven, M. A. J. (2019). Shared neural mechanisms of visual perception and imagery. Trends in Cognitive Sciences, 23(5), 423–434.PubMed Dijkstra, N., Bosch, S. E., & van Gerven, M. A. J. (2019). Shared neural mechanisms of visual perception and imagery. Trends in Cognitive Sciences, 23(5), 423–434.PubMed
go back to reference Einhäuser, W., & König, P. (2003). Does luminance-contrast contribute to a saliency map for overt visual attention? European Journal of Neuroscience, 17(5), 1089–1097. Einhäuser, W., & König, P. (2003). Does luminance-contrast contribute to a saliency map for overt visual attention? European Journal of Neuroscience, 17(5), 1089–1097.
go back to reference Einhäuser, W., Spain, M., & Perona, P. (2008). Objects predict fixations better than early saliency. Journal of Vision, 8(14), 1–26.PubMed Einhäuser, W., Spain, M., & Perona, P. (2008). Objects predict fixations better than early saliency. Journal of Vision, 8(14), 1–26.PubMed
go back to reference Faul, F., Erdfelder, E., Lang, A. G., & Buchner, A. (2007). G* Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191.PubMed Faul, F., Erdfelder, E., Lang, A. G., & Buchner, A. (2007). G* Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191.PubMed
go back to reference Fourtassi, M., Hajjioui, A., Urquizar, C., Rossetti, Y., Rode, G., & Pisella, L. (2013). Iterative fragmentation of cognitive maps in a visual imagery task. PLoS ONE, 8(7), e68560.PubMedPubMedCentral Fourtassi, M., Hajjioui, A., Urquizar, C., Rossetti, Y., Rode, G., & Pisella, L. (2013). Iterative fragmentation of cognitive maps in a visual imagery task. PLoS ONE, 8(7), e68560.PubMedPubMedCentral
go back to reference Ganis, G., Thompson, W. L., & Kosslyn, S. M. (2004). Brain areas underlying visual mental imagery and visual perception: An fMRI study. Cognitive Brain Research, 20(2), 226–241.PubMed Ganis, G., Thompson, W. L., & Kosslyn, S. M. (2004). Brain areas underlying visual mental imagery and visual perception: An fMRI study. Cognitive Brain Research, 20(2), 226–241.PubMed
go back to reference Henderson, J. M., Weeks, P. A. J., & Hollingworth, A. (1999). The effects of semantic consistency on eye movements during complex scene viewing. Journal of Experimental Psychology: Human Perception and Performance, 25(1), 210–228. Henderson, J. M., Weeks, P. A. J., & Hollingworth, A. (1999). The effects of semantic consistency on eye movements during complex scene viewing. Journal of Experimental Psychology: Human Perception and Performance, 25(1), 210–228.
go back to reference Itti, L. (2005). Quantifying the contribution of low-level saliency to human eye movements in dynamic scenes. Visual Cognition, 12(6), 1093–1123. Itti, L. (2005). Quantifying the contribution of low-level saliency to human eye movements in dynamic scenes. Visual Cognition, 12(6), 1093–1123.
go back to reference Itti, L., & Borji, A. (2014). Computational models: bottom-up and top-down aspects. In A. C. Nobre & S. Kastner (Eds.), The Oxford Handbook of Attention (pp. 1122–1158). Oxford University Press. Itti, L., & Borji, A. (2014). Computational models: bottom-up and top-down aspects. In A. C. Nobre & S. Kastner (Eds.), The Oxford Handbook of Attention (pp. 1122–1158). Oxford University Press.
go back to reference Itti, L., & Koch, C. (2000). A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Research, 40(10–12), 1489–1506.PubMed Itti, L., & Koch, C. (2000). A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Research, 40(10–12), 1489–1506.PubMed
go back to reference Itti, L., & Koch, C. (2001). Computational modeling of visual attention. Nature Reviews Neuroscience, 2(3), 194–203.PubMed Itti, L., & Koch, C. (2001). Computational modeling of visual attention. Nature Reviews Neuroscience, 2(3), 194–203.PubMed
go back to reference Johansson, R., Holsanova, J., Dewhurst, R., & Holmqvist, K. (2012). Eye movements during scene recollection have a functional role, but they are not reinstatements of those produced during encoding. Journal of Experimental Psychology: Human Perception and Performance, 38(5), 1289–1314.PubMed Johansson, R., Holsanova, J., Dewhurst, R., & Holmqvist, K. (2012). Eye movements during scene recollection have a functional role, but they are not reinstatements of those produced during encoding. Journal of Experimental Psychology: Human Perception and Performance, 38(5), 1289–1314.PubMed
go back to reference Johansson, R., Holsanova, J., & Holmqvist, K. (2006). Pictures and spoken descriptions elicit similar eye movements during mental imagery, both in light and in complete darkness. Cognitive Science, 30(6), 1053–1079.PubMed Johansson, R., Holsanova, J., & Holmqvist, K. (2006). Pictures and spoken descriptions elicit similar eye movements during mental imagery, both in light and in complete darkness. Cognitive Science, 30(6), 1053–1079.PubMed
go back to reference Johansson, R., & Johansson, M. (2014). Look here, eye movements play a functional role in memory retrieval. Psychological Science, 25(1), 236–242.PubMed Johansson, R., & Johansson, M. (2014). Look here, eye movements play a functional role in memory retrieval. Psychological Science, 25(1), 236–242.PubMed
go back to reference Johnson, M. R., & Johnson, M. K. (2014). Decoding individual natural scene representations during perception and imagery. Frontiers in Human Neuroscience, 8, 59.PubMedPubMedCentral Johnson, M. R., & Johnson, M. K. (2014). Decoding individual natural scene representations during perception and imagery. Frontiers in Human Neuroscience, 8, 59.PubMedPubMedCentral
go back to reference Kienzle, W., Franz, M. O., Schölkopf, B., & Wichmann, F. A. (2009). Center-surround patterns emerge as optimal predictors for human saccade targets. Journal of Vision, 9(5), 1–15.PubMed Kienzle, W., Franz, M. O., Schölkopf, B., & Wichmann, F. A. (2009). Center-surround patterns emerge as optimal predictors for human saccade targets. Journal of Vision, 9(5), 1–15.PubMed
go back to reference Koch, C., & Ullman, S. (1985). Shifts in selective visual attention: Towards the underlying neural circuitry. Human Neurobiology, 4, 219–227.PubMed Koch, C., & Ullman, S. (1985). Shifts in selective visual attention: Towards the underlying neural circuitry. Human Neurobiology, 4, 219–227.PubMed
go back to reference Kosslyn, S. M., Sukel, K. E., & Bly, B. M. (1999). Squinting with the mind’s eye: Effects of stimulus resolution on imaginal and perceptual comparisons. Memory and Cognition, 27(2), 276–287.PubMed Kosslyn, S. M., Sukel, K. E., & Bly, B. M. (1999). Squinting with the mind’s eye: Effects of stimulus resolution on imaginal and perceptual comparisons. Memory and Cognition, 27(2), 276–287.PubMed
go back to reference Kosslyn, S. M., Thompson, W. L., & Ganis, G. (2006). The Case for Mental Imagery. Oxford University Press. Kosslyn, S. M., Thompson, W. L., & Ganis, G. (2006). The Case for Mental Imagery. Oxford University Press.
go back to reference Laeng, B., Bloem, I. M., D’Ascenzo, S., & Tommasi, L. (2014). Scrutinizing visual images: The role of gaze in mental imagery and memory. Cognition, 131(2), 263–283.PubMed Laeng, B., Bloem, I. M., D’Ascenzo, S., & Tommasi, L. (2014). Scrutinizing visual images: The role of gaze in mental imagery and memory. Cognition, 131(2), 263–283.PubMed
go back to reference Laeng, B., & Sulutvedt, U. (2014). The eye pupil adjusts to imaginary light. Psychological Science, 25(1), 188–197.PubMed Laeng, B., & Sulutvedt, U. (2014). The eye pupil adjusts to imaginary light. Psychological Science, 25(1), 188–197.PubMed
go back to reference Laeng, B., & Teodorescu, D.-S. (2002). Eye scanpaths during visual imagery reenact those of perception of the same visual scene. Cognitive Science, 26(2), 207–231. Laeng, B., & Teodorescu, D.-S. (2002). Eye scanpaths during visual imagery reenact those of perception of the same visual scene. Cognitive Science, 26(2), 207–231.
go back to reference Lauwereyns, J. (2012). Brain and the Gaze. On the Active Boundaries of Vision. The MIT Press. Lauwereyns, J. (2012). Brain and the Gaze. On the Active Boundaries of Vision. The MIT Press.
go back to reference Le Meur, O., Le Callet, P., & Barba, D. (2007). Predicting visual fixations on video based on low-level visual features. Vision Research, 47(19), 2483–2498.PubMed Le Meur, O., Le Callet, P., & Barba, D. (2007). Predicting visual fixations on video based on low-level visual features. Vision Research, 47(19), 2483–2498.PubMed
go back to reference Lee, S. H., Kravitz, D. J., & Baker, C. I. (2012). Disentangling visual imagery and perception of real-world objects. NeuroImage, 59(4), 4064–4073.PubMed Lee, S. H., Kravitz, D. J., & Baker, C. I. (2012). Disentangling visual imagery and perception of real-world objects. NeuroImage, 59(4), 4064–4073.PubMed
go back to reference Mannan, S. K., Ruddock, K. H., & Wooding, D. S. (1996). The relationship between the locations of spatial features and those of fixations made during visual examination of briefly presented images. Spatial Vision, 10(3), 165–188.PubMed Mannan, S. K., Ruddock, K. H., & Wooding, D. S. (1996). The relationship between the locations of spatial features and those of fixations made during visual examination of briefly presented images. Spatial Vision, 10(3), 165–188.PubMed
go back to reference Martarelli, C. S., Chiquet, S., Laeng, B., & Mast, F. W. (2017). Using space to represent categories: Insights from gaze position. Psychological Research, 81(4), 721–729.PubMed Martarelli, C. S., Chiquet, S., Laeng, B., & Mast, F. W. (2017). Using space to represent categories: Insights from gaze position. Psychological Research, 81(4), 721–729.PubMed
go back to reference Martarelli, C. S., & Mast, F. W. (2011). Preschool children’s eye-movements during pictorial recall. British Journal of Developmental Psychology, 29, 425–436. Martarelli, C. S., & Mast, F. W. (2011). Preschool children’s eye-movements during pictorial recall. British Journal of Developmental Psychology, 29, 425–436.
go back to reference Martarelli, C. S., & Mast, F. W. (2013). Eye movements during long-term pictorial recall. Psychological Research, 77(3), 303–309.PubMed Martarelli, C. S., & Mast, F. W. (2013). Eye movements during long-term pictorial recall. Psychological Research, 77(3), 303–309.PubMed
go back to reference Naselaris, T., Olman, C. A., Stansbury, D. E., Ugurbil, K., & Gallant, J. L. (2015). A voxel- wise encoding model for early visual areas decodes mental images of remembered scenes. NeuroImage, 105, 215–228.PubMed Naselaris, T., Olman, C. A., Stansbury, D. E., Ugurbil, K., & Gallant, J. L. (2015). A voxel- wise encoding model for early visual areas decodes mental images of remembered scenes. NeuroImage, 105, 215–228.PubMed
go back to reference Parkhurst, D., Law, K., & Niebur, E. (2002). Modeling the role of salience in the allocation of overt visual attention. Vision Research, 42(1), 107–123.PubMed Parkhurst, D., Law, K., & Niebur, E. (2002). Modeling the role of salience in the allocation of overt visual attention. Vision Research, 42(1), 107–123.PubMed
go back to reference Pearson, J., Naselaris, T., Holmes, E. A., & Kosslyn, S. M. (2015). Mental imagery: Functional Mechanisms and clinical applications. Trends in Cognitive Sciences, 19(10), 590–602.PubMedPubMedCentral Pearson, J., Naselaris, T., Holmes, E. A., & Kosslyn, S. M. (2015). Mental imagery: Functional Mechanisms and clinical applications. Trends in Cognitive Sciences, 19(10), 590–602.PubMedPubMedCentral
go back to reference Richardson, D. C., & Spivey, M. J. (2000). Representation, space and Hollywood Squares: Looking at things that aren’t there anymore. Cognition, 76(3), 269–295.PubMed Richardson, D. C., & Spivey, M. J. (2000). Representation, space and Hollywood Squares: Looking at things that aren’t there anymore. Cognition, 76(3), 269–295.PubMed
go back to reference Rouw, R., Kosslyn, S. M., & Hamel, R. (1997). Detecting high-level and low-level properties in visual images and visual percepts. Cognition, 63(2), 209–226.PubMed Rouw, R., Kosslyn, S. M., & Hamel, R. (1997). Detecting high-level and low-level properties in visual images and visual percepts. Cognition, 63(2), 209–226.PubMed
go back to reference Scholz, A., Klichowicz, A., & Krems, J. F. (2018). Covert shifts of attention can account for the functional role of “eye movements to nothing.” Memory & Cognition, 46(2), 230–243. Scholz, A., Klichowicz, A., & Krems, J. F. (2018). Covert shifts of attention can account for the functional role of “eye movements to nothing.” Memory & Cognition, 46(2), 230–243.
go back to reference Scholz, A., Mehlhorn, K., & Krems, J. F. (2016). Listen up, eye movements play a role in verbal memory retrieval. Psychological Research, 80(1), 149–158.PubMed Scholz, A., Mehlhorn, K., & Krems, J. F. (2016). Listen up, eye movements play a role in verbal memory retrieval. Psychological Research, 80(1), 149–158.PubMed
go back to reference Schütz, A. C., Braun, D. I., & Gegenfurtner, K. R. (2011). Eye movements and perception: A selective review. Journal of Vision, 11(5), 1–30. Schütz, A. C., Braun, D. I., & Gegenfurtner, K. R. (2011). Eye movements and perception: A selective review. Journal of Vision, 11(5), 1–30.
go back to reference Slotnick, S. D., Thompson, W. L., & Kosslyn, S. M. (2005). Visual mental imagery induces retinotopically organized activation of early visual areas. Cerebral Cortex, 15(10), 1570–1583.PubMed Slotnick, S. D., Thompson, W. L., & Kosslyn, S. M. (2005). Visual mental imagery induces retinotopically organized activation of early visual areas. Cerebral Cortex, 15(10), 1570–1583.PubMed
go back to reference Spivey, M. J., & Geng, J. J. (2001). Oculomotor mechanisms activated by imagery and memory: Eye movements to absent objects. Psychological Research, 65(4), 235–241.PubMed Spivey, M. J., & Geng, J. J. (2001). Oculomotor mechanisms activated by imagery and memory: Eye movements to absent objects. Psychological Research, 65(4), 235–241.PubMed
go back to reference Stokes, M., Thompson, R., Cusack, R., & Duncan, J. (2009). Top-down activation of shape-specific population codes in visual cortex during mental imagery. Journal of Neuroscience, 29(5), 1565–1572.PubMed Stokes, M., Thompson, R., Cusack, R., & Duncan, J. (2009). Top-down activation of shape-specific population codes in visual cortex during mental imagery. Journal of Neuroscience, 29(5), 1565–1572.PubMed
go back to reference Tatler, B. W., Baddeley, R. J., & Gilchrist, I. D. (2005). Visual correlates of fixation selection: Effects of scale and time. Vision Research, 45(5), 643–659.PubMed Tatler, B. W., Baddeley, R. J., & Gilchrist, I. D. (2005). Visual correlates of fixation selection: Effects of scale and time. Vision Research, 45(5), 643–659.PubMed
go back to reference Wantz, A. L., Martarelli, C. S., & Mast, F. W. (2016). When looking back to nothing goes back to nothing. Cognitive Processing, 17(1), 105–114.PubMed Wantz, A. L., Martarelli, C. S., & Mast, F. W. (2016). When looking back to nothing goes back to nothing. Cognitive Processing, 17(1), 105–114.PubMed
go back to reference Willenbockel, V., Sadr, J., Fiset, D., Horne, G. O., Gosselin, F., & Tanaka, J. W. (2010). Controlling low-level image properties: The SHINE toolbox. Behavior and Research Methods, 42(3), 671–684. Willenbockel, V., Sadr, J., Fiset, D., Horne, G. O., Gosselin, F., & Tanaka, J. W. (2010). Controlling low-level image properties: The SHINE toolbox. Behavior and Research Methods, 42(3), 671–684.
go back to reference Yarbus, A. L. (1967). Eye Movements and Vision. Plenum Press. Yarbus, A. L. (1967). Eye Movements and Vision. Plenum Press.
Metadata
Title
Pictorial low-level features in mental images: evidence from eye fixations
Authors
Corinna S. Martarelli
Fred W. Mast
Publication date
22-03-2021
Publisher
Springer Berlin Heidelberg
Published in
Psychological Research / Issue 2/2022
Print ISSN: 0340-0727
Electronic ISSN: 1430-2772
DOI
https://doi.org/10.1007/s00426-021-01497-3
