DOI: 10.1145/2993901.2993905

Research Article

Supporting Exploration of Eye Tracking Data: Identifying Changing Behaviour Over Long Durations

Published: 24 October 2016

ABSTRACT

Visual analytics of eye tracking data is a common evaluation tool across diverse fields. In this position paper we propose a novel user-driven, interactive data exploration tool for understanding the characteristics of eye gaze movements and how these behaviours change over time. Eye tracking experiments generate multidimensional scan path data with sequential information. Many mathematical methods have analysed one or a few attributes of the scan path data and derived attributes such as Areas of Interest (AoIs), statistical measures, geometry, and domain-specific features. In our work we are interested in the visual analytics of one of these derived attributes of the sequential data: the AoIs and the sequences of visits to these AoIs over time. For static stimuli, such as images, and dynamic stimuli, such as videos, predefined or fixed AoIs are not an efficient basis for analysing scan path patterns. A user's AoIs over a stimulus may evolve over time, so determining AoIs dynamically through temporal clustering could be a better method for analysing eye gaze patterns. In this work we focus primarily on the challenges in the analysis and visualization of the temporal evolution of AoIs. The paper discusses existing methods, their shortcomings, and the scope for improvement by adapting visual analytics methods for event-based temporal data to the analysis of eye tracking data.
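As a concrete illustration of the idea of determining AoIs dynamically, the sketch below (not the authors' implementation) clusters fixations spatially within fixed-length time windows and reads off each window's ordered sequence of AoI visits. The (t, x, y) fixation format, the window length, the choice of mean shift as the spatial clusterer, and the bandwidth value are all assumptions made for the example, not parameters reported in the paper.

```python
"""Illustrative sketch: time-windowed clustering of fixations into AoIs.

Assumptions (not from the paper): fixations arrive as time-sorted
(t, x, y) rows in seconds and pixels, mean shift is used as the spatial
clusterer, and a fixed window length approximates temporal clustering.
"""
import numpy as np
from sklearn.cluster import MeanShift


def windowed_aois(fixations, window_s=10.0, bandwidth=80.0):
    """Cluster fixations per time window; return per-window AoI labels.

    fixations: array of shape (n, 3) with columns (t, x, y), sorted by t.
    window_s:  window length in seconds (hypothetical default).
    bandwidth: mean-shift kernel bandwidth in pixels (hypothetical default).
    """
    t = fixations[:, 0]
    results = []
    for start in np.arange(t.min(), t.max(), window_s):
        mask = (t >= start) & (t < start + window_s)
        if mask.sum() < 2:
            continue  # too few fixations to cluster in this window
        xy = fixations[mask, 1:3]
        labels = MeanShift(bandwidth=bandwidth).fit_predict(xy)
        # The time-ordered labels give the window's AoI-visit sequence;
        # consecutive duplicates are collapsed so only transitions remain.
        # Note: labels are local to each window and are not linked across windows.
        seq = [labels[0]] + [b for a, b in zip(labels, labels[1:]) if a != b]
        results.append((start, labels, seq))
    return results


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic gaze: 60 s of fixations drifting between two screen regions.
    t = np.sort(rng.uniform(0, 60, 300))
    x = np.where(t < 30, 300, 900) + rng.normal(0, 40, t.size)
    y = 400 + rng.normal(0, 40, t.size)
    for start, labels, seq in windowed_aois(np.column_stack([t, x, y])):
        print(f"window {start:5.1f}s: {len(set(labels))} AoIs, visits {seq}")
```

A sliding or adaptive window, or a clustering method that treats time as an additional dimension, would be a natural refinement; because the labels here are local to each window, linking AoIs across windows remains exactly the kind of temporal-evolution challenge the paper raises.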


• Published in

  BELIV '16: Proceedings of the Sixth Workshop on Beyond Time and Errors on Novel Evaluation Methods for Visualization
  October 2016
  177 pages
  ISBN: 9781450348188
  DOI: 10.1145/2993901

    Copyright © 2016 ACM

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    • Published: 24 October 2016


    Qualifiers

    • research-article
    • Research
    • Refereed limited

    Acceptance Rates

Overall acceptance rate: 45 of 64 submissions, 70%
