ABSTRACT
Multimodal interaction offers many potential benefits for data visualization. It can help people stay in the flow of their visual analysis and presentation, with the strengths of one interaction modality offsetting the weaknesses of others. Furthermore, multimodal interaction offers strong promise for leveraging data visualization on diverse display hardware including mobile, AR/VR, and large displays. However, prior research on visualization and interaction techniques has mostly explored a single input modality such as mouse, touch, pen, or more recently, natural language. The unique challenges and opportunities of synergistic multimodal interaction for data visualization have yet to be investigated. This workshop will bring together researchers with expertise in visualization, interaction design, and natural user interfaces. We aim to build a community of researchers focusing on multimodal interaction for data visualization, explore opportunities and challenges in our research, and establish an agenda for multimodal interaction research specifically for data visualization.
Index Terms
- Multimodal interaction for data visualization