DOI: 10.1145/3543758.3543766
Research Article · Open Access

The Gesture Authoring Space: Authoring Customised Hand Gestures for Grasping Virtual Objects in Immersive Virtual Environments

Published: 15 September 2022

ABSTRACT

Natural user interfaces are on the rise. Manufacturers of Augmented, Virtual, and Mixed Reality head-mounted displays are increasingly integrating new sensors into their consumer-grade products, allowing gesture recognition without additional hardware. This offers new possibilities for bare-handed interaction within virtual environments. This work proposes a hand gesture authoring tool for object-specific grab gestures that allows virtual objects to be grabbed as in the real world. The presented solution uses template matching for gesture recognition and requires no technical knowledge to design and create custom-tailored hand gestures. In a user study, the proposed approach is compared with the pinch gesture and the controller for grasping virtual objects. The grasping techniques are compared in terms of accuracy, task completion time, usability, and naturalness. The study showed that gestures created with the proposed approach are perceived by users as a more natural input modality than the other two techniques.
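The authors' implementation is not reproduced on this page. As a minimal illustrative sketch of the template-matching idea described in the abstract, the following Python snippet compares a live hand pose against a recorded per-object grab template using the mean per-joint distance; the function names, joint layout, and threshold are assumptions for illustration, not the paper's actual method.

import numpy as np

def normalize_pose(joint_positions, wrist_index=0):
    # Express joints relative to the wrist so the comparison is
    # invariant to where the hand is located in the tracking space.
    joints = np.asarray(joint_positions, dtype=float)
    return joints - joints[wrist_index]

def matches_template(live_pose, template_pose, threshold=0.02):
    # Recognise the gesture when the mean per-joint distance between
    # the live pose and the stored template falls below the threshold.
    # Both arrays are assumed to have shape [n_joints, 3], in metres.
    live = normalize_pose(live_pose)
    template = normalize_pose(template_pose)
    mean_distance = np.linalg.norm(live - template, axis=1).mean()
    return mean_distance < threshold

In such a scheme, each object authored in the tool would store one or more templates captured while the user demonstrates the grasp, and the check would run every frame against the tracked hand.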


Supplemental Material

GestureAuthoringSpace_Supplementary.mp4 (mp4, 6.7 MB)


