ABSTRACT
We can touch things, and our senses tell us when our hands are touching something. But most computer input devices cannot detect when the user touches or releases the device or some portion of the device. Thus, adding touch sensors to input devices offers many possibilities for novel interaction techniques. We demonstrate the TouchTrackball and the Scrolling TouchMouse, which use unobtrusive capacitance sensors to detect contact from the user's hand without requiring pressure or mechanical actuation of a switch. We further demonstrate how the capabilities of these devices can be matched to an implicit interaction technique, the On-Demand Interface, which uses the passive information captured by touch sensors to fade in or fade out portions of a display depending on what the user is doing; a second technique uses explicit, intentional interaction with touch sensors for enhanced scrolling. We present our new devices in the context of a simple taxonomy of tactile input technologies. Finally, we discuss the properties of touch-sensing as an input channel in general.
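The On-Demand Interface described above can be pictured as a small state machine: while the touch sensor reports contact, on-demand UI elements fade in; on release, they fade back out. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation; the class name `TouchFadeController`, the per-frame fade rates, and the `step` method are all illustrative assumptions.

```python
# Hypothetical sketch of an On-Demand-style fade controller: UI elements
# fade in while the touch sensor reports contact and fade out on release.
# All names and rate constants are illustrative, not from the paper.

class TouchFadeController:
    def __init__(self, fade_in_rate=0.25, fade_out_rate=0.10):
        self.opacity = 0.0          # 0.0 = fully faded out, 1.0 = fully visible
        self.touching = False
        self.fade_in_rate = fade_in_rate
        self.fade_out_rate = fade_out_rate

    def on_touch(self):
        # Called when the capacitance sensor first detects hand contact.
        self.touching = True

    def on_release(self):
        # Called when contact is lost.
        self.touching = False

    def step(self):
        """Advance one animation frame; return the current opacity."""
        if self.touching:
            self.opacity = min(1.0, self.opacity + self.fade_in_rate)
        else:
            self.opacity = max(0.0, self.opacity - self.fade_out_rate)
        return self.opacity

ctrl = TouchFadeController()
ctrl.on_touch()
for _ in range(4):      # four frames of fade-in at 0.25/frame
    ctrl.step()
print(ctrl.opacity)     # fully visible after four frames
```

Separating sensing events (`on_touch`/`on_release`) from per-frame animation (`step`) mirrors how passive touch data could drive a gradual fade rather than an abrupt show/hide.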
Index Terms
- Touch-sensing input devices