
Foreground and background interaction with sensor-enhanced mobile devices

Published: 01 March 2005

Abstract

Building on Buxton's foreground/background model, we discuss the importance of explicitly considering both foreground interaction and background interaction, as well as transitions between foreground and background, in the design and implementation of sensing techniques for sensor-enhanced mobile devices. Our view is that the foreground concerns deliberate user activity where the user is attending to the device, while the background is the realm of inattention or split attention, using naturally occurring user activity as an input that allows the device to infer or anticipate user needs. The five questions for sensing systems of Bellotti et al. [2002], proposed as a framework for this special issue, primarily address the foreground but neglect critical issues with background sensing. To support our perspective, we discuss a variety of foreground and background sensing techniques that we have implemented for sensor-enhanced mobile devices, such as powering on the device when the user picks it up, sensing when the user is holding the device to his ear, automatically switching between portrait and landscape display orientations depending on how the user is holding the device, and scrolling the display using tilt. We also contribute system architecture issues, such as using the foreground/background model to handle cross-talk between multiple sensor-based interaction techniques, and theoretical perspectives, such as a classification of recognition errors based on explicitly considering transitions between the foreground and background. Based on our experiences, we propose design issues and lessons learned for foreground/background sensing systems.
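As a concrete illustration of one of the background techniques named above, the following is a minimal sketch, in Python, of tilt-driven portrait/landscape switching with hysteresis. It is not the authors' implementation; the class name, the thresholds, and the assumption of a two-axis accelerometer reporting left/right tilt in degrees are invented here for illustration.

    # Hypothetical sketch of tilt-driven display orientation switching.
    # Assumes a two-axis accelerometer reporting left/right tilt in degrees;
    # thresholds are illustrative, not taken from the paper.

    PORTRAIT = "portrait"
    LANDSCAPE_LEFT = "landscape-left"
    LANDSCAPE_RIGHT = "landscape-right"

    class OrientationSensor:
        def __init__(self, enter_threshold=30.0, exit_threshold=20.0):
            # Hysteresis: a landscape orientation is entered only past
            # enter_threshold, but is kept until tilt drops below
            # exit_threshold, so the display does not flicker near the boundary.
            self.enter = enter_threshold
            self.exit = exit_threshold
            self.orientation = PORTRAIT

        def update(self, left_right_deg):
            """Map a raw left/right tilt angle to a display orientation."""
            if self.orientation == PORTRAIT:
                if left_right_deg > self.enter:
                    self.orientation = LANDSCAPE_RIGHT
                elif left_right_deg < -self.enter:
                    self.orientation = LANDSCAPE_LEFT
            elif abs(left_right_deg) < self.exit:
                self.orientation = PORTRAIT
            return self.orientation

    sensor = OrientationSensor()
    print(sensor.update(45.0))   # "landscape-right"
    print(sensor.update(25.0))   # still "landscape-right" (hysteresis)
    print(sensor.update(5.0))    # "portrait"

The hysteresis band is the interesting design choice: because the sensing runs in the background, the device must avoid reacting to incidental hand movement, which a single sharp threshold would not.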

References

  1. Bartlett, J. F. 2000. Rock 'n' scroll is here to stay. IEEE Comput. Graph. Appl. (May/June): 40--45.
  2. Baxter, L. K. 1997. Capacitive Sensors: Design and Applications. New York: The Institute of Electrical and Electronics Engineers.
  3. Bellotti, V., Back, M., Edwards, W. K., Grinter, R., Lopes, C., and Henderson, A. 2002. Making sense of sensing systems: Five questions for designers and researchers. In Proceedings of the ACM CHI 2002 Conference on Human Factors in Computing Systems, Minneapolis, MN, 415--422.
  4. Buxton, W. 1995. Integrating the periphery and context: A new taxonomy of telematics. In Proceedings of Graphics Interface '95, Quebec City, Quebec, Canada, 239--246.
  5. Dietz, P. and Yerazunis, W. 2001. Real-time audio buffering for telephone applications. In Proceedings of the ACM UIST 2001 Symposium on User Interface Software & Technology, Orlando, FL, 193--194.
  6. Dix, A. 2002. Beyond intention: Pushing boundaries with incidental interaction. In Proceedings of Building Bridges: Interdisciplinary Context-Sensitive Computing, Glasgow University, 1--6.
  7. Harrison, B., Fishkin, K., Gujar, A., Mochon, C., and Want, R. 1998. Squeeze me, hold me, tilt me! An exploration of manipulative user interfaces. In Proceedings of the ACM CHI'98 Conference on Human Factors in Computing Systems, Los Angeles, CA, 17--24.
  8. Hinckley, K. 2003a. Distributed and local sensing techniques for face-to-face collaboration. In Proceedings of the ICMI-PUI'03 Fifth International Conference on Multimodal Interfaces, Vancouver, BC, Canada, 81--84.
  9. Hinckley, K. 2003b. Synchronous gestures for multiple users and computers. In Proceedings of the ACM UIST'03 Symposium on User Interface Software & Technology, Vancouver, BC, Canada, 149--158.
  10. Hinckley, K. and Horvitz, E. 2001. Towards more sensitive mobile phones. In Proceedings of the ACM UIST 2001 Symposium on User Interface Software & Technology, Orlando, FL, 191--192.
  11. Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E. 2000. Sensing techniques for mobile interaction. In Proceedings of the ACM UIST 2000 Symposium on User Interface Software & Technology, San Diego, CA, 91--100.
  12. Holmquist, L., Mattern, F., Schiele, B., Alahuhta, P., Beigl, M., and Gellersen, H. 2001. Smart-Its friends: A technique for users to easily establish connections between smart artefacts. In Proceedings of Ubicomp 2001, Atlanta, GA, Springer-Verlag, 116--122.
  13. Horvitz, E. 1999. Principles of mixed-initiative user interfaces. In Proceedings of the ACM CHI'99 Conference on Human Factors in Computing Systems, Pittsburgh, PA, 159--166.
  14. Horvitz, E., Jacobs, A., and Hovel, D. 1999. Attention-sensitive alerting. In Proceedings of UAI '99, Conference on Uncertainty and Artificial Intelligence, Stockholm, Sweden, 305--313.
  15. Ishii, H. and Ullmer, B. 1997. Tangible bits: Towards seamless interfaces between people, bits, and atoms. In Proceedings of the ACM CHI'97 Conference on Human Factors in Computing Systems, Atlanta, GA, 234--241.
  16. Nielsen, J. 1993. Noncommand user interfaces. Comm. ACM 36 (4): 83--89.
  17. Norman, D. A. 1981. Categorization of action slips. Psyc. Rev. 88 (1): 1--15.
  18. Partridge, K., Chatterjee, S., Sazawal, V., Borriello, G., and Want, R. 2002. TiltType: Accelerometer-supported text entry for very small devices. In Proceedings of the ACM UIST 2002 Symposium on User Interface Software & Technology, Paris, France.
  19. Rekimoto, J. 1996. Tilting operations for small screen interfaces. In Proceedings of the ACM UIST'96 Symposium on User Interface Software & Technology, Seattle, WA, 167--168.
  20. Rekimoto, J. 1997. Pick-and-drop: A direct manipulation technique for multiple computer environments. In Proceedings of the ACM UIST'97 Symposium on User Interface Software & Technology, Banff, Alberta, Canada, 31--39.
  21. Saffo, P. 1997. Sensors: The next wave of infotech innovation. Institute for the Future: 1997 Ten-Year Forecast, 115--122.
  22. Schilit, B. N., Adams, N. I., and Want, R. 1994. Context-aware computing applications. In Proceedings of the IEEE Workshop on Mobile Computing Systems and Applications, Santa Cruz, CA, IEEE Computer Society, 85--90.
  23. Schmidt, A. 2000. Implicit human-computer interaction through context. Personal Technologies 4 (2&3): 191--199.
  24. Schmidt, A., Beigl, M., and Gellersen, H.-W. 1999. There is more to context than location. Comput. Graph. 23 (6): 893--901.
  25. Sellen, A., Kurtenbach, G., and Buxton, W. 1992. The prevention of mode errors through sensory feedback. Hum. Comput. Inter. 7 (2): 141--164.
  26. Small, D. and Ishii, H. 1997. Design of spatially aware graspable displays. In CHI'97 Conference Companion, Atlanta, GA, 367--368.
  27. Want, R., Fishkin, K. P., Gujar, A., and Harrison, B. L. 1999. Bridging physical and virtual worlds with electronic tags. In Proceedings of the ACM CHI'99 Conference on Human Factors in Computing Systems, Pittsburgh, PA, 370--377.
  28. Wigdor, D. and Balakrishnan, R. 2003. TiltText: Using tilt for text input to mobile phones. In Proceedings of the ACM UIST'03 Symposium on User Interface Software & Technology, Vancouver, BC, Canada, 81--90.


Reviews

John J. Hirschfelder

This paper reports the authors' research on sensor-enhanced handheld computing devices, typically personal digital assistants (PDAs). The authors outfitted a PDA with a touch sensor, a two-axis linear accelerometer, a gravity switch, and an infrared proximity sensor. The sensors can detect when the user is near the device, or is holding or tilting it. Sensor outputs are then used to move responsibility for some actions and decisions from the user to the system. For example, the unit turns itself on when it is picked up, and switches its display between portrait and landscape when the unit is rotated. The objective of the research was to identify general principles and lessons learned that could be considered in the design of sensor-enhanced products. The paper begins with a general discussion of the foreground/background dichotomy. Foreground refers to direct user attention to the device. Background refers to processing in which the user's intent is inferred from naturally occurring gestures relating to the device. This discussion is followed by descriptions of each of the background processing capabilities developed, and the lessons learned from the development process and usability testing. A significant contribution of this work is the message processing architecture used to control foreground/background transitions, and application access to sensor data. The issue addressed is cross-talk between multiple techniques for interpreting sensor data; the authors constructed a unique client-server message passing system to address this issue. The architecture is described in some detail, including a list of the message types, and their relation to foreground/background transitions.
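To make the cross-talk arbitration mentioned in the review concrete, here is a minimal sketch, in Python, of a dispatcher that suppresses background techniques while a foreground technique owns the sensors. The names (SensorArbiter, Ground, claim_foreground, dispatch) are invented for illustration; they are not the authors' actual client-server API or message types.

    # Hypothetical sketch of foreground/background arbitration of sensor
    # messages; names and structure are illustrative only.

    from enum import Enum, auto

    class Ground(Enum):
        FOREGROUND = auto()   # deliberate, attended interaction (e.g., tilt scrolling)
        BACKGROUND = auto()   # inferred, unattended sensing (e.g., auto-rotation)

    class SensorArbiter:
        """Routes sensor readings to registered techniques and suppresses
        background techniques while a foreground technique owns the sensors."""

        def __init__(self):
            self.handlers = []            # list of (ground, callback) pairs
            self.foreground_owner = None

        def register(self, ground, callback):
            self.handlers.append((ground, callback))

        def claim_foreground(self, owner):
            # E.g., tilt scrolling starts: auto-rotation must not also fire.
            self.foreground_owner = owner

        def release_foreground(self, owner):
            if self.foreground_owner == owner:
                self.foreground_owner = None

        def dispatch(self, reading):
            for ground, callback in self.handlers:
                if ground is Ground.BACKGROUND and self.foreground_owner:
                    continue   # background yields to avoid cross-talk
                callback(reading)

    arbiter = SensorArbiter()
    arbiter.register(Ground.BACKGROUND, lambda r: print("auto-rotate sees", r))
    arbiter.register(Ground.FOREGROUND, lambda r: print("tilt-scroll sees", r))
    arbiter.claim_foreground("tilt-scroll")
    arbiter.dispatch({"tilt": 12.0})   # only the foreground handler runs

The point of the sketch is the ownership rule, not the plumbing: a single explicit foreground/background state lets otherwise independent sensing techniques share the same accelerometer without fighting over its output.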
