DOI: 10.1145/985692.985729
Article

Impact of interruption style on end-user debugging

Published: 25 April 2004

ABSTRACT

Although researchers have begun to explicitly support end-user programmers' debugging by providing information to help them find bugs, little research has addressed the proper mechanism for alerting the user to this information. The choice of alerting mechanism can be important because, as previous research has shown, different interruption styles have different potential advantages and disadvantages. To explore the impacts of interruptions in the end-user debugging domain, this paper describes an empirical comparison of two interruption styles that have been used to alert end-user programmers to debugging information. Our results show that negotiated-style interruptions were superior to immediate-style interruptions on several measures of importance to end-user debugging, and further suggest that a reason for this superiority may be that immediate-style interruptions encourage different debugging strategies.


Published in

CHI '04: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2004, 742 pages
ISBN: 1581137028
DOI: 10.1145/985692
Copyright © 2004 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 6,199 of 26,314 submissions (24%)
