DOI: 10.1145/2702613.2732878
Work in Progress

TellTale: Adding a Polygraph to Everyday Life

Published: 18 April 2015

ABSTRACT

TellTale is a wearable device that seeks to augment communication with subconscious emotion information. By sensing a user's heart rate and galvanic skin response, two major physiological indicators of emotional state, TellTale can provide insight into true physiological and emotional response. In this way, TellTale acts as a playful, wearable polygraph or lie detector. Through abstracted visualisations of the physiological data, we aim to position TellTale in line with the learned skills of communication. In this paper, we motivate the design of TellTale, detail a prototype device and pilot study, and present future areas for TellTale's exploration.
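The abstract does not describe TellTale's signal processing, so the sketch below is only a hedged illustration of the general idea: blending simulated heart-rate and galvanic skin response readings into a single abstracted value that could drive an ambient visualisation. The Sample type, baseline ranges, and equal weighting are assumptions made for illustration, not details from the paper.

# Hypothetical illustration (not the authors' implementation): combine
# simulated heart-rate and galvanic-skin-response readings into one
# abstracted "arousal" value suitable for an ambient visualisation.

from dataclasses import dataclass


@dataclass
class Sample:
    heart_rate_bpm: float    # beats per minute
    gsr_microsiemens: float  # skin conductance


def normalise(value: float, low: float, high: float) -> float:
    """Clamp and scale a reading into the 0..1 range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))


def arousal_index(sample: Sample) -> float:
    """Blend the two normalised signals into one abstract level.

    The baseline ranges and equal weighting are illustrative
    assumptions, not values reported in the paper.
    """
    hr = normalise(sample.heart_rate_bpm, 60.0, 120.0)
    gsr = normalise(sample.gsr_microsiemens, 1.0, 20.0)
    return 0.5 * hr + 0.5 * gsr


if __name__ == "__main__":
    calm = Sample(heart_rate_bpm=68, gsr_microsiemens=2.5)
    tense = Sample(heart_rate_bpm=105, gsr_microsiemens=14.0)
    print(f"calm:  {arousal_index(calm):.2f}")   # lower abstract value
    print(f"tense: {arousal_index(tense):.2f}")  # higher abstract value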

Published in

      CHI EA '15: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems
      April 2015
      2546 pages
      ISBN:9781450331463
      DOI:10.1145/2702613

      Copyright © 2015 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 18 April 2015

      Qualifiers

      • Work in Progress

      Acceptance Rates

CHI EA '15 Paper Acceptance Rate: 379 of 1,520 submissions, 25%
Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%
