Understanding Deformable-Gestures Using Paper Prototyping With Children

Affiliations

  • Department of Multimedia, Faculty of Computer Science and IT, Universiti Putra Malaysia, UPM Serdang, Malaysia
  • Department of Computer Science, College of Computer and Information Sciences, Prince Sultan University, Riyadh, Saudi Arabia

Abstract

Background/Objectives: This preliminary study presents the process and outputs of a paper prototyping activity aimed at understanding deformation-based gestures among children aged 5-6 years. Methods/Statistical Analysis: This was done by observing the children interacting with a mock deformable object, i.e. paper. As a result, we obtained child-defined deformable gestures that will inform the design and implementation of Organic User Interfaces (OUI) in the future, without being constrained by current technical limitations and challenges. A flexible or deformable interface is a type of OUI in which, once flexible displays are deployed, shape deformation such as bending is a key form of input. Findings: The study found that the children were strongly influenced by existing interaction methods, such as swiping on a screen or pressing physical buttons, rather than deformation gestures. The results show that the most common gestures were bending upwards, bending downwards, and unfolding. However, the children's preferences differed from real-life paper use, which typically favours portrait orientation: in this study, all of the children held their “device” in landscape orientation. Although deformable displays are still in their infancy, this study of preschool-aged children shows that their actions were intuitive even at this early stage of development. Application/Improvements: Future work should use a more interactive interface, include more variables to give a fuller picture of how such displays are used, and recruit a larger sample size.
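As a concrete illustration of deformation as input, the following is a minimal sketch, not taken from the study itself, of how the three most common child-defined gestures reported above (bend upwards, bend downwards, and unfold) might be mapped to commands on a flexible display. The sensor model, threshold value, and actions are all illustrative assumptions.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DeformationSample:
        bend_angle: float  # degrees; positive = bent upwards, negative = bent downwards
        folded: bool       # True while the display is folded shut

    BEND_THRESHOLD = 20.0  # assumed dead zone (degrees) to ignore incidental flexing

    def classify_gesture(prev: DeformationSample, curr: DeformationSample) -> Optional[str]:
        """Map two consecutive sensor samples to one of the gestures
        the children produced most often in this study."""
        if prev.folded and not curr.folded:
            return "unfold"     # e.g. open the device to wake it
        if curr.bend_angle > BEND_THRESHOLD:
            return "bend_up"    # e.g. page forward
        if curr.bend_angle < -BEND_THRESHOLD:
            return "bend_down"  # e.g. page backward
        return None             # no deliberate deformation detected

    # Example: an unfold followed by an upward bend.
    samples = [
        DeformationSample(bend_angle=0.0, folded=True),
        DeformationSample(bend_angle=0.0, folded=False),   # -> "unfold"
        DeformationSample(bend_angle=35.0, folded=False),  # -> "bend_up"
    ]
    for prev, curr in zip(samples, samples[1:]):
        print(classify_gesture(prev, curr))

The dead zone is itself a design choice: bend-gesture prototypes typically need to distinguish deliberate bends from the incidental flexing that occurs whenever a thin display is handled.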

Keywords

Children, Deformable, Organic User Interface, Paper Prototyping.

This work is licensed under a Creative Commons Attribution 3.0 License.