
Communication Board and Visual Perception Training Contents with Gaze Tracking for Augmentative and Alternative Communication (AAC)


  • Department of Computer Engineering, Korea Polytechnic University, Siheung, Korea
  • Remed Co., Korea


This study proposed a gaze tracking system that detects the pupil and the center points of the glints in near-infrared camera images and, building on this system, implemented visual perception training contents and augmentative and alternative communication software. The center of the pupil and the centers of the two glint points in the eye images used for gaze tracking were extracted using a model with four kinds of simple features. The calculated distance between the two centers was used to locate the relative position of the eye gaze on the monitor, and the x and y coordinates of the screen were then mapped so that the mouse pointer follows the gaze. The visual perception training contents built on the developed gaze tracking system consist of components for visual acuity, eye movement speed, visual reaction rate, and visual concentration. Furthermore, software was implemented to provide an alternative means of communication through symbols that convey the meaning of vocabulary used in spoken language.
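The pupil–glint mapping described above can be illustrated with a minimal sketch. This is not the paper's exact model: it assumes a common pupil-center/corneal-reflection (PCCR) style approach in which the difference vector between the pupil center and the glint center is mapped to screen coordinates through a second-order polynomial fitted during a calibration phase. All function names and the polynomial form are illustrative assumptions.

```python
import numpy as np

def features(dx, dy):
    """Second-order polynomial feature vector for one pupil-glint sample.

    (dx, dy) is the assumed difference vector between the pupil center
    and the glint center in image coordinates.
    """
    return np.array([1.0, dx, dy, dx * dy, dx * dx, dy * dy])

def calibrate(vectors, screen_points):
    """Fit mapping coefficients from calibration samples.

    vectors       : (N, 2) pupil-glint difference vectors
    screen_points : (N, 2) known on-screen target coordinates the user
                    fixated during calibration
    Returns a (6, 2) coefficient matrix, one column per screen axis.
    """
    A = np.array([features(dx, dy) for dx, dy in vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs

def gaze_to_screen(coeffs, dx, dy):
    """Map one pupil-glint vector to an estimated (x, y) screen position."""
    return features(dx, dy) @ coeffs
```

In a full system, `gaze_to_screen` would drive the mouse pointer, so that fixating a symbol on the communication board selects it after a dwell time.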


Augmentative and Alternative Communication, Disabled, Gaze Tracking, Visual Perception.






This work is licensed under a Creative Commons Attribution 3.0 License.