ABSTRACT
Vision-based head trackers have been available for some years and are beginning to be commercialized, but problems remain with respect to usability. Users who cannot operate traditional pointing devices (the intended audience of such systems) have no alternative if the automatic bootstrapping process fails; face tracking itself leaves room for improvement; and the pointer movement dynamics do not support accurate and efficient pointing. This paper describes a novel head-tracking pointer that addresses these problems.
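The "pointer movement dynamics" mentioned above concern how tracked head motion is mapped to cursor motion. A minimal sketch of one common approach (a dead zone plus a nonlinear gain curve) is shown below; the function name and all constants are illustrative assumptions, not taken from the paper:

```python
def pointer_velocity(head_displacement_px: float,
                     dead_zone: float = 2.0,
                     gain: float = 0.4,
                     exponent: float = 1.5) -> float:
    """Map a head displacement (pixels from a neutral pose) to a cursor
    velocity. The dead zone suppresses jitter near rest, and the
    power-law gain gives fine control for small motions while still
    allowing fast traversal for large ones. All parameters are
    hypothetical defaults for illustration."""
    magnitude = abs(head_displacement_px)
    if magnitude <= dead_zone:
        return 0.0  # ignore small tremor/noise around the neutral pose
    sign = 1.0 if head_displacement_px > 0 else -1.0
    return sign * gain * (magnitude - dead_zone) ** exponent
```

Tuning the dead zone and exponent trades off pointing precision against traversal speed, which is the usability tension the abstract alludes to.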
Index Terms
- Improvements in vision-based pointer control