Tailor: Creating Custom User Interfaces Based on Gesture
Physical controls for most devices are either "one size fits all" or require custom hardware for each user. Cost often prohibits custom design, so each user must adapt to the standard device interface, typically with a loss of precision and efficiency. When user abilities vary widely, as in the disabled community, devices often become unusable. Our goal is to create a system that tracks user gestures and interprets them as control signals for devices. Existing gesture recognition research converts continuous body motion into discrete symbols; our approach is to map continuous motions into a set of analog device control signals. Our system will allow us to quickly tailor a device interface to each user's best physical range of motion. Our first application domain is a speech synthesizer for disabled users. We expect two major areas of applicability for non-disabled users: in telemanipulator interfaces, and as a design tool for creating biomechanically efficient interfaces.
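The core idea, mapping a user's individual range of motion onto a continuous analog control signal rather than discrete symbols, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the linear mapping, and the calibration values are all assumptions.

```python
# Hypothetical sketch of per-user tailoring: a continuous sensor reading,
# calibrated to one user's comfortable range of motion, is mapped onto a
# normalized analog control signal in [0, 1]. The linear map and clamping
# are illustrative assumptions, not the system described in the paper.

def tailor_mapping(lo: float, hi: float):
    """Return a function mapping raw readings in [lo, hi] to [0, 1]."""
    if hi <= lo:
        raise ValueError("calibrated range must be non-empty")
    span = hi - lo

    def control_signal(raw: float) -> float:
        # Clamp to the user's calibrated range, then normalize, so motion
        # outside the comfortable range saturates rather than overshoots.
        clamped = min(max(raw, lo), hi)
        return (clamped - lo) / span

    return control_signal

# Example: a user whose comfortable elbow travel spans 10 to 40 degrees.
elbow = tailor_mapping(10.0, 40.0)
```

With this calibration, a reading of 25 degrees (mid-range) yields a control value of 0.5, and readings outside the range saturate at 0 or 1, which is one plausible way a fixed device interface could be adapted to each user's best range of motion.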
All rights reserved (no additional license for public reuse)
Pausch, Randy, and Ronald Williams. "Tailor: Creating Custom User Interfaces Based on Gesture." University of Virginia Dept. of Computer Science Tech Report (1990).
University of Virginia, Department of Computer Science