Book contents
- Frontmatter
- Contents
- Foreword: Out of Sight, Out of Mind
- Preface
- Part one New Interfaces and Novel Applications
- Part two Tracking Human Action
- Part three Gesture Recognition and Interpretation
- 11 A Framework for Gesture Generation and Interpretation
- 12 Model-Based Interpretation of Faces and Hand Gestures
- 13 Recognition of Hand Signs from Complex Backgrounds
- 14 Probabilistic Models of Verbal and Body Gestures
- 15 Looking at Human Gestures
- Acknowledgements
- Bibliography
- List of contributors
13 - Recognition of Hand Signs from Complex Backgrounds
from Part three - Gesture Recognition and Interpretation
Published online by Cambridge University Press: 06 July 2010
Abstract
In this chapter, we present our approach to recognizing hand signs. It addresses three key aspects of hand sign interpretation: hand location, hand shape, and hand movement. The approach has two major components: (a) a prediction-and-verification segmentation scheme that segments the moving hand from its background; (b) a recognizer that identifies the hand sign from the temporal sequence of segmented hands together with their global motion information. The segmentation scheme can deal with a large number of different hand shapes against complex backgrounds. In the recognition part, we use multiclass, multi-dimensional discriminant analysis in every internal node of a recursive partition tree to automatically select the most discriminating linear features for gesture classification. The method has been tested on 28 classes of hand signs. The experimental results show that the system achieves a 93% recognition rate on test sequences that were not used in the training phase.
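At each internal node of the recursive partition tree, the recognizer selects the most discriminating linear features via multiclass discriminant analysis. A minimal sketch of that feature-selection step is below — this is not the authors' implementation, just standard multiclass Fisher discriminant analysis; the function name and toy data are illustrative:

```python
import numpy as np

def most_discriminating_features(X, y, k):
    """Multiclass Fisher discriminant analysis.

    Returns the top-k linear features (columns of W) that maximize
    between-class scatter relative to within-class scatter.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # The most discriminating features are the leading eigenvectors
    # of Sw^-1 Sb (Sw is lightly regularized to keep it invertible).
    eigvals, eigvecs = np.linalg.eig(
        np.linalg.inv(Sw + 1e-6 * np.eye(d)) @ Sb)
    order = np.argsort(-eigvals.real)
    return eigvecs.real[:, order[:k]]

# Toy example: three classes of 2-D points with well-separated means
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.1, size=(20, 2))
               for m in ([0, 0], [2, 0], [0, 2])])
y = np.repeat([0, 1, 2], 20)
W = most_discriminating_features(X, y, k=2)
Z = X @ W  # projected samples, as would be classified at a tree node
```

In the recursive partition tree described above, a projection like `W` would be recomputed at every internal node using only the training samples that reach that node, so each node specializes its features to the classes it still has to separate.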
Introduction
The ability to interpret hand gestures is essential if computer systems are to interact with human users in a natural way. In this chapter, we present a new vision-based framework that allows the computer to interact with users through hand signs.
Since its first known dictionary was printed in 1856 [61], American Sign Language (ASL) has been widely used in the deaf community, as well as by hearing-impaired people who are not deaf [49].
- Type: Chapter
- Information: Computer Vision for Human-Machine Interaction, pp. 235-266
- Publisher: Cambridge University Press
- Print publication year: 1998