Real-time Robust Detection and Extraction of Hand Gestures for HCI

Y. Zhu, K. Palaniappan, Y. Zhao, X. Zhuang (USA), and G. Xu (PRC)



Hand gestures offer a uniquely convenient approach to improving human-computer interfaces (HCI). More efficient and accurate inputs for interacting with 3D visualization systems are needed, since keyboard, menu, and pointer interfaces are cumbersome for defining 3D orientations and transformations. Hand gestures are an easily learned, almost universal, language-independent method for specifying 3D interactions. Robust algorithms for detecting and recognizing hand gestures in real time from video cameras are necessary to implement gesture-based interfaces. A new hierarchical spatio-temporal approach exploiting multiple cues (skin color, hand motion, and shape) is proposed for hand gesture recognition. The first stage employs a fusion of skin detectors combined with coarse image motion detectors. In the second stage, spatio-temporal signatures of hand gestures are efficiently extracted using robust parameterized image motion regression in conjunction with hand shape analysis. A modified dynamic time warping (MDTW) algorithm accounts for the time variation of the spatio-temporal appearance patterns due to varying gesturing rates. The incorporation of multiple cues provides robustness, and the multi-stage hierarchical approach enables real-time performance under normal illumination conditions. Experimental results on video streams containing hand gestures are discussed; the approach can be applied to advanced HCI for biomedicine.
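For context, the classic dynamic time warping recurrence that MDTW builds on can be sketched as follows. This is a minimal illustration of standard DTW on 1-D feature sequences, not the authors' modified algorithm; the function name and the use of absolute difference as the local cost are assumptions for the example.

```python
import math

def dtw_distance(a, b):
    """Classic dynamic time warping between two 1-D sequences.

    Returns the minimal cumulative alignment cost, allowing one
    sequence to be locally stretched or compressed relative to the
    other (here: the same gesture performed at different rates).
    """
    n, m = len(a), len(b)
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local distance (assumed metric)
            cost[i][j] = d + min(cost[i - 1][j],       # stretch a
                                 cost[i][j - 1],       # stretch b
                                 cost[i - 1][j - 1])   # step both
    return cost[n][m]
```

Because the warping path may repeat elements of either sequence, a gesture signature sampled at a slower rate (e.g. `[1, 2, 2, 3]`) still aligns at zero cost with its faster counterpart (`[1, 2, 3]`), which is the property exploited to handle varying gesturing rates.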
