Computer Science
Tracking, analysis, and recognition of human gestures in video
Document Type
Conference Paper
Abstract
An overview of research in automated gesture spotting, tracking, and recognition by the Image and Video Computing Group at Boston University is given. Approaches for localizing and tracking human hands in video, estimating hand shape and upper-body pose, tracking head and facial motion, and efficiently spotting and recognizing specific gestures in video streams are summarized. Methods for efficient dimensionality reduction of gesture time series, boosting of classifiers for nearest-neighbor search in pose space, and model-based pruning of gesture alignment hypotheses are described. The algorithms are demonstrated in three domains: American Sign Language, hand signals such as those employed by flight directors on airport runways, and gesture-based interfaces for severely disabled users. The methods are general and can be applied in other domains that require efficient detection and analysis of patterns in time series, images, or video. © 2005 IEEE.
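To make the gesture-alignment idea in the abstract concrete, the sketch below aligns a query gesture trajectory to a model gesture with dynamic time warping and a simple Sakoe-Chiba band. This is only an illustrative assumption of how time-series alignment with pruning can work, not the paper's model-based pruning algorithm; the function names, feature choice, and band width are all hypothetical.

```python
# Minimal sketch (not the authors' exact method): DTW alignment of two
# gesture trajectories, with a band constraint as a stand-in for pruning
# alignment hypotheses. Names and parameters are illustrative assumptions.
import numpy as np

def dtw_distance(query, model, band=10):
    """Return the DTW alignment cost between two trajectories.

    query, model: arrays of shape (T, d) holding per-frame feature vectors
    (e.g., hand centroid positions). band: maximum allowed index offset
    between aligned frames; cells outside the band are never expanded.
    """
    n, m = len(query), len(model)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        j_lo = max(1, i - band)
        j_hi = min(m, i + band)
        for j in range(j_lo, j_hi + 1):
            d = np.linalg.norm(query[i - 1] - model[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a model frame
                                 cost[i, j - 1],      # skip a query frame
                                 cost[i - 1, j - 1])  # match both frames
    return cost[n, m]

if __name__ == "__main__":
    # Toy example: a circular hand trajectory and a faster, noisy copy of it.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 60)
    model = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)], axis=1)
    query = model[::2] + 0.05 * rng.standard_normal((30, 2))
    print(f"DTW cost: {dtw_distance(query, model):.3f}")
```

In a recognition setting, a query segment would be compared against each gesture class model in this way and assigned the label of the lowest-cost alignment; the band here simply caps how far the warping path may stray, whereas the paper describes model-based pruning of alignment hypotheses.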
Publication Title
Proceedings of the International Conference on Document Analysis and Recognition, ICDAR
Publication Date
2005
Volume
2005
First Page
806
Last Page
810
ISSN
1520-5363
ISBN
9780769524207
DOI
10.1109/ICDAR.2005.243
Repository Citation
Sclaroff, Stan; Betke, Margrit; Kollios, George; Alon, Jonathan; Athitsos, Vassilis; Li, Rui; Magee, John; and Tian, Tai Peng, "Tracking, analysis, and recognition of human gestures in video" (2005). Computer Science. 49.
https://commons.clarku.edu/faculty_computer_sciences/49
APA Citation
Sclaroff, S., Betke, M., Kollios, G., Alon, J., Athitsos, V., Li, R., ... & Tian, T. P. (2005, August). Tracking, analysis, and recognition of human gestures in video. In Eighth International Conference on Document Analysis and Recognition (ICDAR'05) (pp. 806-810). IEEE.