Hand tracking for behaviour understanding

Authors:

Highlights:

Abstract

A real-time computer vision system is described for tracking hands, enabling behavioural events to be interpreted. Forearms are tracked to provide structural context, allowing mutual occlusion, which occurs when hands cross one another, to be handled robustly. No prior skin colour models are used. Instead, adaptive appearance models are learned on-line. A contour distance transform is used to control model adaptation and to fit 2D geometric models robustly. Hands can be tracked whether clothed or unclothed. Results are given for a ‘smart desk’ and an in-vehicle application. The ability to interpret behavioural events of interest when tracking a vehicle driver's hands is described.
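The abstract does not give implementation details for the contour distance transform, so the following is only an illustrative sketch under assumptions of my own: a two-pass, city-block chamfer distance transform over a binary contour image, plus a hypothetical `chamfer_cost` helper showing one common way such a transform can score how well a 2D geometric model's projected points fit an observed contour.

```python
def distance_transform(contour):
    """Two-pass city-block (L1) chamfer distance transform.

    contour: 2D list of 0/1 values, where 1 marks a contour pixel.
    Returns a 2D list giving each pixel's L1 distance to the
    nearest contour pixel.
    """
    h, w = len(contour), len(contour[0])
    inf = h + w  # upper bound on any L1 distance in the grid
    d = [[0 if contour[y][x] else inf for x in range(w)] for y in range(h)]
    # Forward pass: propagate distances from top-left neighbours.
    for y in range(h):
        for x in range(w):
            if y > 0:
                d[y][x] = min(d[y][x], d[y - 1][x] + 1)
            if x > 0:
                d[y][x] = min(d[y][x], d[y][x - 1] + 1)
    # Backward pass: propagate distances from bottom-right neighbours.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if y < h - 1:
                d[y][x] = min(d[y][x], d[y + 1][x] + 1)
            if x < w - 1:
                d[y][x] = min(d[y][x], d[y][x + 1] + 1)
    return d


def chamfer_cost(d, model_points):
    """Hypothetical fit score: sum of transform values at the
    model's projected (x, y) points. Lower means the geometric
    model lies closer to the observed contour."""
    return sum(d[y][x] for x, y in model_points)
```

A model-fitting loop could then minimise `chamfer_cost` over the model's pose parameters; the same distance field could also gate appearance-model adaptation, e.g. by updating only pixels far from the contour boundary. Both uses are assumptions here, not the paper's stated method.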

Keywords: Hand tracking, Gesture recognition, Behavioural events, Intelligent vehicles, Smart desks

Article history: Available online 16 September 2002.

DOI: https://doi.org/10.1016/S0262-8856(02)00093-8