Online gesture recognition from pose kernel learning and decision forests

Pattern Recognition Letters
DOI: 10.1016/j.patrec.2013.10.005
  • Online Gesture Recognition
  • Key Pose Identification
  • Skeleton Representation
  • Depth Sensors
  • 3D Motion
  • Natural User Interface


Abstract

The recent popularization of real-time depth sensors has broadened the potential applications of online gesture recognition to end-user natural user interfaces (NUI). Such interfaces demand gesture recognition that is robust to the noisy data produced by consumer depth sensors, while the quality of the final NUI depends heavily on recognition speed. This work introduces a method for real-time gesture recognition from a noisy skeleton stream, such as the one extracted from Kinect depth sensors. Each pose is described using an angular representation of the skeleton joints. These descriptors are used to identify key poses with a multi-class Support Vector Machine classifier equipped with a tailored pose kernel. The gesture is then labeled on-the-fly from the key-pose sequence with a decision forest, which naturally handles gesture time control/warping and removes the need for an initial or neutral pose. The proposed method runs in real time, and its robustness is evaluated in several experiments.
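
To make the described pipeline concrete, the following is a minimal sketch in Python with NumPy and scikit-learn. It is an illustration under stated assumptions, not the authors' implementation: the spherical-angle descriptor, the choice of joint pairs, the RBF kernel (standing in for the paper's tailored pose kernel), and the fixed-length key-pose window are all assumptions made for the example.

import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

def angular_pose_descriptor(joints, pairs):
    """Describe a pose by the angles of selected joint-to-joint vectors.

    joints: (J, 3) array of 3D joint positions from a depth-sensor skeleton.
    pairs:  list of (parent, child) joint index pairs (assumed, not from the paper).
    Returns a 1D descriptor of spherical angles (inclination, azimuth) per pair.
    """
    feats = []
    for a, b in pairs:
        v = joints[b] - joints[a]
        r = np.linalg.norm(v) + 1e-8          # avoid division by zero
        theta = np.arccos(np.clip(v[2] / r, -1.0, 1.0))  # inclination
        phi = np.arctan2(v[1], v[0])                     # azimuth
        feats.extend([theta, phi])
    return np.asarray(feats)

# Key-pose classifier: multi-class SVM. An RBF kernel is used here as a
# placeholder for the paper's pose kernel.
key_pose_svm = SVC(kernel="rbf", gamma="scale")

# Gesture classifier: a decision forest over a fixed-length window of the
# most recent key-pose labels (window length is an assumption).
gesture_forest = RandomForestClassifier(n_estimators=100)

def classify_gesture(key_pose_window, forest):
    """Label the ongoing gesture from the recent key-pose label sequence."""
    return forest.predict(np.asarray(key_pose_window).reshape(1, -1))[0]

In use, each incoming skeleton frame would be converted to a descriptor, assigned a key-pose label by key_pose_svm, appended to the sliding window, and the window passed to classify_gesture; both classifiers would first need to be fitted on labeled training data.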
