2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Towards fast, view-invariant human action recognition
Anchorage, AK, USA
June 23-28, 2008
ISBN: 978-1-4244-2339-2
Srikanth Cherla, Siemens Corporate Technology, SISL - Bangalore, India
Kaustubh Kulkarni, Siemens Corporate Technology, SISL - Bangalore, India
Amit Kale, Siemens Corporate Technology, SISL - Bangalore, India
V. Ramasubramanian, Siemens Corporate Technology, SISL - Bangalore, India
In this paper, we propose a fast method to recognize human actions that accounts for intra-class variability in the way an action is performed. We propose the use of a low-dimensional feature vector consisting of (a) the projections of the actor's width profile onto an “action basis” and (b) simple spatio-temporal features. The action basis is built using eigenanalysis of walking sequences of different people. Given the limited amount of training data, Dynamic Time Warping (DTW) is used to perform recognition. We propose the use of the average template with multiple features, first used in speech recognition, to better capture the intra-class variations for each action. We demonstrate the efficacy of this algorithm using our low-dimensional feature to robustly recognize human actions. Furthermore, we show that view-invariant recognition can be performed by a simple data fusion of two orthogonal views. For actions that remain confusable, a temporal discriminative weighting scheme is used to distinguish between them. The effectiveness of our method is demonstrated by experiments on the multi-view IXMAS dataset of persons performing various actions.
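The abstract outlines a two-stage pipeline: build an eigen-basis from width profiles, project each frame onto it to get a low-dimensional feature sequence, and match test sequences against per-class templates with DTW. The sketch below is a minimal, hypothetical illustration of those two steps under stated assumptions, not the authors' implementation: the function names, array shapes, the use of plain PCA with a Euclidean local cost, and nearest-template classification are all assumptions, and the average-template construction, the additional spatio-temporal features, the two-view fusion, and the discriminative weighting described in the abstract are omitted.

```python
# Hedged sketch (not the authors' code): eigen-projection of width profiles
# plus basic DTW matching against per-class templates. Shapes, names, and
# the feature layout here are illustrative assumptions.
import numpy as np


def build_action_basis(width_profiles, num_components=10):
    """Eigenanalysis (PCA) of width profiles pooled from walking sequences.

    width_profiles: (num_frames_total, profile_len) array of per-frame
    silhouette width profiles from several training subjects.
    Returns the mean profile and the top eigenvectors (the "action basis").
    """
    mean = width_profiles.mean(axis=0)
    centered = width_profiles - mean
    # Principal directions via SVD of the centered data matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:num_components]           # basis: (k, profile_len)


def project_sequence(frames, mean, basis):
    """Project each frame's width profile onto the action basis,
    yielding one low-dimensional feature vector per frame."""
    return (frames - mean) @ basis.T            # (num_frames, k)


def dtw_distance(seq_a, seq_b):
    """Plain DTW with Euclidean local cost between feature frames."""
    n, m = len(seq_a), len(seq_b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j],
                                   acc[i, j - 1],
                                   acc[i - 1, j - 1])
    return acc[n, m]


def classify(test_seq, class_templates):
    """Assign the label of the nearest class template under DTW distance."""
    return min(class_templates,
               key=lambda label: dtw_distance(test_seq, class_templates[label]))
```

In a training routine along these lines, one would pool width profiles from walking sequences to build the basis, project each labeled training sequence, form one template per action class (the paper uses an average template rather than a single exemplar), and then call classify on projected test sequences; keeping the feature dimension k small keeps each O(nm) DTW comparison cheap, in line with the paper's emphasis on speed.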
Citation:
Srikanth Cherla, Kaustubh Kulkarni, Amit Kale, V. Ramasubramanian, "Towards fast, view-invariant human action recognition," 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1-8, 2008.