Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers
October 2006 (vol. 28 no. 10)
pp. 1553-1567

Abstract:
In order to provide relevant information to mobile users, such as workers engaged in manual maintenance and assembly tasks, a wearable computer requires information about the user's specific activities. This work focuses on the recognition of activities that are characterized by a hand motion and an accompanying sound. Suitable activities can be found in assembly and maintenance work. Here, we provide an initial exploration into the problem domain of continuous activity recognition using on-body sensing. We use a mock “wood workshop” assembly task to ground our investigation. We describe a method for the continuous recognition of activities (sawing, hammering, filing, drilling, grinding, sanding, opening a drawer, tightening a vise, and turning a screwdriver) using microphones and three-axis accelerometers mounted at two positions on the user's arms. Potentially “interesting” activities are segmented from continuous streams of data using an analysis of the sound intensity detected at the two different locations. Activity classification is then performed on these detected segments using linear discriminant analysis (LDA) on the sound channel and hidden Markov models (HMMs) on the acceleration data. Four different methods of classifier fusion are compared for improving these classifications. Using user-dependent training, we obtain continuous average recall and precision rates (for positive activities) of 78 percent and 74 percent, respectively. Using user-independent training (leave-one-out across five users), we obtain recall rates of 66 percent and precision rates of 63 percent. In isolation, these activities were recognized with accuracies of 98 percent, 87 percent, and 95 percent for the user-dependent, user-independent, and user-adapted cases, respectively.
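The segmentation step described in the abstract compares sound intensity at two body locations: an activity performed by the hand should register louder at a wrist microphone than at an upper-arm microphone. A minimal sketch of that idea follows; it is not the authors' implementation, and the frame length, hop size, and 6 dB threshold are illustrative assumptions.

```python
import numpy as np

def frame_rms(signal, frame_len=1024, hop=512):
    """Frame-wise RMS intensity of a mono audio signal (assumed parameters)."""
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        frames.append(np.sqrt(np.mean(frame ** 2)))
    return np.array(frames)

def segment_by_intensity(wrist_audio, upper_arm_audio, ratio_db=6.0):
    """Flag frames where the wrist microphone is markedly louder than the
    upper-arm microphone, suggesting a hand-centred activity is under way."""
    eps = 1e-12  # avoid log(0) on silent frames
    wrist = frame_rms(wrist_audio)
    upper = frame_rms(upper_arm_audio)
    n = min(len(wrist), len(upper))
    # Intensity difference in decibels between the two sensing locations.
    diff_db = 20.0 * np.log10((wrist[:n] + eps) / (upper[:n] + eps))
    return diff_db > ratio_db
```

Frames flagged True would then be passed on to the LDA and HMM classifiers; consecutive flagged frames can be merged into candidate activity segments.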
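The abstract reports that four methods of classifier fusion were compared for combining the sound-channel (LDA) and acceleration-channel (HMM) decisions. As a hedged illustration of two standard posterior-combination rules (the sum and product rules of Kittler et al. [27]), not the paper's specific four methods:

```python
import numpy as np

def fuse_posteriors(p_sound, p_accel, rule="sum"):
    """Combine per-class posterior estimates from two classifiers and
    return the index of the winning class.

    p_sound, p_accel: 1-D arrays of shape (n_classes,). The rule names
    follow the classic sum/product combination rules; this is an
    illustrative sketch, not the method from the paper."""
    if rule == "sum":
        fused = p_sound + p_accel        # average-style combination
    elif rule == "product":
        fused = p_sound * p_accel        # assumes conditional independence
    else:
        raise ValueError(f"unknown rule: {rule}")
    return int(np.argmax(fused))
```

The sum rule tends to be robust when one classifier's posterior estimate is noisy, while the product rule rewards agreement between the two channels.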
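The continuous results are reported as recall and precision for positive activities, following the information-retrieval usage of van Rijsbergen [42]. A minimal frame-level version of these metrics, for a binary activity-vs-background labelling, could look like this (the frame-level granularity is an assumption for illustration):

```python
def precision_recall(predicted, ground_truth):
    """Frame-level precision and recall for binary activity masks,
    given as equal-length sequences of truthy/falsy values."""
    tp = sum(1 for p, g in zip(predicted, ground_truth) if p and g)
    fp = sum(1 for p, g in zip(predicted, ground_truth) if p and not g)
    fn = sum(1 for p, g in zip(predicted, ground_truth) if g and not p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```

Precision penalises insertions (frames wrongly flagged as activity), while recall penalises deletions (activity frames that were missed), which is why the paper reports both rather than a single accuracy figure.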
References:
[1] S. Feiner, B. MacIntyre, and D. Seligmann, “Knowledge-Based Augmented Reality,” Comm. ACM, vol. 36, no. 7, pp. 52-62, 1993.
[2] M. Lampe, M. Strassner, and E. Fleisch, “A Ubiquitous Computing Environment for Aircraft Maintenance,” Proc. ACM Symp. Applied Computing, pp. 1586-1592, 2004.
[3] D. Abowd, A.K. Dey, R. Orr, and J. Brotherton, “Context-Awareness in Wearable and Ubiquitous Computing,” Virtual Reality, vol. 3, no. 3, pp. 200-211, 1998.
[4] T. Starner, B. Schiele, and A. Pentland, “Visual Contextual Awareness in Wearable Computing,” Proc. IEEE Int'l Symp. Wearable Computers, pp. 50-57, 1998.
[5] C. Vogler and D. Metaxas, “ASL Recognition Based on a Coupling between HMMs and 3D Motion Analysis,” Proc. Int'l Conf. Computer Vision, 1998.
[6] A.D. Wilson and A.F. Bobick, “Learning Visual Behavior for Gesture Analysis,” Proc. IEEE Int'l Symp. Computer Vision, Nov. 1995.
[7] J. Schlenzig, E. Hunter, and R. Jain, “Recursive Identification of Gesture Inputs Using Hidden Markov Models,” Proc. Second Conf. Applications of Computer Vision, pp. 187-194, Dec. 1994.
[8] J.M. Rehg and T. Kanade, “Digiteyes: Vision-Based Human Hand Tracking,” technical report, Carnegie Mellon Univ., Dec. 1993.
[9] J.B.J. Bussmann, W.L.J. Martens, J.H.M. Tulen, F. Schasfoort, H.J.G. van den Berg-Emons, and H. Stam, “Measuring Daily Behavior Using Ambulatory Accelerometry: The Activity Monitor,” Behavior Research Methods, Instruments, & Computers, vol. 33, no. 3, pp. 349-356, 2001.
[10] P. Bonato, “Advances in Wearable Technology and Applications in Physical and Medical Rehabilitation,” J. NeuroEng. and Rehabilitation, vol. 2, no. 2, 2005.
[11] K. Aminian and B. Najafi, “Capturing Human Motion Using Body-Fixed Sensors: Outdoor Measurement and Clinical Applications,” Computer Animation and Virtual Worlds, vol. 15, pp. 79-94, 2004.
[12] P.H. Veltink, H.B.J. Bussmann, W. deVries, W.L.J. Martens, and R.C. van Lummel, “Detection of Static and Dynamic Activities Using Uniaxial Accelerometers,” IEEE Trans. Rehabilitation Eng., vol. 4, no. 4, pp. 375-386, 1996.
[13] K. Aminian, P. Robert, E.E. Buchser, B. Rutschmann, D. Hayoz, and M. Depairon, “Physical Activity Monitoring Based on Accelerometry: Validation and Comparison with Video Observation,” Medical Biology Eng. Computers, vol. 37, pp. 304-308, 1999.
[14] M. Wetzler, J.R. Borderies, O. Bigaignon, P. Guillo, and P. Gosse, “Validation of a Two-Axis Accelerometer for Monitoring Patient Activity During Blood Pressure or ECG Holter Monitoring,” Clinical and Pathological Studies, 2003.
[15] M. Uiterwaal, E.B. Glerum, H.J. Busser, and R.C. van Lummel, “Ambulatory Monitoring of Physical Activity in Working Situations, a Validation Study,” J. Medical Eng. Technology, vol. 22, no. 4, pp. 168-172, 1998.
[16] J. Mantyjarvi, J. Himberg, and T. Seppanen, “Recognizing Human Motion with Multiple Acceleration Sensors,” Proc. IEEE Int'l Conf. Systems, Man, and Cybernetics, vol. 2, pp. 747-752, 2001.
[17] C. Randell and H. Muller, “Context Awareness by Analysing Accelerometer Data,” Proc. IEEE Int'l Symp. Wearable Computers, pp. 175-176, 2000.
[18] K. Van Laerhoven and O. Cakmakci, “What Shall We Teach Our Pants?” Proc. IEEE Int'l Symp. Wearable Computers, pp. 77-83, 2000.
[19] S. Antifakos, F. Michahelles, and B. Schiele, “Proactive Instructions for Furniture Assembly,” Proc. Fourth Int'l Conf. UbiComp, p. 351, 2002.
[20] G. Fang, W. Gao, and D. Zhao, “Large Vocabulary Sign Language Recognition Based on Hierarchical Decision Trees,” Proc. Int'l Conf. Multimodal Interfaces, Nov. 2003.
[21] V. Peltonen, J. Tuomi, A. Klapuri, J. Huopaniemi, and T. Sorsa, “Computational Auditory Scene Recognition,” Proc. IEEE Int'l Conf. Acoustics, Speech, and Signal Processing, vol. 2, pp. 1941-1944, May 2002.
[22] M.C. Büchler, “Algorithms for Sound Classification in Hearing Instruments,” PhD thesis, ETH Zurich, 2002.
[23] B. Clarkson, N. Sawhney, and A. Pentland, “Auditory Context Awareness in Wearable Computing,” Proc. Workshop Perceptual User Interfaces, Nov. 1998.
[24] H. Wu and M. Siegel, “Correlation of Accelerometer and Microphone Data in the Coin Tap Test,” IEEE Trans. Instrumentation and Measurements, vol. 49, pp. 493-497, June 2000.
[25] L. Xu, A. Krzyzak, and C. Suen, “Methods of Combining Multiple Classifiers and Their Applications to Handwriting Recognition,” IEEE Trans. Systems, Man, and Cybernetics, vol. 22, pp. 418-435, May/June 1992.
[26] T.K. Ho, “Multiple Classifier Combination: Lessons and Next Steps,” Hybrid Methods in Pattern Recognition. World Scientific, 2002.
[27] J. Kittler, M. Hatef, R. Duin, and J. Matas, “On Combining Classifiers,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 20, no. 3, pp. 226-239, Mar. 1998.
[28] T.K. Ho, J.J. Hull, and S.N. Srihari, “Decision Combination in Multiple Classifier Systems,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 16, no. 1, pp. 66-75, Jan. 1994.
[29] L. Bao and S. Intille, “Activity Recognition from User-Annotated Acceleration Data,” Pervasive, 2004.
[30] M. Stäger, P. Lukowicz, N. Perera, T. von Büren, G. Tröster, and T. Starner, “SoundButton: Design of a Low Power Wearable Audio Classification System,” Proc. IEEE Int'l Symp. Wearable Computers, 2003.
[31] R. Duda, P. Hart, and D. Stork, Pattern Classification, second ed. Wiley, 2001.
[32] C.V.C. Bouten, “A Triaxial Accelerometer and Portable Data Processing Unit for the Assessment of Daily Physical Activity,” IEEE Trans. Biomedical Eng., vol. 44, pp. 136-147, Mar. 1997.
[33] L. Rabiner and B. Juang, “An Introduction to Hidden Markov Models,” IEEE ASSP Magazine, pp. 4-16, Jan. 1986.
[34] T. Starner, J. Makhoul, R. Schwartz, and G. Chou, “Online Cursive Handwriting Recognition Using Speech Recognition Methods,” Proc. Int'l Conf. Acoustics, Speech, and Signal Processing, pp. 125-128, 1994.
[35] K. Murphy, “The HMM Toolbox for MATLAB,” http://www.ai.mit.edu/murphyk/software/hmm/hmm.html, 1998.
[36] N. Kern, B. Schiele, H. Junker, P. Lukowicz, and G. Tröster, “Wearable Sensing to Annotate Meeting Recordings,” Proc. IEEE Int'l Symp. Wearable Computers, pp. 186-193, Oct. 2002.
[37] T. Fawcett, ROC Graphs: Notes and Practical Considerations for Researchers. Kluwer, 2004.
[38] E. Tapia, S. Intille, and K. Larson, “Activity Recognition in the Home Using Simple and Ubiquitous Sensors,” Pervasive, pp. 158-175, 2004.
[39] F. Provost, T. Fawcett, and R. Kohavi, “The Case Against Accuracy Estimation for Comparing Induction Algorithms,” Proc. 15th Int'l Conf. Machine Learning, 1998.
[40] I. Phillips and A. Chhabra, “Empirical Performance Evaluation of Graphics Recognition Systems,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 21, no. 9, pp. 849-870, Sept. 1999.
[41] A. Hoover, G. Jean-Baptiste, X. Jiang, P. Flynn, H. Bunke, D. Goldof, K. Bowyer, D. Eggert, A. Fitzgibbon, and R. Fisher, “An Experimental Comparison of Range Image Segmentation Algorithms,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 18, no. 7, pp. 673-689, July 1996.
[42] C. van Rijsbergen, Information Retrieval, second ed. Dept. of Computer Science, Univ. of Glasgow, 1979.
[43] M. Stäger, P. Lukowicz, and G. Tröster, “Implementation and Evaluation of a Low-Power Sound-Based User Activity Recognition System,” Proc. IEEE Int'l Symp. Wearable Computers, 2004.
[44] G. Ogris, T. Stiefmeier, H. Junker, P. Lukowicz, and G. Tröster, “Using Ultrasonic Hand Tracking to Augment Motion Analysis Based Recognition of Manipulative Gestures,” Proc. IEEE Int'l Symp. Wearable Computers, 2005.
[45] O. Amft, H. Junker, and G. Tröster, “Detection of Eating and Drinking Arm Gestures Using Inertial Body-Worn Sensors,” Proc. IEEE Int'l Symp. Wearable Computers, Oct. 2005.
[46] H. Brashear, T. Starner, P. Lukowicz, and H. Junker, “Using Multiple Sensors for Mobile Sign Language Recognition,” Proc. IEEE Int'l Symp. Wearable Computers, pp. 45-53, 2003.
[47] H. Junker, P. Lukowicz, and G. Tröster, “Continuous Recognition of Arm Activities with Body-Worn Inertial Sensors,” Proc. IEEE Int'l Symp. Wearable Computers, pp. 188-189, 2004.
[48] J.A. Ward, P. Lukowicz, and G. Tröster, “Gesture Spotting Using Wrist Worn Microphone and 3-Axis Accelerometer,” Proc. Soc-Eusai '05 Conf., Oct. 2005.
[49] D. Minnen, I. Essa, and T. Starner, “Expectation Grammars: Leveraging High-Level Expectations for Activity Recognition,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, June 2003.

Index Terms:
Pervasive computing, wearable computers and body area networks, classifier evaluation, industry.
Citation:
Jamie A. Ward, Paul Lukowicz, Gerhard Tröster, Thad E. Starner, "Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1553-1567, Oct. 2006, doi:10.1109/TPAMI.2006.197