Issue No. 8, August 2008 (vol. 20), pp. 1082-1090
With the availability of affordable sensors and sensor networks, sensor-based human activity recognition has attracted much attention in artificial intelligence and ubiquitous computing. In this paper, we present a novel two-phase approach for detecting abnormal activities based on wireless sensors attached to a human body. Detecting abnormal activities is a particularly important task in security monitoring and healthcare applications of sensor networks, among many others. Traditional approaches to this problem suffer from a high false-positive rate, particularly when the collected sensor data are biased towards normal data and abnormal events are rare; as a result, many traditional data mining methods lack the training data they need. To address this problem, our approach first employs a one-class support vector machine (SVM), trained on commonly available normal activities, to filter out activities that have a very high probability of being normal. We then derive abnormal activity models from a general normal model via kernel nonlinear regression (KNLR) to reduce the false-positive rate in an unsupervised manner. We show that our approach provides a good tradeoff between the abnormality detection rate and the false-alarm rate, and allows abnormal activity models to be derived automatically without explicitly labeled abnormal training data, which are scarce. We demonstrate the effectiveness of our approach using real data collected from a sensor network deployed in a realistic setting.
Activity Recognition, Outlier Detection, Sensor Networks, Data Mining
Jie Yin, Qiang Yang, Jeffrey Junfeng Pan, "Sensor-Based Abnormal Human-Activity Detection", IEEE Transactions on Knowledge & Data Engineering, vol.20, no. 8, pp. 1082-1090, August 2008, doi:10.1109/TKDE.2007.1042
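The first phase described in the abstract, filtering out activities that are very likely normal with a one-class SVM trained only on normal data, can be sketched as follows. This is a minimal illustration assuming scikit-learn's `OneClassSVM`; the feature representation, the synthetic data, and the `nu`/`gamma` settings are illustrative assumptions, not details from the paper.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Hypothetical sensor features: each row is one activity trace (e.g.
# summary statistics of accelerometer readings). Training data contains
# ONLY normal activities, as in the paper's setting where abnormal
# examples are too rare to label.
normal_train = rng.normal(loc=0.0, scale=1.0, size=(200, 4))

# nu upper-bounds the fraction of training points treated as outliers;
# the RBF kernel lets the boundary wrap around the normal region.
clf = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(normal_train)

# New traces: the first five lie near the normal cluster, the last five
# far from it (standing in for abnormal activities).
new_traces = np.vstack([
    rng.normal(0.0, 1.0, size=(5, 4)),
    rng.normal(6.0, 1.0, size=(5, 4)),
])

# predict() returns +1 for points inside the learned normal region
# (filtered out here) and -1 for suspicious points, which the paper's
# second phase (KNLR-derived abnormal models) would then examine.
labels = clf.predict(new_traces)
suspicious = new_traces[labels == -1]
print(labels)
```

Only the `-1`-labeled traces reach the second, more expensive phase, which is how the two-phase design keeps the false-positive rate manageable despite the scarcity of abnormal training data.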