2018 IEEE International Conference on Pervasive Computing and Communications (PerCom) (2018)
Athens, Greece
March 19, 2018 to March 23, 2018
ISSN: 2474-249X
ISBN: 978-1-5386-3225-3
pp: 1-10
Jindong Wang, Beijing Key Laboratory of Mobile Computing and Pervasive Device, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
Yiqiang Chen, Beijing Key Laboratory of Mobile Computing and Pervasive Device, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
Lisha Hu, Institute of Information Technology, Hebei University of Economics and Business, Shijiazhuang, China
Xiaohui Peng, Beijing Key Laboratory of Mobile Computing and Pervasive Device, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
Philip S. Yu, Department of Computer Science, University of Illinois at Chicago, Chicago, IL, USA
ABSTRACT
In activity recognition, it is often expensive and time-consuming to acquire sufficient activity labels. To solve this problem, transfer learning leverages labeled samples from the source domain to annotate a target domain that has few or no labels. Existing approaches typically learn a global domain shift while ignoring the intra-affinity between classes, which hinders the performance of the algorithms. In this paper, we propose a novel and general cross-domain learning framework that exploits the intra-affinity of classes to perform intra-class knowledge transfer. The proposed framework, referred to as Stratified Transfer Learning (STL), can dramatically improve the classification accuracy of cross-domain activity recognition. Specifically, STL first obtains pseudo labels for the target domain via a majority voting technique. Then, it performs intra-class knowledge transfer iteratively to transform both domains into the same subspaces. Finally, the labels of the target domain are obtained via a second annotation. To evaluate the performance of STL, we conduct comprehensive experiments on three large public activity recognition datasets (i.e., OPPORTUNITY, PAMAP2, and UCI DSADS), which demonstrate that STL significantly outperforms other state-of-the-art methods in classification accuracy (an improvement of 7.68%). Furthermore, we extensively investigate the performance of STL across different degrees of similarity and activity levels between domains. We also discuss the potential of STL in other pervasive computing applications to provide empirical experience for future research.
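The abstract outlines three stages: pseudo-labeling the target domain by majority voting, iterative intra-class knowledge transfer into shared subspaces, and a second annotation. The sketch below illustrates that pipeline only at the level described above; the choice of base classifiers, the `intra_class_transform` helper (a simple per-class mean alignment), and the iteration count are illustrative assumptions, not the paper's actual STL implementation.

```python
# A minimal sketch of the three STL stages described in the abstract.
# The base classifiers, the per-class subspace mapping, and the helper
# `intra_class_transform` are assumptions for illustration, not the
# authors' implementation.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from scipy.stats import mode


def majority_vote_pseudo_labels(Xs, ys, Xt):
    """Stage 1: pseudo-label the target domain by majority voting
    over several base classifiers trained on the source domain."""
    base = [KNeighborsClassifier(), SVC(), DecisionTreeClassifier()]
    votes = np.stack([clf.fit(Xs, ys).predict(Xt) for clf in base])
    return mode(votes, axis=0).mode.ravel()


def intra_class_transform(Xs_c, Xt_c):
    """Stage 2 (per class): map source and target samples of one class
    toward a shared subspace. A simple mean alignment stands in for the
    paper's intra-class transfer; this is an assumption."""
    shift = Xs_c.mean(axis=0) - Xt_c.mean(axis=0)
    return Xs_c, Xt_c + shift


def stratified_transfer(Xs, ys, Xt, n_iter=5):
    """Stages 2-3: iteratively transfer knowledge within each class,
    then re-annotate the target domain (second annotation)."""
    yt = majority_vote_pseudo_labels(Xs, ys, Xt)
    for _ in range(n_iter):
        Xt_new = Xt.copy()
        for c in np.unique(ys):
            if np.any(yt == c):  # skip classes with no pseudo-labeled samples
                _, Xt_new[yt == c] = intra_class_transform(Xs[ys == c], Xt[yt == c])
        yt = KNeighborsClassifier().fit(Xs, ys).predict(Xt_new)  # second annotation
    return yt
```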
CITATION

J. Wang, Y. Chen, L. Hu, X. Peng and P. S. Yu, "Stratified Transfer Learning for Cross-domain Activity Recognition," 2018 IEEE International Conference on Pervasive Computing and Communications (PerCom), Athens, Greece, 2018, pp. 1-10.
doi:10.1109/PERCOM.2018.8444572