Issue No. 4 - October-December 2017 (vol. 16)
ISSN: 1536-1268
pp: 62-74
Yonatan Vaizman , University of California, San Diego
Katherine Ellis , University of California, San Diego
Gert Lanckriet , University of California, San Diego
ABSTRACT
The ability to automatically recognize a person's behavioral context can contribute to health monitoring, aging care, and many other domains. Validating context recognition in the wild is crucial for promoting practical applications that work in real-life settings. The authors collected more than 300,000 minutes of sensor data with context labels from 60 subjects. Unlike in previous studies, these subjects used their own personal phones, in whatever way was convenient to them, and went about their usual routines in their natural environments. Unscripted behavior and unconstrained phone usage produced situations that are harder to recognize. The authors demonstrate how fusing multimodal sensors is important for resolving such cases. They present a baseline system and encourage researchers to use their public dataset to compare methods and improve context recognition in the wild.
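To make the idea of multimodal sensor fusion concrete, below is a minimal illustrative sketch (not the authors' exact pipeline) of one common fusion strategy: early fusion, where per-sensor feature vectors for each labeled minute are concatenated before training a linear classifier for a single binary context label. The sensor names, feature dimensions, synthetic data, and the use of scikit-learn's LogisticRegression are all assumptions made for illustration only.

    # Illustrative early-fusion sketch; all sizes and data are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical per-sensor feature matrices for N labeled minutes
    # (stand-in dimensions, not taken from the paper).
    N = 1000
    acc = rng.normal(size=(N, 26))    # accelerometer features
    gyro = rng.normal(size=(N, 26))   # gyroscope features
    audio = rng.normal(size=(N, 26))  # audio features

    # Synthetic binary context label (e.g., a stand-in for "walking").
    y = rng.integers(0, 2, size=N)

    # Early fusion: concatenate all sensor features into one vector per minute.
    X = np.hstack([acc, gyro, audio])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))

A late-fusion variant would instead train one classifier per sensor and combine their outputs (for example, by averaging predicted probabilities), which can be more robust when some sensors are missing for a given minute.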
INDEX TERMS
Machine learning, Smart phones, Data collection, Pervasive computing, Biomedical monitoring, Performance evaluation, Bioinformatics, Intelligent systems, Context awareness, Mobile sensors
CITATION

Y. Vaizman, K. Ellis and G. Lanckriet, "Recognizing Detailed Human Context in the Wild from Smartphones and Smartwatches," in IEEE Pervasive Computing, vol. 16, no. 4, pp. 62-74, 2017.
doi:10.1109/MPRV.2017.3971131