2017 3rd International Conference on Big Data Computing and Communications (BIGCOM) (2017)
Chengdu, Sichuan, China
Aug. 10, 2017 to Aug. 11, 2017
ISBN: 978-1-5386-3349-6
pp: 226-231
The rapid increase in connectivity of physical sensors and Internet of Things (IoT) systems is enabling large-scale collection of time series data, and this data reflects the working patterns and internal evolution of the observed objects. Recognizing and forecasting the underlying high-level states from raw sensory data is useful for daily activity recognition of humans and predictive maintenance of machines. Deep Learning (DL) methods have proved effective in computer vision, natural language processing, and speech recognition, and these models have also been applied to time series analysis. Since time series are multi-dimensional and sequential with long-term temporal dependencies, current DL-based models cannot adequately learn the spatial and temporal features within and between states, so there is still plenty of room for improvement in recognizing and predicting high-level states. In this paper, a hybrid deep architecture named the Long-term Recurrent Convolutional LSTM Network (LR-ConvLSTM) is proposed. The model is composed of Convolutional LSTM layers that extract features inside a high-level state, and additional LSTM layers that capture temporal dependencies between high-level states. We evaluate our model on the Opportunity dataset, which was used in a public activity recognition challenge. The results show that the proposed model performs well on both time series classification and prediction tasks.
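The abstract does not give implementation details, but the two-level design (ConvLSTM layers over timesteps within a state, upper LSTM layers over a sequence of such segments) implies a windowed data layout. The sketch below is a hypothetical illustration of that layout, not the authors' code: `sliding_windows` and `window_sequences` are invented helper names, and all window/sequence sizes are arbitrary toy values.

```python
# Hypothetical preprocessing sketch for a hybrid ConvLSTM + LSTM model:
# inner windows would feed the ConvLSTM layers (features *inside* a
# high-level state); sequences of consecutive windows would feed the
# upper LSTM layers (dependencies *between* high-level states).

def sliding_windows(series, window, step):
    """Split a series (list of per-timestep channel vectors) into
    overlapping fixed-length windows."""
    return [series[i:i + window]
            for i in range(0, len(series) - window + 1, step)]

def window_sequences(windows, seq_len):
    """Group consecutive windows into sequences for the upper LSTM layers."""
    return [windows[i:i + seq_len]
            for i in range(len(windows) - seq_len + 1)]

# Toy example: 20 timesteps, 3 sensor channels.
series = [[t, t * 0.1, -t] for t in range(20)]
windows = sliding_windows(series, window=5, step=2)   # 8 windows of 5 steps
sequences = window_sequences(windows, seq_len=3)      # 6 sequences of 3 windows
```

Each inner window preserves the multi-channel structure for convolutional feature extraction, while the outer sequence length controls how far back the upper LSTM can look across state boundaries.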
data analysis, knowledge representation, learning (artificial intelligence), pattern classification, recurrent neural nets, time series

Y. Guo, Z. Wu and Y. Ji, "A Hybrid Deep Representation Learning Model for Time Series Classification and Prediction," 2017 3rd International Conference on Big Data Computing and Communications (BIGCOM), Chengdu, Sichuan, China, 2018, pp. 226-231.