2015 IEEE International Conference on Data Mining (ICDM) (2015)
Atlantic City, NJ, USA
Nov. 14, 2015 to Nov. 17, 2015
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/ICDM.2015.60
Sequential data modeling has received growing interest due to its impact on real-world problems. Sequential data is ubiquitous -- financial transactions, advertising conversions, and disease evolution are all examples. A long-standing challenge in sequential data modeling is capturing the strong hidden correlations among complex, high-volume features. The sparsity and skewness of the features extracted from sequential data add further complexity. In this paper, we address these challenges from both discriminative and generative perspectives, and propose novel stochastic learning algorithms to model nonlinear variances within static time frames and across their transitions. The proposed model, the Deep Recurrent Network (DRN), can be trained in an unsupervised fashion to capture transitions, or in a discriminative fashion to perform sequential labeling. We analyze the conditional independence of each functional module and tackle the diminishing gradient problem by developing a two-pass training algorithm. Extensive experiments on both simulated and real-world dynamic networks show that the trained DRN outperforms all baselines on the sequential classification task and achieves excellent performance on the regression task.
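The title refers to greedy layer-wise training, the classic strategy of pretraining a deep network one layer at a time to sidestep diminishing gradients. The following is a minimal generic sketch of that technique (not the authors' DRN algorithm, whose details are in the paper): each layer is fit as a small autoencoder on the representations produced by the already-trained layers below it, then frozen before the next layer is added.

```python
# Generic greedy layer-wise pretraining sketch (illustrative only;
# NOT the DRN two-pass algorithm from the paper). All names here are
# hypothetical. Each layer is an autoencoder trained on the codes of
# the previous layer, then frozen.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pretrain_layer(data, hidden, epochs=200, lr=0.5):
    """Fit one autoencoder layer: encoder W, linear decoder V."""
    n, d = data.shape
    W = rng.normal(0.0, 0.1, (d, hidden))
    V = rng.normal(0.0, 0.1, (hidden, d))
    for _ in range(epochs):
        h = sigmoid(data @ W)              # encode
        x_hat = h @ V                      # decode
        err = x_hat - data                 # reconstruction error
        grad_V = h.T @ err / n
        grad_h = (err @ V.T) * h * (1.0 - h)
        grad_W = data.T @ grad_h / n
        W -= lr * grad_W
        V -= lr * grad_V
    return W

def greedy_pretrain(data, layer_sizes):
    """Stack layers greedily: train, freeze, feed codes upward."""
    weights, x = [], data
    for h in layer_sizes:
        W = pretrain_layer(x, h)
        weights.append(W)
        x = sigmoid(x @ W)                 # frozen layer's output
    return weights, x

X = rng.normal(size=(64, 8))
weights, codes = greedy_pretrain(X, [6, 4])
print(len(weights), codes.shape)           # 2 (64, 4)
```

The per-layer objective keeps gradients shallow (each update only ever passes through one layer), which is the motivation the paper extends into the time dimension for recurrent transitions.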
Training, Hidden Markov models, Data models, Computational modeling, Mathematical model, Data mining, Heuristic algorithms
X. Li, X. Jia, H. Li, H. Xiao, J. Gao and A. Zhang, "DRN: Bringing Greedy Layer-Wise Training into Time Dimension," 2015 IEEE International Conference on Data Mining (ICDM), Atlantic City, NJ, USA, 2015, pp. 859-864.