Machine Learning and Applications, Fourth International Conference on (2009)
Miami Beach, Florida
Dec. 13, 2009 to Dec. 15, 2009
We propose a modified discrete HMM that handles multiple modalities. We assume that the feature space is partitioned into subspaces generated by different sources of information. To combine these heterogeneous modalities, we propose a multi-stream discrete HMM that assigns a relevance weight to each subspace. The relevance weights are local: they depend on both the symbols and the states. In particular, we associate a partial probability with each symbol in each subspace. The overall observation state probability is then computed by aggregating the partial probabilities and their relevance weights through a linear combination. The Minimum Classification Error (MCE) objective, optimized with the Generalized Probabilistic Descent (GPD) algorithm, is reformulated to derive the update equations for the relevance weights and the partial state probabilities. The proposed approach is validated on synthetic and real data sets, and the results outperform the baseline discrete HMM that treats all streams as equally important.
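The aggregation described above can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the authors' implementation: the array names, codebook sizes, and the normalization of the weights over streams are assumptions made for the example. It shows how a combined observation probability b_j(o) can be formed as a linear combination of per-stream partial probabilities and their state- and symbol-dependent relevance weights.

```python
import numpy as np

# Hypothetical dimensions (assumptions for this sketch).
n_states, n_streams, n_symbols = 3, 2, 4
rng = np.random.default_rng(0)

# Partial state probabilities p[j, k, v]: probability of symbol v being
# emitted by stream (subspace) k in state j; normalized per (state, stream).
p = rng.random((n_states, n_streams, n_symbols))
p /= p.sum(axis=2, keepdims=True)

# Relevance weights w[j, k, v]: local weights that depend on the state j,
# the stream k, and the symbol v; here normalized over streams (assumption).
w = rng.random((n_states, n_streams, n_symbols))
w /= w.sum(axis=1, keepdims=True)

def observation_prob(obs):
    """Combined b_j(o) for a multi-stream observation obs = (v_1, ..., v_K):
    a linear combination of partial probabilities and relevance weights,
    b_j(o) = sum_k w[j, k, v_k] * p[j, k, v_k]."""
    streams = np.arange(n_streams)
    return (w[:, streams, obs] * p[:, streams, obs]).sum(axis=1)

# One multi-stream observation: symbol 1 from stream 0, symbol 3 from stream 1.
b = observation_prob([1, 3])  # shape: (n_states,)
```

In a full model these combined probabilities would replace the single-codebook emission probabilities inside the usual forward-backward or Viterbi recursions; the paper's contribution is learning w and p discriminatively via MCE/GPD rather than treating all streams equally.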
Hidden Markov models, multi-stream, discriminative training, minimum classification error
P. Gader, H. Frigui and O. Missaoui, "Discriminative Multi-stream Discrete Hidden Markov Models," Fourth International Conference on Machine Learning and Applications (ICMLA), Miami Beach, Florida, 2009, pp. 178-183.