Vol. 25, No. 12, December 2003
ISSN: 0162-8828
pp: 1570-1581
<p><b>Abstract</b>—This paper presents a method for effectively using unlabeled sequential data in the learning of hidden Markov models (HMMs). With the conventional approach, class labels for unlabeled data are assigned <it>deterministically</it> by HMMs learned from labeled data. Such labeling often becomes unreliable when the amount of labeled data is small. We propose an extended Baum-Welch (EBW) algorithm in which the labeling is undertaken <it>probabilistically</it> and <it>iteratively</it> so that the likelihoods of both the labeled and the unlabeled data are improved. Unlike the conventional approach, the EBW algorithm guarantees convergence to a local maximum of the likelihood. Experimental results on gesture data and speech data show that, when labeled training data are scarce, the EBW algorithm exploits unlabeled data to improve the classification performance of HMMs more robustly than the conventional naive labeling (NL) approach.</p>
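The core idea of the probabilistic labeling step described in the abstract can be sketched as follows. Instead of committing an unlabeled sequence to the single class whose HMM scores it highest, each class receives a posterior responsibility computed from the class-conditional sequence log-likelihoods and class priors; these soft labels then weight the sequence's contribution to each class's Baum-Welch re-estimation. The function below is an illustrative sketch of that posterior computation only (the function name and interface are assumptions, not the paper's notation), using a log-sum-exp normalization for numerical stability:

```python
import math

def soft_labels(logliks, log_priors):
    """Posterior P(class | sequence) from per-class HMM log-likelihoods.

    logliks    -- log P(sequence | class c) for each class c
    log_priors -- log P(class c) for each class c

    Returns a list of responsibilities summing to 1; these would weight
    the sequence in each class's Baum-Welch sufficient statistics.
    """
    scores = [ll + lp for ll, lp in zip(logliks, log_priors)]
    # Log-sum-exp trick: subtract the max score before exponentiating
    # to avoid underflow with very negative sequence log-likelihoods.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]
```

Iterating this step — relabel probabilistically, then re-estimate each HMM with the weighted data — yields an EM-style procedure whose likelihood is non-decreasing, which is the convergence property the abstract attributes to the EBW algorithm, in contrast to one-shot deterministic labeling.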
Unlabeled data, sequential data, hidden Markov models, extended Baum-Welch algorithm.

M. Inoue and N. Ueda, "Exploitation of Unlabeled Sequences in Hidden Markov Models," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 25, no. 12, pp. 1570-1581, 2003.