DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.817407
<p><b>Abstract</b>—This article introduces a new tool for exploratory data analysis and data mining called <it>Scale-Sensitive Gated Experts</it> (SSGE), which can partition a complex nonlinear regression surface into a set of simpler surfaces (which we call features). The set of simpler surfaces has the property that each element of the set can be efficiently modeled by a single feedforward neural network. The degree to which the regression surface is partitioned is controlled by an external scale parameter. The SSGE consists of a nonlinear gating network and several competing nonlinear experts. Although the SSGE is similar to the mixture of experts model of Jacobs et al. [<ref type="bib" rid="bibI126810">10</ref>], the mixture of experts model gives only one partitioning of the input-output space, and thus a single set of features, whereas the SSGE gives the user the capability to discover families of features. One obtains a new member of the family of features for each setting of the scale parameter. In this paper, we derive the Scale-Sensitive Gated Experts and demonstrate its performance on a time series segmentation problem. The main results are: 1) the scale parameter controls the granularity of the features of the regression surface, 2) similar features are modeled by the same expert and different kinds of features are modeled by different experts, and 3) for the time series problem, the SSGE finds different regimes of behavior, each with a specific and interesting interpretation.</p>
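The gated-experts architecture the abstract describes (a gating network assigning responsibilities to competing experts, with a scale parameter sharpening or softening that competition) can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' SSGE: linear experts and a linear-softmax gate stand in for the paper's feedforward neural networks, and the `scale` argument plays the role of the external scale parameter that controls how crisply the input space is partitioned.

```python
import numpy as np

def softmax(z, scale=1.0):
    # The scale parameter sharpens (large scale) or softens (small scale)
    # the gating competition -- a stand-in for SSGE's external scale knob.
    z = z * scale
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class GatedExperts:
    """Minimal mixture-of-experts sketch: linear experts + softmax gate.

    Hypothetical illustration only; the paper's SSGE uses nonlinear
    (neural-network) gate and experts and a scale-sensitive cost function
    derived in the article.
    """
    def __init__(self, n_experts, n_features, rng=None):
        rng = rng or np.random.default_rng(0)
        # One weight vector per expert, and one per gate output.
        self.W = rng.normal(0.0, 0.1, (n_experts, n_features))  # expert weights
        self.V = rng.normal(0.0, 0.1, (n_experts, n_features))  # gate weights

    def predict(self, X, scale=1.0):
        gates = softmax(X @ self.V.T, scale)   # (N, K) expert responsibilities
        expert_out = X @ self.W.T              # (N, K) per-expert predictions
        # Blend the experts by their gating responsibilities.
        return (gates * expert_out).sum(axis=1), gates
```

At small `scale` the gate blends all experts almost uniformly (one coarse feature); at large `scale` the gate approaches a hard winner-take-all partition, so each region of input space is handled by a single expert, mirroring the granularity control described above.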
<p><b>Index Terms</b>—Mixture of experts, mixture model, classification and regression, time series segmentation, neural networks.</p>
Renjeng Su, Ashok N. Srivastava, "Data Mining for Features Using Scale-Sensitive Gated Experts", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol.21, no. 12, pp. 1268-1279, December 1999, doi:10.1109/34.817407