2014 IEEE International Conference on Data Mining (ICDM) (2014)
Shenzhen, China
Dec. 14, 2014 to Dec. 17, 2014
ISSN: 1550-4786
ISBN: 978-1-4799-4303-6
pp: 680-686
Lasso simultaneously conducts variable selection and supervised regression. In this paper, we extend Lasso to multiple-output prediction, which falls into the category of structured learning. Although structured learning exploits input and output jointly, the joint feature mapping in the current structured-learning framework is usually application-specific. As a result, ad hoc heuristics must be employed to design different joint feature mapping functions for different applications, which leaves multiple-output prediction without a general-purpose model. To address this limitation, we propose to augment Lasso with output features by decoupling the joint feature mapping function of traditional structured learning. The contribution of this paper is three-fold: 1) the augmented Lasso performs regression and variable selection on both the input and output features, so the learned model can fit an output using both the selected input variables and the other, correlated outputs; 2) for greater generality, we model nonlinear dependencies among output variables via generalized Lasso; 3) the Augmented Lagrangian Method (ALM) with an Alternating Direction Minimizing (ADM) strategy is used to find the optimal model parameters. Extensive experimental results demonstrate the effectiveness of the proposed method.
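The core idea of augmenting the design matrix with the other outputs can be illustrated with off-the-shelf tools. The sketch below is an assumption-laden simplification: it uses scikit-learn's coordinate-descent `Lasso` rather than the paper's ALM/ADM solver, handles only linear output dependencies (not the generalized, nonlinear variant), and runs on synthetic data invented here for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic data (illustrative only, not from the paper):
# three outputs that depend on a few inputs and on each other.
rng = np.random.default_rng(0)
n, p, q = 200, 10, 3
X = rng.normal(size=(n, p))
Y = np.zeros((n, q))
Y[:, 0] = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=n)
Y[:, 1] = 1.5 * X[:, 1] + 0.8 * Y[:, 0] + rng.normal(scale=0.1, size=n)
Y[:, 2] = 0.5 * X[:, 2] - 0.6 * Y[:, 1] + rng.normal(scale=0.1, size=n)

# Output-augmented Lasso (simplified): for each output j, regress on the
# inputs stacked with the remaining outputs, so variable selection acts
# on input features and correlated outputs alike.
coefs = []
for j in range(q):
    others = np.delete(Y, j, axis=1)       # the other q-1 outputs
    Z = np.hstack([X, others])             # augmented design matrix
    model = Lasso(alpha=0.05).fit(Z, Y[:, j])
    coefs.append(model.coef_)              # sparse: irrelevant columns -> 0
```

Each fitted coefficient vector has length p + q - 1; the L1 penalty zeroes out columns (input or output) that do not help predict the target output, which is the selection behavior the abstract describes.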
Correlation, Kernel, Training, Joints, Vectors, Linear programming, Input variables

C. Zhang, Y. Han, X. Guo and X. Cao, "Output Feature Augmented Lasso," 2014 IEEE International Conference on Data Mining (ICDM), Shenzhen, China, 2014, pp. 680-686.