Publication 2013 Issue No. 9 - Sept. Abstract - Efficient Methods for Overlapping Group Lasso
Efficient Methods for Overlapping Group Lasso
Sept. 2013 (vol. 35 no. 9)
pp. 2104-2116
BibTeX:
@article{10.1109/TPAMI.2013.17,
  author    = {Lei Yuan and Jun Liu and Jieping Ye},
  title     = {Efficient Methods for Overlapping Group Lasso},
  journal   = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  volume    = {35},
  number    = {9},
  issn      = {0162-8828},
  year      = {2013},
  pages     = {2104-2116},
  doi       = {http://doi.ieeecomputersociety.org/10.1109/TPAMI.2013.17},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA},
}
Lei Yuan, Arizona State University, Tempe
Jun Liu, Arizona State University, Tempe
Jieping Ye, Arizona State University, Tempe
The group Lasso is an extension of the Lasso for feature selection on (predefined) nonoverlapping groups of features. The nonoverlapping group structure limits its applicability in practice. There have been several recent attempts to study a more general formulation in which groups of features are given, potentially with overlaps between the groups. The resulting optimization is, however, much more challenging to solve due to the group overlaps. In this paper, we consider the efficient optimization of the overlapping group Lasso penalized problem. We reveal several key properties of the proximal operator associated with the overlapping group Lasso, and compute the proximal operator by solving its smooth and convex dual problem, which allows the use of gradient descent type algorithms for the optimization. Our methods and theoretical results are then generalized to tackle the general overlapping group Lasso formulation based on the $\ell_q$ norm. We further extend our algorithm to solve a nonconvex overlapping group Lasso formulation based on the capped norm regularization, which reduces the estimation bias introduced by the convex penalty. We have performed empirical evaluations using both synthetic data and a breast cancer gene expression dataset, which consists of 8,141 genes organized into (overlapping) gene sets. Experimental results show that the proposed algorithm is more efficient than existing state-of-the-art algorithms. The results also demonstrate the effectiveness of the nonconvex formulation for overlapping group Lasso.
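The dual approach sketched in the abstract can be illustrated in code. The proximal operator of the overlapping group Lasso penalty, $\mathrm{prox}(v) = \arg\min_x \frac{1}{2}\|x-v\|_2^2 + \lambda \sum_g w_g \|x_g\|_2$, has a smooth convex dual over per-group variables $y_g$ constrained to $\ell_2$-balls of radius $\lambda w_g$, which can be solved by projected gradient. The sketch below is an illustrative reimplementation of that idea, not the authors' code; the function name, step-size choice, and stopping rule are assumptions.

```python
import numpy as np

def prox_overlapping_group_lasso(v, groups, lam, weights=None,
                                 max_iter=500, tol=1e-8):
    """Proximal operator of lam * sum_g w_g * ||x_g||_2 with overlapping
    groups, via projected gradient on the smooth convex dual:
        min_{||y_g||_2 <= lam * w_g}  0.5 * ||v - sum_g S_g y_g||_2^2,
    where S_g scatters a group-sized vector into R^n. The primal solution
    is then x = v - sum_g S_g y_g.  (Illustrative sketch, not the paper's
    exact algorithm.)"""
    if weights is None:
        weights = [np.sqrt(len(g)) for g in groups]
    ys = [np.zeros(len(g)) for g in groups]
    # Conservative step size: the dual gradient is at most len(groups)-Lipschitz,
    # since each coordinate of v is covered by at most that many groups.
    step = 1.0 / max(1, len(groups))
    for _ in range(max_iter):
        # Primal iterate implied by the current dual variables.
        x = v.copy()
        for g, y in zip(groups, ys):
            x[g] -= y
        # Projected gradient step on each dual block.
        max_change = 0.0
        for k, (g, w) in enumerate(zip(groups, weights)):
            y_new = ys[k] + step * x[g]     # descent step on the dual
            norm = np.linalg.norm(y_new)
            radius = lam * w
            if norm > radius:               # project onto the l2-ball
                y_new *= radius / norm
            max_change = max(max_change, np.abs(y_new - ys[k]).max())
            ys[k] = y_new
        if max_change < tol:
            break
    x = v.copy()
    for g, y in zip(groups, ys):
        x[g] -= y
    return x
```

With a single group this reduces to the familiar groupwise soft-thresholding $x = v \cdot \max(0, 1 - \lambda w / \|v\|_2)$, which provides a quick sanity check; with overlapping groups the same loop converges to the prox because the dual objective is smooth and convex even though the primal penalty is not separable.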
Index Terms:
Optimization, Convergence, Indexes, Algorithm design and analysis, Acceleration, Silicon, Convex functions, difference of convex programming, Sparse learning, overlapping group Lasso, proximal operator
Citation:
Lei Yuan, Jun Liu, Jieping Ye, "Efficient Methods for Overlapping Group Lasso," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 9, pp. 2104-2116, Sept. 2013, doi:10.1109/TPAMI.2013.17