Properties and iterative methods for the lasso and its variants

Hong-Kun Xu

Chinese Annals of Mathematics, Series B, 2014, 35(3): 501-518. DOI: 10.1007/s11401-014-0829-9

Abstract

The lasso of Tibshirani (1996) is a least-squares problem regularized by the ℓ1 norm. Owing to the sparsity-promoting property of the ℓ1 norm, the lasso has received much attention in recent years. In this paper, some basic properties of the lasso and two of its variants are investigated. Moreover, iterative algorithms for solving the lasso are presented, namely the proximal method and its variants, such as the relaxed proximal algorithm, together with a dual method.
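In concrete terms, the lasso referred to above is the minimization problem min_x (1/2)‖Ax − b‖₂² + λ‖x‖₁, and the proximal (forward-backward) method reduces to iterative soft thresholding, because the proximity operator of t‖·‖₁ is the componentwise shrinkage z ↦ sign(z)·max(|z| − t, 0). The sketch below is a minimal illustration of that iteration, not the paper's own presentation; the function names `ista` and `soft_threshold`, the fixed step size, and the iteration count are illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximity operator of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    # Proximal gradient (iterative soft-thresholding) iteration for the lasso:
    #   min_x 0.5*||A x - b||_2^2 + lam*||x||_1
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # step 1/||A||^2, inside the
                                             # convergence range (0, 2/||A||^2)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the smooth term
        x = soft_threshold(x - gamma * grad, gamma * lam)
    return x
```

A fixed step size below 2/‖A‖² suffices for convergence of this scheme (cf. [6, 18]); relaxed and dual variants of the kind discussed in the paper modify the same basic update.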

Keywords

Lasso / Elastic net / Smooth-lasso / ℓ1 regularization / Sparsity / Proximal method / Dual method / Projection / Thresholding

Cite this article

Hong-Kun Xu. Properties and iterative methods for the lasso and its variants. Chinese Annals of Mathematics, Series B, 2014, 35(3): 501-518. DOI: 10.1007/s11401-014-0829-9


References

[1]

Candes E J, Romberg J, Tao T. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inform. Theory, 2006, 52(2): 489-509

[2]

Candes E J, Romberg J, Tao T. Stable signal recovery from incomplete and inaccurate measurements. Comm. Pure Appl. Math., 2006, 59(8): 1207-1223

[3]

Candes E J, Tao T. Near-optimal signal recovery from random projections: Universal encoding strategies?. IEEE Trans. Inform. Theory, 2006, 52(12): 5406-5425

[4]

Candes E J, Wakin M B. An introduction to compressive sampling. IEEE Signal Processing Magazine, 2008, 25(2): 21-30

[5]

SIAM News, 2006, 39(9)

[6]

Combettes P L, Wajs V R. Signal recovery by proximal forward-backward splitting. Multiscale Model. Simul., 2005, 4(4): 1168-1200

[7]

Donoho D. Compressed sensing. IEEE Trans. Inform. Theory, 2006, 52(4): 1289-1306

[8]

Friedman J, Hastie T, Tibshirani R. A note on the group lasso and a sparse group lasso. arXiv:1001.0736v1, 2010

[9]

Goebel K, Kirk W A. Topics in Metric Fixed Point Theory. Cambridge Studies in Advanced Mathematics, Vol. 28. Cambridge: Cambridge University Press, 1990

[10]

Hebiri M, van de Geer S. The smooth-lasso and other ℓ1+ℓ2-penalized methods. Electron. J. Statist., 2011, 5: 1184-1226

[11]

Marino G, Xu H K. Convergence of generalized proximal point algorithms. Comm. Pure Appl. Anal., 2004, 3(3): 791-808

[12]

Micchelli C A, Shen L, Xu Y. Proximity algorithms for image models: Denoising. Inverse Problems, 2011, 27: 045009

[13]

Moreau J J. Propriétés des applications “prox”. C. R. Acad. Sci. Paris Sér. A Math., 1963, 256: 1069-1071

[14]

Moreau J J. Proximité et dualité dans un espace hilbertien. Bull. Soc. Math. France, 1965, 93: 273-299

[15]

Raasch T. On the L-curve criterion in ℓ1 regularization of linear discrete ill-posed problems. International Conference on Inverse Problems and Related Topics, Nanjing, 2012

[16]

Tibshirani R. Regression shrinkage and selection via the lasso. J. Royal Statist. Soc. Ser. B, 1996, 58: 267-288

[17]

Tibshirani R, Saunders M, Rosset S, Zhu J, Knight K. Sparsity and smoothness via the fused lasso. J. Royal Statist. Soc., Ser. B, 2005, 67: 91-108

[18]

Xu H K. Averaged mappings and the gradient-projection algorithm. J. Optim. Theory Appl., 2011, 150: 360-378

[19]

Yuan M, Lin Y. Model selection and estimation in regression with grouped variables. J. Royal Statist. Soc., Ser. B, 2006, 68: 49-67

[20]

Zou H, Hastie T. Regularization and variable selection via the elastic net. J. Royal Statist. Soc., Ser. B, 2005, 67: 301-320
