Pairwise Fusion Approach Incorporating Prior Constraint Information

Yaguang Li , Baisuo Jin

Communications in Mathematics and Statistics, 2020, Vol. 8, Issue 1: 47-62. DOI: 10.1007/s40304-018-0168-3
Abstract

In this paper, we explore sparsity and homogeneity of regression coefficients while incorporating prior constraint information. Sparsity means that only a small fraction of the regression coefficients is nonzero; homogeneity means that the regression coefficients fall into groups and take exactly the same value within each group. We propose a general pairwise fusion approach for detecting sparsity and homogeneity under prior convex constraints. We develop a modified alternating direction method of multipliers (ADMM) algorithm to compute the estimators and establish its convergence. Incorporating the prior information improves the efficiency of both sparsity and homogeneity detection. The proposed method is further illustrated by simulation studies and an analysis of an ozone dataset.
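The pairwise fusion estimator referred to above penalizes the absolute differences between all pairs of regression coefficients, so that coefficients sharing a common value are fused into groups. As a rough illustration only (this is the standard ADMM loop for plain pairwise-fusion least squares, not the paper's modified algorithm, which additionally handles the sparsity penalty and prior convex constraints), a minimal sketch might look as follows; the function name, tuning values, and stopping rule are all illustrative assumptions:

```python
import numpy as np
from itertools import combinations

def pairwise_fusion_admm(X, y, lam=1.0, rho=1.0, n_iter=200):
    """ADMM sketch for the pairwise-fusion least squares problem:
         min_b  0.5 * ||y - X b||^2  +  lam * sum_{j<k} |b_j - b_k|.
    An auxiliary variable eta = A b (the vector of pairwise differences)
    is introduced, giving a closed-form b-update and a soft-threshold
    eta-update."""
    n, p = X.shape
    pairs = list(combinations(range(p), 2))
    m = len(pairs)
    # A maps b to the vector of pairwise differences b_j - b_k
    A = np.zeros((m, p))
    for r, (j, k) in enumerate(pairs):
        A[r, j], A[r, k] = 1.0, -1.0
    eta = np.zeros(m)   # auxiliary variable for A @ b
    u = np.zeros(m)     # scaled dual variable
    # Normal-equations matrix for the b-update (fixed across iterations)
    M = X.T @ X + rho * A.T @ A
    for _ in range(n_iter):
        # b-update: ridge-like solve against the current (eta - u)
        b = np.linalg.solve(M, X.T @ y + rho * A.T @ (eta - u))
        # eta-update: elementwise soft-thresholding of A b + u
        z = A @ b + u
        eta = np.sign(z) * np.maximum(np.abs(z) - lam / rho, 0.0)
        # dual ascent on the constraint A b = eta
        u += A @ b - eta
    return b
```

When the true coefficients share a common value, the fused differences are driven toward zero and the estimates collapse onto that value; the paper's prior-constraint and sparsity terms would enter as additional blocks in the same ADMM splitting.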

Keywords

Alternating direction method of multipliers / Prior constraint information / Sparsity / Homogeneity / Linear regression

Cite this article

Yaguang Li, Baisuo Jin. Pairwise Fusion Approach Incorporating Prior Constraint Information. Communications in Mathematics and Statistics, 2020, 8(1): 47-62. DOI: 10.1007/s40304-018-0168-3


