Federated Sufficient Dimension Reduction Through High-Dimensional Sparse Sliced Inverse Regression
Wenquan Cui, Yue Zhao, Jianjun Xu, Haoyang Cheng
Communications in Mathematics and Statistics, 2023, Vol. 13, Issue 3: 719-756.
Federated learning has become a popular tool in the big data era. It trains a centralized model on data from different clients while keeping the data decentralized. In this paper, we propose, for the first time, a federated sparse sliced inverse regression algorithm. Our method simultaneously estimates the central dimension reduction subspace and performs variable selection in a federated setting. We transform this federated high-dimensional sparse sliced inverse regression problem into a convex optimization problem by constructing the covariance matrix safely and losslessly. We then use a linearized alternating direction method of multipliers (ADMM) algorithm to estimate the central subspace. We also give Bayesian information criterion and hold-out validation approaches to determine the dimension of the central subspace and the hyper-parameters of the algorithm. We establish an upper bound on the statistical error rate of our estimator under a heterogeneous setting. We demonstrate the effectiveness of our method through simulations and real-world applications.
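The abstract states that the covariance matrix is constructed "safely and losslessly" across clients before the convex sliced inverse regression problem is solved. As a minimal sketch of one way such a construction can work (assuming clients share only aggregate sufficient statistics; this is an illustration, not necessarily the paper's exact protocol), each client sends its sample size, feature sums, and cross-product matrix, and the server recovers the pooled covariance exactly:

```python
import numpy as np

def client_summary(X):
    """Per-client sufficient statistics: sample size, feature sum, cross-product matrix."""
    n = X.shape[0]
    return n, X.sum(axis=0), X.T @ X

def pooled_covariance(summaries):
    """Server side: reconstruct the pooled covariance exactly from client summaries."""
    n_total = sum(n for n, _, _ in summaries)
    s_total = sum(s for _, s, _ in summaries)
    xx_total = sum(xx for _, _, xx in summaries)
    mean = s_total / n_total
    # E[x x^T] - mean mean^T equals the covariance of the pooled data
    return xx_total / n_total - np.outer(mean, mean)

# Toy check: two clients, federated result matches the centralized computation
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(40, 5)), rng.normal(size=(60, 5))
fed = pooled_covariance([client_summary(X1), client_summary(X2)])
central = np.cov(np.vstack([X1, X2]).T, bias=True)
assert np.allclose(fed, central)
```

In this sketch only aggregates of size p and p-by-p leave each client, so no raw observation is transmitted, and the reconstruction equals the centralized sample covariance, which is the sense in which such a construction is lossless.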
Federated learning / Sliced inverse regression / Sufficient dimension reduction / Variable selection / Mathematical Sciences / Statistics / Information and Computing Sciences / Artificial Intelligence and Image Processing
School of Mathematical Sciences, University of Science and Technology of China. © Springer-Verlag GmbH Germany, part of Springer Nature.