Model layered optimization with contrastive learning for personalized federated learning
Dawei Xu, Chentao Lu, TianXin Chen, Baokun Zheng, Chuan Zhang, Liehuang Zhu, Jian Zhao
2025, Vol. 11, Issue 6: 1973-1982.
In federated learning (FL), heterogeneous data distributions across clients degrade the performance of the global model during training. Personalized federated learning (pFL) addresses this problem by personalizing the global model for each client. Research over the past few years has either calibrated weight differences across the entire model or optimized only individual layers, without accounting for the fact that different layers of a neural network have different utilities; this leads to slow model convergence and inadequate personalization on non-IID data. In this paper, we propose model layered optimization for the feature extractor and classifier (pFedEC), a novel pFL training framework that personalizes different layers of the model. We divide the model's layers into a feature extractor and a classifier. During training, we initialize each client's classifier while making the local feature extractor learn the representation of the global feature extractor, thereby correcting each client's local training and integrating the utilities of the different layers of the entire model. Extensive experiments show that pFedEC achieves 92.95% accuracy on CIFAR-10, outperforming existing pFL methods by approximately 1.8%. On CIFAR-100 and Tiny-ImageNet, pFedEC improves accuracy by at least 4.2%, reaching 73.02% and 28.39%, respectively.
Federated learning (FL) / Personalized federated learning (pFL) / Contrastive learning / Theoretical analysis
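
To make the mechanism in the abstract concrete, here is a minimal PyTorch sketch, not the authors' implementation: the model is split into a feature extractor and a classifier, and a MOON-style model-contrastive term (an assumption; the paper's exact loss may differ) pulls the local extractor's representation toward the global extractor's representation while pushing it away from the previous round's local representation. All names here (`SplitModel`, `contrastive_alignment`, the weight `mu`, the temperature `tau`) are illustrative, not from the paper.

```python
# Minimal sketch of a layer-split model with a contrastive alignment term,
# assuming a MOON-style formulation of "the local feature extractor learns
# the representation of the global feature extractor".
import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitModel(nn.Module):
    """Model split into a feature extractor and a classifier head."""
    def __init__(self, in_dim=32, feat_dim=16, num_classes=10):
        super().__init__()
        self.extractor = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        z = self.extractor(x)           # representation from the extractor
        return z, self.classifier(z)    # representation and logits

def contrastive_alignment(z_local, z_global, z_prev, tau=0.5):
    """Pull local features toward the global extractor's features (positive
    pair) and away from the previous local extractor's features (negative)."""
    pos = F.cosine_similarity(z_local, z_global, dim=-1) / tau
    neg = F.cosine_similarity(z_local, z_prev, dim=-1) / tau
    logits = torch.stack([pos, neg], dim=1)            # positive at index 0
    labels = torch.zeros(z_local.size(0), dtype=torch.long)
    return F.cross_entropy(logits, labels)

# Hypothetical local update on one client for one batch.
local, global_m, prev_local = SplitModel(), SplitModel(), SplitModel()
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
z, logits = local(x)
with torch.no_grad():
    z_g, _ = global_m(x)       # global feature extractor's representation
    z_p, _ = prev_local(x)     # last round's local representation
mu = 1.0                       # weight of the contrastive term (assumed)
loss = F.cross_entropy(logits, y) + mu * contrastive_alignment(z, z_g, z_p)
loss.backward()
```

In a full pFL round under this split, each client would run such an update over its local batches, then share the feature-extractor layers for aggregation while keeping its personalized classifier local, in line with the layered optimization the abstract describes.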