FedPD: personalized federated learning based on partial distillation

Xu YANG, Ji-Yuan FENG, Song-Yue GUO, Bin-Xing FANG, Qing LIAO

Front. Comput. Sci., 2026, 20(3): 2003604. DOI: 10.1007/s11704-025-40840-4
Information Systems | RESEARCH ARTICLE


Abstract

In recent years, personalized federated learning (PFL) has attracted widespread attention for its robust performance on heterogeneous data. However, most PFL methods require all client models to share the same architecture, which is impractical in real-world scenarios. Federated distillation methods address this by allowing clients to train models with different architectures. Nevertheless, when aggregating distillation knowledge for a client, these methods treat all of it as equally important, which degrades client collaboration. In this paper, we propose FedPD, a novel personalized federated learning method based on partial distillation, which assesses the relevance of each piece of distillation knowledge and of the ensemble knowledge to each client, thereby achieving selective knowledge transfer. FedPD consists of two key modules. The partial knowledge transfer (PKT) module uses a partial distillation coefficient to quantify the importance of each piece of distillation knowledge and selects the more valuable knowledge. The partial knowledge ensemble (PKE) module maintains a server-side model for each client, from which it extracts distillation knowledge to guide that client. Extensive experiments on real-world datasets under various experimental settings show that FedPD significantly improves client model performance compared with state-of-the-art federated learning methods.
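The selective transfer idea behind PKT can be illustrated with a minimal sketch. The abstract does not give the exact formula for the partial distillation coefficient, so the weighting below is a hypothetical stand-in: each teacher's softened prediction is weighted by its agreement with the student, and the distillation loss is the coefficient-weighted sum of KL terms.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q) for probability vectors
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def partial_distillation_loss(student_logits, teacher_logits_list, tau=2.0):
    """Weighted ensemble distillation (illustrative only).

    Each teacher receives a 'partial distillation coefficient': here, the
    softmax over its negative KL divergence from the student, so knowledge
    that is more relevant to this client gets a larger weight. The loss is
    the coefficient-weighted sum of KL(teacher || student) terms, scaled by
    tau^2 as in standard temperature distillation.
    """
    p_student = softmax(np.asarray(student_logits) / tau)
    p_teachers = [softmax(np.asarray(t) / tau) for t in teacher_logits_list]
    # hypothetical relevance score: agreement (negative KL) with the student
    relevance = np.array([-kl_div(p_t, p_student) for p_t in p_teachers])
    coeffs = softmax(relevance, axis=0)  # partial distillation coefficients
    return sum(c * kl_div(p_t, p_student) * tau ** 2
               for c, p_t in zip(coeffs, p_teachers))
```

In the full method, PKE would additionally maintain one server-side model per client whose outputs act as a teacher in `teacher_logits_list`; both the coefficient definition and the function names here are assumptions for illustration, not the paper's formulation.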


Keywords

federated learning / knowledge distillation / model heterogeneity

Cite this article

Xu YANG, Ji-Yuan FENG, Song-Yue GUO, Bin-Xing FANG, Qing LIAO. FedPD: personalized federated learning based on partial distillation. Front. Comput. Sci., 2026, 20(3): 2003604. DOI: 10.1007/s11704-025-40840-4



RIGHTS & PERMISSIONS

© The Author(s) 2025. This article is published with open access at link.springer.com and journal.hep.com.cn.
