A comprehensive survey of federated transfer learning: challenges, methods and applications
Wei GUO, Fuzhen ZHUANG, Xiao ZHANG, Yiqi TONG, Jin DONG
Front. Comput. Sci., 2024, Vol. 18, Issue 6: 186356
Federated learning (FL) is a distributed machine learning paradigm that enables participants to collaboratively train a centralized model while preserving privacy, since it eliminates the need to share raw data. In practice, FL often involves multiple participants and relies on a third party to aggregate global information and guide the update of the target participant. Many FL methods therefore perform poorly, because the training and test data of each participant may not be drawn from the same feature space or the same underlying distribution. In addition, differences among local devices (system heterogeneity), the continuous influx of online data (incremental data), and the scarcity of labeled data can further degrade the performance of these methods. To address these problems, federated transfer learning (FTL), which integrates transfer learning (TL) into FL, has attracted the attention of numerous researchers. However, because FL shares knowledge among participants at every communication round while never allowing local data to be accessed by other participants, FTL faces many unique challenges that do not arise in TL. This survey categorizes and reviews the current progress on federated transfer learning and outlines the corresponding solutions and applications. It also summarizes the common settings of FTL scenarios, the available datasets, and significant related research.
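The abstract refers to the basic FL protocol in which a third party (server) aggregates participants' updates at each communication round without accessing their local data. The following is a minimal, self-contained sketch of that FedAvg-style aggregation (not taken from the survey); the function names, the squared-loss local update, and the toy data are illustrative assumptions only, chosen to show how non-identically distributed client data enters the picture.

```python
# Minimal FedAvg-style sketch (illustrative, not the survey's method):
# each participant trains on its private data only, and a server aggregates
# the resulting parameters weighted by local sample counts.
import numpy as np

def local_update(global_weights, local_data, lr=0.01, epochs=1):
    """Participant-side step: refine the global model on private data only."""
    w = global_weights.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of a squared loss
        w -= lr * grad
    return w

def federated_round(global_weights, participants):
    """Server-side step: average client updates, weighted by data size."""
    updates, sizes = [], []
    for data in participants:
        updates.append(local_update(global_weights, data))
        sizes.append(len(data[1]))
    sizes = np.array(sizes, dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes / sizes.sum())

# Toy usage: three participants whose features follow different distributions,
# i.e., the non-IID setting that motivates federated transfer learning.
rng = np.random.default_rng(0)
clients = [(rng.normal(loc=i, size=(50, 5)), rng.normal(size=50)) for i in range(3)]
w = np.zeros(5)
for _ in range(10):                          # communication rounds
    w = federated_round(w, clients)
```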
federated transfer learning / federated learning / transfer learning / survey
|
© The Author(s) 2024. This article is published with open access at link.springer.com and journal.hep.com.cn