Rate distortion optimization for adaptive gradient quantization in federated learning
Guojun Chen, Kaixuan Xie, Wenqiang Luo, Yinfei Xu, Lun Xin, Tiecheng Song, Jing Hu
2024, Vol. 10, Issue 6: 1813-1825.
Federated learning (FL) is an emerging machine learning framework designed to preserve privacy. However, the continual updating of model parameters over uplink channels with limited throughput incurs substantial communication overhead, which is a major challenge for FL. To address this issue, we propose an adaptive gradient quantization approach that improves communication efficiency. To minimize the total communication cost, we exploit both the correlation of gradients across local clients and the correlation of gradients across communication rounds, i.e., in the spatial and temporal dimensions. The compression strategy is based on rate-distortion theory, which allows us to find an optimal quantization strategy for the gradients. To further reduce the computational complexity, we incorporate a Kalman filter into the proposed approach. Finally, numerical results demonstrate that the proposed rate-distortion-optimized adaptive gradient quantization approach is effective and robust, significantly reducing communication costs compared with other quantization methods.
Keywords: Federated learning / Communication efficiency / Adaptive quantization / Rate distortion
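To make the idea concrete, the following is a minimal sketch of the two ingredients the abstract describes: an unbiased stochastic gradient quantizer, and a rate-distortion-style bit allocation that gives higher-variance gradient blocks more quantization bits (following the reverse water-filling intuition). This is an illustrative assumption, not the paper's actual algorithm; the function names, the uniform quantizer, and the simple variance-based allocation are all hypothetical.

```python
import numpy as np

def stochastic_quantize(grad, bits):
    """Unbiased uniform stochastic quantization of a gradient vector
    to 2**bits - 1 levels per sign."""
    levels = 2 ** bits - 1
    gmax = np.max(np.abs(grad))
    if gmax == 0:
        return np.zeros_like(grad)
    scaled = np.abs(grad) / gmax * levels
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part,
    # so the quantizer is unbiased: E[q] = grad.
    prob = scaled - lower
    q = lower + (np.random.rand(*grad.shape) < prob)
    return np.sign(grad) * q / levels * gmax

def allocate_bits(grad_vars, total_bits):
    """Split a total bit budget across gradient blocks in proportion to
    0.5*log2(variance), mimicking reverse water-filling from
    rate-distortion theory (at least 1 bit per block)."""
    log_vars = 0.5 * np.log2(np.maximum(grad_vars, 1e-12))
    base = (total_bits - log_vars.sum()) / len(grad_vars)
    return np.maximum(1, np.round(base + log_vars)).astype(int)
```

In this sketch, a client would estimate per-block gradient variances (the paper's Kalman filter serves a related tracking role across rounds), call `allocate_bits` to set per-block resolution, and then transmit the stochastically quantized gradients.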