FedStrag: Straggler-aware federated learning for low resource devices

Aditya Kumar, Satish Narayana Srirama

2025, Vol. 11, Issue (4): 1214-1224. DOI: 10.1016/j.dcan.2024.12.004

Research article

Abstract

Federated Learning (FL) has become a popular training paradigm in recent years. However, stragglers are a critical bottleneck when training in an Internet of Things (IoT) network: these nodes deliver stale updates to the server, which slow down convergence. In this paper, we study the impact of stale updates on the global model and observe it to be significant. To address this, we propose a weighted averaging scheme, FedStrag, that optimizes training in the presence of stale updates. The work focuses on training a model in an IoT network that faces multiple challenges, such as resource constraints, stragglers, network issues, and device heterogeneity. To this end, we developed a time-bounded asynchronous FL paradigm that can train a model on a continuous inflow of data in the edge-fog-cloud continuum. To evaluate FedStrag, a model is trained under multiple straggler scenarios on both Independent and Identically Distributed (IID) and non-IID datasets on Raspberry Pis. The experimental results suggest that FedStrag outperforms the FedAvg baseline in all cases.
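The abstract describes a weighted averaging scheme that down-weights stale client updates during aggregation. The exact weighting function is not given here, so the following is only a minimal sketch assuming a simple inverse-staleness decay (the hyperparameter `alpha` and the function name `staleness_weighted_average` are illustrative, not from the paper):

```python
# Sketch of staleness-weighted aggregation in the spirit of FedStrag.
# Assumption: weight decays as 1 / (1 + alpha * staleness); the paper's
# actual scheme may differ.
from typing import List

def staleness_weighted_average(updates: List[List[float]],
                               staleness: List[int],
                               alpha: float = 0.5) -> List[float]:
    """Average client updates, down-weighting stale ones.

    updates   -- one flattened model update per client
    staleness -- rounds elapsed since each update's base model (0 = fresh)
    alpha     -- decay strength (assumed hyperparameter)
    """
    # Fresh updates count fully; an update that is s rounds old
    # is scaled by 1 / (1 + alpha * s).
    weights = [1.0 / (1.0 + alpha * s) for s in staleness]
    total = sum(weights)
    dim = len(updates[0])
    avg = [0.0] * dim
    for w, u in zip(weights, updates):
        for i in range(dim):
            avg[i] += (w / total) * u[i]
    return avg

# Example: a fresh update and a 4-round-stale update
print(staleness_weighted_average([[1.0, 1.0], [3.0, 3.0]], [0, 4]))
```

With plain FedAvg both updates would count equally (giving 2.0 per coordinate); the staleness weighting pulls the result toward the fresh update instead, which is the behavior the paper's scheme is designed to exploit.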

Keywords

Internet of things / Decentralized training / Fog computing / Federated learning / Distributed computing / Straggler

Cite this article

Aditya Kumar, Satish Narayana Srirama. FedStrag: Straggler-aware federated learning for low resource devices. 2025, 11(4): 1214-1224. DOI: 10.1016/j.dcan.2024.12.004


