Comparative Analysis of FedAvg and FedProx Algorithms in Federated Learning for Handwritten Character Recognition on the EMNIST Dataset

Authors

  • Xiao Ma School of Information Science and Engineering, East China University of Science and Technology, Shanghai, China

DOI:

https://doi.org/10.54097/h7srvr13

Keywords:

Federated learning, handwritten character recognition, deep learning.

Abstract

Federated Learning (FL) is a privacy-preserving paradigm for training machine learning models across distributed devices without centralizing raw data. This study compares two widely used FL algorithms, FedAvg and FedProx, on the EMNIST dataset, an extension of MNIST that adds handwritten letters and thus poses a harder recognition task. FedAvg is a simple, communication-efficient algorithm that aggregates locally trained models by weighted averaging. FedProx extends it with a proximal term in the local objective to mitigate the effects of non-IID data. The results show that both algorithms reach a similar global accuracy (94%) in an IID data setting. FedAvg converges faster and more smoothly, whereas FedProx fluctuates more in the early rounds but stabilizes later in training. These findings highlight the respective strengths of the two algorithms and suggest that each suits different federated learning scenarios. Overall, the comparison offers useful guidance for designing future FL systems that balance efficiency, stability, and adaptability. Future work will evaluate the algorithms under more challenging non-IID conditions and with larger numbers of clients.
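The two mechanisms the abstract contrasts can be illustrated with a minimal sketch: FedAvg's server-side weighted averaging of client parameters, and the proximal penalty that FedProx adds to each client's local objective. This is an illustrative toy in plain Python, not the paper's implementation; the function names, flat parameter lists, and the proximal coefficient `mu` are assumptions for exposition.

```python
def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg server step: average client parameter vectors,
    weighting each client by its local dataset size."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum((n / total) * w[i] for w, n in zip(client_weights, client_sizes))
        for i in range(n_params)
    ]

def fedprox_penalty(local_params, global_params, mu):
    """FedProx local term: (mu/2) * ||w - w_global||^2, added to the
    client's loss to keep local updates close to the global model."""
    return 0.5 * mu * sum((w - g) ** 2 for w, g in zip(local_params, global_params))

# Two toy clients with equal dataset sizes: the aggregate is the plain mean.
avg = fedavg_aggregate([[1.0, 2.0], [3.0, 4.0]], [1, 1])
# A client drifting one unit from the global model with mu = 2.0
# pays a proximal penalty of 0.5 * 2.0 * 1.0 = 1.0.
pen = fedprox_penalty([1.0], [0.0], mu=2.0)
```

With `mu = 0` the FedProx objective reduces to plain local training, which is why the two algorithms behave identically apart from this regularization toward the global model.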



Published

29-01-2026

Section

Articles

How to Cite

Ma, X. (2026). Comparative Analysis of FedAvg and FedProx Algorithms in Federated Learning for Handwritten Character Recognition on the EMNIST Dataset. Academic Journal of Science and Technology, 19(2), 501-506. https://doi.org/10.54097/h7srvr13