The Improvement of Federated Learning in communication efficiency

Authors

  • Yimeng Yang

DOI:

https://doi.org/10.54097/hset.v34i.5446

Keywords:

Federated learning; communication efficiency; local computation; communication compression

Abstract

In recent years, the need to protect user privacy and data security, together with the demand for multi-party data collaboration, has grown increasingly strong. Privacy-preserving computation is regarded as a fundamental solution; it is developing rapidly and gaining unprecedented popularity. Federated learning is currently the most widely adopted technique in privacy-preserving computation. With its lightweight technology route and deployment, it has become a mainstream solution and the product of choice for many privacy-computing applications; in medicine, for example, it enables advanced medical image segmentation while protecting patient privacy. However, federated learning still needs to improve its communication efficiency. This paper introduces FedAvg, the most basic federated learning algorithm, including its basic process, advantages, and disadvantages. We analyze algorithms that optimize federated learning by adding local computation, by communication compression (including model compression, periodic averaging and quantization compression, and gradient compression), and by overlapping computation with communication, and we compare the strengths and weaknesses of representative algorithms for each approach. The paper concludes with a summary and an outlook on future work.
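The basic FedAvg process mentioned above (the server broadcasts a global model, each client trains locally on its own data, and the server takes a data-size-weighted average of the returned models) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy least-squares objective, learning rate, epoch count, and two-client setup are all assumptions made for the example.

```python
import numpy as np

def local_update(weights, data, lr=0.1, epochs=5):
    """One client's local SGD steps on a toy least-squares objective."""
    w = weights.copy()
    X, y = data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5*||Xw - y||^2 / n
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data):
    """One FedAvg round: broadcast, local training, weighted averaging."""
    sizes = [len(y) for _, y in client_data]
    local_models = [local_update(global_w, d) for d in client_data]
    total = sum(sizes)
    # weight each client's model by its share of the total data
    return sum(n / total * w for n, w in zip(sizes, local_models))

# toy run: two clients whose data share the true model w* = [1, 2]
rng = np.random.default_rng(0)
w_true = np.array([1.0, 2.0])
clients = []
for n in (40, 60):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)
print(np.round(w, 2))  # should approach [1. 2.]
```

Note that each round transmits the full model once per client; the compression and overlap techniques surveyed in the paper aim to shrink or hide exactly this per-round communication cost.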


References

Liu Zaiyi, Shi Zhenwei, Liang Changhong: Application of federated learning technology in medical image artificial intelligence. 2021, Chin. J. Med., 102(05):318-320.

Zhou Chuanxin, Sun Yi, Wang Degang, Ge Huanhui: 2021, J. Net. Inf. Sec., 7(05):77-92.

Liang Tiankai, Zeng Bi, Chen Guang: A survey of federated learning: concepts, technologies, applications and challenges. 2022, Comput. Appl., 1-13.

Qiu Xinyuan, Ye Zecong, Cui Teilong, Gao Zhiqiang: A survey on the communication cost of federated learning. 2021, J. Comput. Appl., 42(02):333-342.

Wang Zhuangzhuang, Chen Hongsong, Yang Limin, Chen Lifang: Federated learning and data security. 2021, Intel. Comput. Appl., 11(01):126-129+133.

Wang Kunqing, Liu Jing, Li Chen, Zhao Yuhang, Lu Haoran, Li Peng, Liu Bingying: A review of federated learning security threats. 2020, Inf. Sec. Res., 8(03):223-234.

Brendan McMahan, Eider Moore, Daniel Ramage, Seth Hampson, Blaise Agüera y Arcas: Communication-Efficient Learning of Deep Networks from Decentralized Data. 2017, AISTATS: 1273-1282.

Tian Li, Anit Kumar Sahu, Manzil Zaheer, Maziar Sanjabi, Ameet Talwalkar, Virginia Smith: Federated Optimization in Heterogeneous Networks. 2020, MLSys.

Sai Praneeth Karimireddy, Satyen Kale, Mehryar Mohri, Sashank J. Reddi, Sebastian U. Stich, Ananda Theertha Suresh: SCAFFOLD: Stochastic Controlled Averaging for Federated Learning. 2020, ICML: 5132-5143.

Qinbin Li, Bingsheng He, Dawn Song: Model-Contrastive Federated Learning. 2021, CVPR: 10713-10722.

Ting Chen, Simon Kornblith, Mohammad Norouzi, Geoffrey E. Hinton: A Simple Framework for Contrastive Learning of Visual Representations. 2020, ICML: 1597-1607.

Yanci Zhang, Han Yu: Towards Verifiable Federated Learning. 2022, IJCAI: 5686-5693.

Sebastian Caldas, Jakub Konecný, H. Brendan McMahan, Ameet Talwalkar: Expanding the Reach of Federated Learning by Reducing Client Resource Requirements. 2018, CoRR abs/1812.07210.

Daniel Rothchild, Ashwinee Panda, Enayat Ullah, Nikita Ivkin, Ion Stoica, Vladimir Braverman, et al.: FetchSGD: Communication-Efficient Federated Learning with Sketching. 2020, ICML.

Wang Jian-zong, Kong Ling-wei, Huang Zhang-cheng: A survey of federated learning algorithms. 2020, Big Data, 6(06):64-82.

Hongchang Gao, An Xu, Heng Huang: On the Convergence of Communication-Efficient Local SGD for Federated Learning. 2021, AAAI: 7510-7518.

Hao Zhang, Zeyu Zheng, Shizhen Xu, Wei Dai: Poseidon: An Efficient Communication Architecture for Distributed Deep Learning on GPU Clusters. 2017, CoRR abs/1706.03292.

Shaohuai Shi, Xiaowen Chu: MG-WFBP: Efficient Data Communication for Distributed Synchronous SGD Algorithms. 2018, CoRR abs/1811.11141.


Published

28-02-2023

How to Cite

Yang, Y. (2023). The Improvement of Federated Learning in communication efficiency. Highlights in Science, Engineering and Technology, 34, 183-190. https://doi.org/10.54097/hset.v34i.5446