

Spotlight

Stochastic Controlled Averaging for Federated Learning with Communication Compression

Xinmeng Huang · Ping Li · Xiaoyun Li

Abstract:

Communication compression has gained great interest in Federated Learning (FL) for its potential to alleviate communication overhead. However, communication compression brings forth new challenges in FL due to the interplay of compression-incurred information distortion and inherent characteristics of FL such as partial participation and data heterogeneity. Despite recent developments, the existing approaches either cannot accommodate arbitrary data heterogeneity or partial participation, or require stringent conditions on compression. In this paper, we revisit the seminal stochastic controlled averaging method by proposing an equivalent but more efficient/simplified formulation with halved uplink communication costs, building upon which we propose two compressed FL algorithms, SCALLION and SCAFCOM, to support unbiased and biased compression, respectively. Both proposed methods outperform the existing compressed FL methods in terms of communication and computation complexities. Moreover, SCALLION and SCAFCOM attain fast convergence rates under arbitrary data heterogeneity and without any additional assumptions on compression errors. Experiments show that SCALLION and SCAFCOM outperform recent compressed FL methods under the same communication budget.
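For context on the unbiased vs. biased compression distinction the abstract draws, two standard compressors from the FL literature are rand-k (unbiased: a random subset of coordinates is kept and rescaled so the expectation equals the input) and top-k (biased: the largest-magnitude coordinates are kept with no rescaling). The sketch below is generic illustration, not the paper's SCALLION or SCAFCOM implementation.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased rand-k sparsification: keep k uniformly random
    coordinates and rescale by d/k so that E[rand_k(x)] = x."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)
    return out

def top_k(x, k):
    """Biased top-k sparsification: keep the k largest-magnitude
    coordinates; in general E[top_k(x)] != x."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)

# Both send only k of d coordinates (here 10x uplink savings).
assert np.count_nonzero(rand_k(x, 100, rng)) == 100
assert np.count_nonzero(top_k(x, 100)) == 100

# Averaging many rand-k compressions recovers x (unbiasedness),
# while a single top-k output systematically underestimates x.
avg = np.mean([rand_k(x, 100, rng) for _ in range(2000)], axis=0)
print("mean abs deviation of averaged rand-k:", np.mean(np.abs(avg - x)))
```

Unbiased compressors let the server average raw compressed messages without drift, at the cost of higher variance; biased compressors like top-k preserve more energy per message but typically require an error-correction mechanism, which is why methods supporting them (such as SCAFCOM here, per the abstract) are analyzed separately.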
