
A Extra Federated Learning Results


The method consistently improves the base algorithms across various numbers of local training epochs; when E = 1, the total number of rounds is increased to 500 to ensure convergence.

Abstract

Tuning the number of local epochs affects the accuracy-communication trade-off for most federated learning algorithms. Prior studies [32, 51, 60] attempt to reduce the number of local training epochs to mitigate the disparity among local models. The default number of local training epochs is set to E = 10 in the main manuscript. We further test different numbers of local training epochs, E ∈ {1, 5}, in Tab. 5. When E is set to 1, we increase the total number of rounds to 500 to ensure convergence. As seen in Tab. 5, our method consistently improves the base algorithms across various numbers of local training epochs.
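To make the accuracy-communication trade-off concrete, the following is a minimal FedAvg-style sketch on synthetic data, not the paper's method or its base algorithms. The client split, learning rate, and the `local_update`/`fedavg` helpers are illustrative assumptions; the sketch only shows how fewer local epochs (E = 1) is typically paired with more communication rounds (e.g., 500) to reach convergence.

```python
import numpy as np

def local_update(w, X, y, epochs, lr=0.1):
    """Run `epochs` local passes of gradient descent for one client (logistic regression)."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # client predictions
        grad = X.T @ (p - y) / len(y)         # logistic-loss gradient
        w = w - lr * grad
    return w

def fedavg(clients, dim, rounds, local_epochs):
    """Average the locally trained models each round (FedAvg-style aggregation)."""
    w = np.zeros(dim)
    for _ in range(rounds):
        updates = [local_update(w.copy(), X, y, local_epochs) for X, y in clients]
        w = np.mean(updates, axis=0)          # uniform averaging for equal-sized clients
    return w

# Synthetic, mildly non-IID data split across 4 clients (illustrative only).
rng = np.random.default_rng(0)
dim, n = 5, 200
w_true = rng.normal(size=dim)
clients = []
for _ in range(4):
    X = rng.normal(size=(n, dim)) + rng.normal(scale=0.5, size=dim)  # client-specific shift
    y = (X @ w_true + rng.normal(scale=0.1, size=n) > 0).astype(float)
    clients.append((X, y))

# Fewer local epochs -> more communication rounds for a comparable training budget
# (cf. E = 1 paired with 500 rounds in the text).
for E, R in [(10, 50), (1, 500)]:
    w = fedavg(clients, dim, rounds=R, local_epochs=E)
    acc = np.mean([((X @ w > 0).astype(float) == y).mean() for X, y in clients])
    print(f"E={E:2d}, rounds={R:3d}: mean client accuracy = {acc:.3f}")
```

Under this toy setup, both settings reach similar accuracy, but the E = 1 run requires many more rounds of communication, which is the trade-off the appendix ablates.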