One-Shot Federated Learning
We present one-shot federated learning, where a central server learns a global model over a network of federated devices in a single round of communication. Our approach, drawing on ensemble learning and knowledge aggregation, achieves an average relative gain of 51.5% over local baselines and comes within 90.1% of the (unattainable) global ideal. We discuss these methods and identify several promising directions of future work.
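To make the single-round protocol concrete, the sketch below illustrates one-shot federated learning via ensembling under simplifying assumptions: a synthetic dataset, logistic-regression clients, and probability averaging as the server-side aggregation rule. These choices (and all names in the code) are illustrative, not the paper's exact method; knowledge aggregation, e.g. distilling the ensemble into one compact model, would be a natural next step on top of this.

```python
# A minimal sketch of one-shot federated learning via ensembling.
# Assumptions (not from the paper): synthetic data, logistic-regression
# clients, and simple probability averaging at the server.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic data partitioned across clients (stand-in for federated devices).
X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

n_clients = 5
shards = np.array_split(rng.permutation(len(X_train)), n_clients)

# The single communication round: each client trains locally once and
# uploads its model to the server.
local_models = []
for idx in shards:
    clf = LogisticRegression(max_iter=1000).fit(X_train[idx], y_train[idx])
    local_models.append(clf)

# Server-side aggregation: average class probabilities over the ensemble
# of uploaded client models.
probs = np.mean([m.predict_proba(X_test) for m in local_models], axis=0)
ensemble_pred = probs.argmax(axis=1)

print("ensemble accuracy:", (ensemble_pred == y_test).mean())
```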