Federated optimization via knowledge codistillation
Authors:
Highlights:
• A federated optimization framework based on knowledge codistillation is proposed.
• An extension is presented that maintains a personalized model for each federated device.
• Theoretical convergence guarantees for our algorithms are provided.
• Performance of the proposed schemes is evaluated on diverse federated benchmarks.
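The highlights describe the contribution only at a high level. As a rough illustration of the general idea behind federated codistillation (not the paper's actual algorithm), the sketch below shows clients that train locally on private data and then align with each other by distilling from the averaged soft predictions on a shared proxy set, rather than by exchanging model weights. All class names, the linear models, the temperature, and the averaging scheme are illustrative assumptions.

```python
# Minimal sketch (assumed, not taken from the paper): federated knowledge
# codistillation where devices exchange soft predictions on a shared proxy
# set instead of model parameters.
import numpy as np

def softmax(z, temperature=1.0):
    z = z / temperature
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class Client:
    def __init__(self, n_features, n_classes, rng):
        # A linear classifier stands in for each device's (possibly
        # personalized) local model.
        self.W = rng.normal(scale=0.01, size=(n_features, n_classes))

    def logits(self, X):
        return X @ self.W

    def local_step(self, X, y, lr=0.1):
        # One gradient step of cross-entropy training on private data.
        p = softmax(self.logits(X))
        p[np.arange(len(y)), y] -= 1.0
        self.W -= lr * X.T @ p / len(y)

    def distill_step(self, X_proxy, teacher_probs, lr=0.1, temperature=2.0):
        # Pull the local model toward the averaged ensemble predictions.
        p = softmax(self.logits(X_proxy), temperature)
        self.W -= lr * X_proxy.T @ (p - teacher_probs) / len(X_proxy)

def codistillation_round(clients, private_data, X_proxy):
    # Local training on each device's own (non-IID) data.
    for c, (X, y) in zip(clients, private_data):
        c.local_step(X, y)
    # Server aggregates only soft predictions on the shared proxy inputs.
    teacher_probs = np.mean(
        [softmax(c.logits(X_proxy), temperature=2.0) for c in clients], axis=0
    )
    # Each client distills knowledge from the aggregated predictions.
    for c in clients:
        c.distill_step(X_proxy, teacher_probs)
```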
Keywords: Federated learning, Distributed computing, Federated optimization, Knowledge distillation, Non-IID data
Article history: Received 17 April 2021, Revised 23 November 2021, Accepted 25 November 2021, Available online 11 December 2021, Version of Record 21 December 2021.
DOI: https://doi.org/10.1016/j.eswa.2021.116310