ProFe: Communication-Efficient Decentralized Federated Learning via Distillation and Prototypes
This paper introduces ProFe, a new algorithm designed to make Decentralized Federated Learning (DFL) more communication-efficient without compromising model performance. In DFL, clients collaborate without a central server, which avoids single points of failure but creates significant communication overhead, especially when nodes have…
