This paper introduces DRACO, a decentralized and asynchronous federated learning framework tailored for wireless networks, where communication delays, node heterogeneity, and topology variations make traditional synchronous FL inefficient. Unlike centralized approaches, DRACO removes the need for a server and lets devices update the global model collaboratively through peer-to-peer exchanges. A central challenge in decentralized FL is the misalignment caused by asynchronous updates: clients operate at different speeds, send updates irregularly, and may rely on outdated neighbor information. To address this, DRACO organizes communication through row-stochastic mixing matrices, ensuring that each device weights the information it receives in a way that preserves global consensus over time. The system model shown in Figure 1 illustrates devices connected through a wireless graph, exchanging parameters opportunistically based on channel availability.
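To make the row-stochastic mixing idea concrete, the following is a minimal sketch (not the paper's implementation; function names, scalar parameters, and uniform weights are illustrative assumptions). Each device averages only the neighbors it actually heard from, so its row of weights sums to one regardless of who was reachable:

```python
# Hypothetical sketch of one row-stochastic mixing step. Each device i
# holds a scalar parameter and uniformly weights the parameters of the
# neighbors it received in this round (itself always included), so each
# row of the implied mixing matrix sums to 1.

def row_stochastic_mix(params, received):
    """params: dict device_id -> parameter value.
    received: dict device_id -> list of neighbor ids heard this round
    (always including the device itself)."""
    mixed = {}
    for i, neighbors in received.items():
        w = 1.0 / len(neighbors)          # uniform row-stochastic weights
        mixed[i] = sum(w * params[j] for j in neighbors)
    return mixed

params = {0: 1.0, 1: 3.0, 2: 5.0}
# Device 0 heard from 1; device 1 heard from 0 and 2; device 2 heard nobody.
received = {0: [0, 1], 1: [0, 1, 2], 2: [2]}
print(row_stochastic_mix(params, received))
# device 0 -> (1+3)/2 = 2.0, device 1 -> (1+3+5)/3 = 3.0, device 2 -> 5.0
```

Note that rows sum to one even when a device hears from no one (it keeps its own value), which is what lets mixing proceed under intermittent connectivity.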
DRACO further incorporates gradient steps, mixing steps, and local buffering so that each device can continue training even when neighbors are temporarily unreachable. The algorithm overview in Figure 2 highlights how devices asynchronously fetch and integrate neighbor information to update their parameters without global coordination. The paper provides a convergence analysis demonstrating that DRACO achieves stable learning under realistic wireless impairments such as random delays and asynchronous communication patterns. Experimental results show improved training efficiency over synchronous decentralized baselines, especially when devices have heterogeneous computation and communication capabilities. Overall, the study positions DRACO as a robust solution for distributed learning systems deployed over 6G-class wireless networks. By combining asynchronous operation with row-stochastic averaging, DRACO enables scalable, serverless federated learning in environments where communication is intermittent and devices operate independently.
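The interplay of local gradient steps, mixing, and buffering described above can be sketched for a single device as follows (a simplified illustration, not the paper's algorithm; the function name, the uniform mixing weights, and the quadratic loss are assumptions for the example):

```python
# Hedged sketch of one asynchronous round on one device: refresh a buffer
# of last-known neighbor parameters with whatever messages arrived, mix
# uniformly (a row-stochastic combination), then take a local gradient step.
# Stale buffer entries are reused when neighbors are unreachable.

def async_round(x, buffer, fresh, grad, lr=0.1):
    """x: local parameter; buffer: dict neighbor_id -> last-known value;
    fresh: dict neighbor_id -> newly received value (may be empty);
    grad: callable returning the local gradient."""
    buffer.update(fresh)                  # keep newest info, reuse stale otherwise
    values = [x] + list(buffer.values())
    mixed = sum(values) / len(values)     # uniform row-stochastic mixing
    return mixed - lr * grad(mixed), buffer

grad = lambda x: 2 * x                    # gradient of the toy loss f(x) = x**2
x, buf = 4.0, {1: 2.0, 2: 0.0}
# No fresh messages this round: the device still makes progress from its buffer.
x, buf = async_round(x, buf, fresh={}, grad=grad)
print(x)  # mixed = (4+2+0)/3 = 2.0, then 2.0 - 0.1*4.0 = 1.6
```

The key design point the example mirrors is that training never blocks on communication: an empty `fresh` dictionary just means the round runs on the last-known neighbor state.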
DRACO: Decentralized Asynchronous Federated Learning over Row-Stochastic Wireless Networks