FASTER CONVERGENCE WITH LESS COMMUNICATION: BROADCAST-BASED SUBGRAPH SAMPLING FOR DECENTRALIZED LEARNING OVER WIRELESS NETWORKS

Blog Article

Decentralized stochastic gradient descent (D-SGD) is a widely adopted optimization algorithm for decentralized training of machine learning models across networked agents. A crucial part of D-SGD is consensus-based model averaging, which relies heavily on information exchange and fusion among the nodes. For consensus averaging over wireless networks, due to the broadcast nature of wireless channels, simultaneous transmissions from multiple nodes may cause packet collisions if they share a common receiver.
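To make the consensus-averaging step concrete, here is a minimal sketch of one D-SGD iteration, assuming a doubly stochastic mixing matrix `W` over a toy 3-node topology (the function name `dsgd_step` and the Metropolis-style weights are illustrative, not from the paper):

```python
import numpy as np

def dsgd_step(X, grads, W, lr=0.1):
    """One D-SGD iteration: local gradient step, then consensus averaging.

    X:     (n_nodes, dim) stacked local model parameters
    grads: (n_nodes, dim) local stochastic gradients
    W:     (n_nodes, n_nodes) doubly stochastic mixing matrix
    """
    # Each node takes a local SGD step, then mixes with its neighbors
    # according to the weights in W.
    return W @ (X - lr * grads)

# Toy usage: 3 fully connected nodes with uniform self/neighbor weights.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
X = np.array([[1.0], [2.0], [3.0]])   # each node holds a scalar "model"
grads = np.zeros_like(X)              # zero gradients isolate the mixing step

X_new = dsgd_step(X, grads, W)
```

With zero gradients, the update reduces to pure averaging: because `W` is doubly stochastic, the network-wide average of the models is preserved while individual models contract toward it.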

Therefore, communication coordination is necessary to determine when and how a node can transmit (or receive) information to (or from) its neighbors. In this work, we propose BASS, a broadcast-based subgraph sampling method designed to accelerate the convergence of D-SGD while accounting for the actual communication cost per iteration. BASS creates a set of mixing matrix candidates that represent sparser subgraphs of the base topology.

In each consensus iteration, one mixing matrix is randomly sampled, leading to a specific scheduling decision that activates multiple collision-free subsets of nodes. The sampling occurs in a probabilistic manner, and the elements of the mixing matrices, along with their sampling probabilities, are jointly optimized. Simulation results demonstrate that BASS achieves faster convergence and requires fewer transmission slots than existing link-based scheduling methods and the full-communication scenario.
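The per-iteration sampling step can be sketched as follows. The candidate matrices and probabilities below are toy placeholders standing in for the jointly optimized values described above; each candidate corresponds to activating a collision-free subgraph of the base topology:

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate mixing matrices, each a sparser subgraph of the base topology.
W_full = np.full((3, 3), 1.0 / 3.0)    # all three nodes mix
W_pair = np.array([[0.5, 0.5, 0.0],    # only nodes 0 and 1 exchange;
                   [0.5, 0.5, 0.0],    # node 2 keeps its own model
                   [0.0, 0.0, 1.0]])
candidates = [W_full, W_pair]
p = [0.4, 0.6]                          # sampling probabilities (toy values)

# One consensus iteration: sample a mixing matrix at random.
idx = rng.choice(len(candidates), p=p)
W_t = candidates[idx]
```

Note that every candidate is doubly stochastic, so whichever subgraph is activated, the consensus step preserves the network-wide model average; the optimization trades off mixing speed against the number of transmission slots each candidate requires.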

In conclusion, the inherent broadcasting nature of wireless channels offers intrinsic advantages in accelerating the convergence of decentralized optimization and learning.
