Dominating Set Model Aggregation for communication-efficient decentralized deep learning

Abstract

Decentralized deep learning algorithms rely on peer-to-peer communication of model parameters and/or gradients over a communication graph connecting learning agents, each with access to its own private data set. Most studies in this area focus on achieving high accuracy, often at the expense of increased communication overhead among the agents. In practice, however, large peer-to-peer communication overhead becomes a serious challenge, especially in harsh environments such as underwater sensor networks. In this paper, we aim to reduce communication overhead while matching the performance of state-of-the-art algorithms. To do so, we draw on the concept of a Minimum Connected Dominating Set from graph theory, which is used in ad hoc wireless networks to address communication overhead. Specifically, we propose a new decentralized deep learning algorithm called minimum connected Dominating Set Model Aggregation (DSMA). We investigate the efficacy of our method on different communication graph topologies, with small to large numbers of agents and varied neural network architectures. Empirical results on benchmark data sets show a significant (up to 100X) reduction in communication time while preserving, and in some cases increasing, accuracy compared to state-of-the-art methods. We also present an analysis showing the convergence of the proposed algorithm.
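
The communication pattern the abstract describes is easy to illustrate: compute a small connected dominating set (CDS) of the communication graph, then route all aggregation traffic through those agents, so every agent either aggregates or sits one hop from an aggregator. The sketch below is a minimal illustration of this idea, not the paper's implementation: the greedy CDS heuristic, the aggregation rule, and the helper names `greedy_cds` and `dsma_round` are all assumptions for demonstration, and the code assumes a connected graph.

```python
import networkx as nx
import numpy as np

def greedy_cds(G):
    """Greedy connected-dominating-set heuristic (assumes G is connected).

    Start from a maximum-degree node, then repeatedly add the neighbor of
    the current set that dominates the most still-uncovered nodes; every
    added node is adjacent to the set, so the result stays connected."""
    start = max(G.nodes, key=G.degree)
    cds = {start}
    covered = {start} | set(G.neighbors(start))
    while covered != set(G.nodes):
        # Candidates adjacent to the current CDS preserve connectivity.
        frontier = {v for u in cds for v in G.neighbors(u)} - cds
        best = max(frontier, key=lambda v: len(set(G.neighbors(v)) - covered))
        cds.add(best)
        covered |= {best} | set(G.neighbors(best))
    return cds

def dsma_round(G, params, cds):
    """One illustrative aggregation round routed through the CDS.

    Each CDS agent averages its parameters with its neighbors'; every
    non-CDS agent then pulls the result from an adjacent CDS agent, so
    no parameters are exchanged between two non-CDS agents."""
    updated = dict(params)
    for u in cds:
        group = [u] + list(G.neighbors(u))
        updated[u] = np.mean([params[v] for v in group], axis=0)
    for v in set(G.nodes) - cds:
        # A dominator exists for every node by definition of a dominating set.
        dominator = next(u for u in G.neighbors(v) if u in cds)
        updated[v] = updated[dominator]
    return updated

if __name__ == "__main__":
    # Toy connected topology standing in for a sensor network.
    G = nx.connected_watts_strogatz_graph(20, 4, 0.3, seed=1)
    params = {v: np.random.randn(4) for v in G.nodes}  # one "model" per agent
    cds = greedy_cds(G)
    print(f"CDS covers {len(cds)} of {G.number_of_nodes()} agents")
    params = dsma_round(G, params, cds)
```

Because only CDS agents transmit aggregated parameters, the number of heavyweight exchanges per round scales with the CDS size rather than with the number of edges, which is the source of the communication savings the paper reports.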
