
Weakly correlated synapses promote dimension reduction in deep neural networks.

Abstract

By controlling synaptic and neural correlations, deep learning has achieved empirical success in improving classification performance. How synaptic correlations affect neural correlations to produce disentangled hidden representations, however, remains elusive. Here we propose a simplified model of dimension reduction that takes into account pairwise correlations among synapses, revealing the mechanism by which synaptic correlations affect dimension reduction. Our theory determines the scaling of the synaptic correlations, requiring only mathematical self-consistency, for both binary and continuous synapses. The theory also predicts that weakly correlated synapses encourage dimension reduction compared to their orthogonal counterparts. In addition, these synapses attenuate the decorrelation process along the network depth. These two computational roles are explained by a proposed mean-field equation. The theoretical predictions are in excellent agreement with numerical simulations, and the key features are also captured by deep learning with Hebbian rules.
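The effect described above can be probed numerically. The following is a minimal sketch, not the paper's actual model: it assumes weights with a weak uniform pairwise correlation `rho` (built from a shared Gaussian component plus independent noise), propagates random inputs through a deep tanh network, and tracks the effective dimensionality of each layer's representation via the participation ratio of the covariance spectrum. All function names and parameter values here are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_weights(n, rho, rng):
    """n x n weight matrix whose entries share a weak pairwise
    correlation rho: a common Gaussian component plus i.i.d. noise,
    scaled so each entry has variance ~ 1/n."""
    shared = rng.standard_normal()
    w = np.sqrt(rho) * shared + np.sqrt(1.0 - rho) * rng.standard_normal((n, n))
    return w / np.sqrt(n)

def participation_ratio(x):
    """Effective dimensionality of representations x (samples x units):
    PR = (sum lambda)^2 / sum lambda^2 over the covariance eigenvalues.
    PR ranges from 1 (one dominant mode) to n (isotropic)."""
    lam = np.linalg.eigvalsh(np.cov(x, rowvar=False))
    return lam.sum() ** 2 / (lam ** 2).sum()

n, depth, samples = 200, 10, 500
x0 = rng.standard_normal((samples, n))

for rho in (0.0, 0.05):  # orthogonal-like vs weakly correlated synapses
    h = x0.copy()
    prs = []
    for _ in range(depth):
        h = np.tanh(h @ correlated_weights(n, rho, rng))
        prs.append(participation_ratio(h))
    print(f"rho={rho}: PR at layer 1 = {prs[0]:.1f}, layer {depth} = {prs[-1]:.1f}")
```

Comparing the participation-ratio profiles for `rho = 0` and a small positive `rho` gives a quick empirical handle on whether weak synaptic correlations compress the representation faster across depth, which is the qualitative prediction the abstract states.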
