
Backpropagation-free training of deep physical neural networks


Abstract

Recent successes in deep learning for vision and natural language processing are attributed to larger models, but these come with energy consumption and scalability issues. Current training of digital deep learning models relies primarily on backpropagation, which is unsuitable for physical implementation. Here, we propose a simple deep neural network architecture augmented by a physical local learning (PhyLL) algorithm, enabling supervised and unsupervised training of deep physical neural networks without detailed knowledge of the nonlinear physical layer's properties. We trained diverse wave-based physical neural networks in vowel and image classification experiments, demonstrating the universality of our approach. Our method offers advantages over other hardware-aware training schemes: it improves training speed, enhances robustness, and reduces power consumption by eliminating the need for system modelling, thereby decreasing digital computation.
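To make the idea of layer-local, backpropagation-free training more concrete, below is a minimal sketch in the spirit of PhyLL. It assumes the physical layer is an unknown black box (emulated here by a fixed random nonlinear map) and that trainable digital weights sit after each physical block, so no gradient ever has to flow through the physics. The goodness function, label embedding, network sizes, and hyperparameters are illustrative stand-ins, not the paper's exact recipe.

```python
# Minimal sketch of layer-local, backpropagation-free training in the spirit
# of PhyLL. The physical layer is treated as an unknown black box (emulated
# here by a fixed random nonlinear map). Trainable digital weights sit AFTER
# each physical block, so gradients never flow through the physics.
# The goodness/loss form and all hyperparameters are illustrative assumptions.
import torch

torch.manual_seed(0)

DIM, N_LAYERS, N_CLASSES = 32, 3, 4

def physical_layer(x, seed):
    """Stand-in for an unknown nonlinear physical system (output only, no gradients)."""
    g = torch.Generator().manual_seed(seed)
    W = torch.randn(x.shape[-1], DIM, generator=g)
    return torch.tanh(x @ W).detach()          # black box: we only observe its output

digital = [torch.nn.Linear(DIM, DIM) for _ in range(N_LAYERS)]
opts = [torch.optim.SGD(l.parameters(), lr=0.05) for l in digital]

def embed_label(x, y):
    """Overwrite the first N_CLASSES inputs with a one-hot label (forward-forward style)."""
    x = x.clone()
    x[:, :N_CLASSES] = torch.nn.functional.one_hot(y, N_CLASSES).float()
    return x

def train_step(x, y):
    y_neg = (y + torch.randint(1, N_CLASSES, y.shape)) % N_CLASSES  # wrong labels
    h_pos, h_neg = embed_label(x, y), embed_label(x, y_neg)
    for i, (layer, opt) in enumerate(zip(digital, opts)):
        z_pos = layer(physical_layer(h_pos, seed=i))   # physics, then trainable weights
        z_neg = layer(physical_layer(h_neg, seed=i))
        # Layer-local contrastive goodness: push positive-pass energy up and
        # negative-pass energy down (a simple forward-forward-style surrogate).
        loss = torch.nn.functional.softplus(
            (z_neg ** 2).mean(1) - (z_pos ** 2).mean(1)).mean()
        opt.zero_grad(); loss.backward(); opt.step()
        h_pos, h_neg = z_pos.detach(), z_neg.detach()  # stop gradients between layers

x = torch.randn(16, DIM)
y = torch.randint(0, N_CLASSES, (16,))
for _ in range(100):
    train_step(x, y)
```

Because each layer's objective depends only on that layer's own output, the unknown transfer function of the physical system never needs to be modeled or differentiated, which is the property the abstract highlights.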
