Sign backpropagation: An on-chip learning algorithm for analog RRAM neuromorphic computing systems.

Abstract

Currently, powerful deep learning models usually require significant processor and memory resources, which leads to very high energy consumption. The emerging resistive random access memory (RRAM) has shown great potential for constructing scalable and energy-efficient neural networks. However, it is hard to port a high-precision neural network from conventional digital CMOS hardware to analog RRAM systems owing to the variability of RRAM devices. A suitable on-chip learning algorithm is therefore needed to retrain the network or improve its performance. In addition, how to integrate the peripheral digital computations with the analog RRAM crossbar remains a challenge. Here, we propose an on-chip learning algorithm, named sign backpropagation (SBP), for RRAM-based multilayer perceptrons (MLPs) with binary interfaces (0, 1) in the forward process and 2-bit interfaces (±1, 0) in the backward process. Simulation results show that the proposed method and architecture achieve classification accuracy on the MNIST dataset comparable to a standard MLP, while saving the area and energy cost of computing and storing intermediate results and exploiting the potential of the RRAM crossbar for neuromorphic computing.
Copyright Ā© 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
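
The abstract only names the quantization interfaces (binary {0, 1} activations forward, 2-bit {-1, 0, +1} error signals backward), not the full update rule. As a rough illustration of that idea, here is a minimal NumPy sketch of one training step for a two-layer MLP; the thresholds, the dead-zone parameter, and the weight-update form are our own assumptions for illustration, not the paper's exact SBP formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(x, threshold=0.0):
    # Forward interface: activations restricted to {0, 1},
    # emulating 1-bit conversion at the crossbar boundary
    return (x > threshold).astype(np.float32)

def sign_quantize(g, eps=1e-3):
    # Backward interface: 2-bit error signal in {-1, 0, +1};
    # the dead-zone eps (zeroing tiny gradients) is an assumption
    q = np.sign(g)
    q[np.abs(g) < eps] = 0.0
    return q

# Toy MNIST-style dimensions: 784-256-10
W1 = rng.normal(0, 0.1, (784, 256)).astype(np.float32)
W2 = rng.normal(0, 0.1, (256, 10)).astype(np.float32)
lr = 0.01

def train_step(x, y_onehot):
    global W1, W2
    # Forward pass with binary activations only
    h = binarize(x @ W1)
    out = h @ W2

    # Backward pass: quantize errors to {-1, 0, +1} before
    # propagating, so no high-precision intermediates are stored
    err_out = sign_quantize(out - y_onehot)
    err_hid = sign_quantize(err_out @ W2.T)

    # Updates are outer products of binary/ternary vectors,
    # which maps naturally onto incremental crossbar writes
    W2 -= lr * np.outer(h, err_out)
    W1 -= lr * np.outer(x, err_hid)

x = binarize(rng.random(784), 0.5)   # fake binary input vector
y = np.eye(10, dtype=np.float32)[3]  # fake one-hot label
train_step(x, y)
```

Because both the propagated signals and the stored activations are 1- or 2-bit, the periphery needs no high-precision ADCs or buffers for intermediate results, which is the source of the area and energy savings claimed above.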
