Adjusting Learning Rate of Memristor-Based Multilayer Neural Networks via Fuzzy Method

Publication Type:
Journal Article
Citation:
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2019, 38 (6), pp. 1084-1094
Issue Date:
2019-06-01
Abstract:
Back propagation (BP) based on stochastic gradient descent is the prevailing method for training multilayer neural networks (MNNs) with hidden layers. However, the physical separation between memory arrays and the arithmetic module makes BP inefficient and ineffective to implement in conventional digital hardware. Although CMOS may alleviate some problems of the hardware implementation of MNNs, CMOS-based synapses consume too much power and area in very large scale integrated circuits. As a novel device, the memristor shows promise in overcoming this shortcoming due to its ability to closely integrate processing and memory. This paper proposes a novel circuit that implements a synapse with one memristor and two MOSFET transistors (p-type and n-type). Compared with a CMOS-only circuit, the proposed one reduces area consumption by 92%-98%. In addition, we develop a fuzzy method for adjusting the learning rates of MNNs, which increases learning accuracy by 2%-3% compared with a constant learning rate. Because it relies on approximate reasoning, the fuzzy adjustment method is also robust and insensitive to parameter changes. Furthermore, the proposed methods can be extended to memristor-based multilayer convolutional neural networks for complex tasks. The resulting architecture behaves in a human-like thinking process.
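The abstract does not reproduce the paper's fuzzy rule base, so the following is a minimal sketch of how a fuzzy learning-rate controller of this general kind can work, assuming a Mamdani-style rule set keyed on the change in training error with weighted-average defuzzification. The membership breakpoints, gains, and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def left_shoulder(x, b, c):
    """Membership that is 1 for x <= b and falls linearly to 0 at c."""
    return float(np.clip((c - x) / (c - b), 0.0, 1.0))

def right_shoulder(x, a, b):
    """Membership that is 0 for x <= a and rises linearly to 1 at b."""
    return float(np.clip((x - a) / (b - a), 0.0, 1.0))

def triangular(x, a, b, c):
    """Membership peaking at b, zero outside [a, c]."""
    return float(max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def fuzzy_lr_update(lr, delta_error, lr_min=1e-4, lr_max=1e-1):
    """One fuzzy adjustment step for the learning rate.

    delta_error = error(t) - error(t-1), normalized to roughly [-1, 1].
    Rules: error falling -> grow lr; error flat -> keep lr;
           error rising  -> shrink lr.
    """
    # Fuzzify the error change into three linguistic terms (illustrative ranges).
    falling = left_shoulder(delta_error, -0.5, 0.0)    # error decreasing
    flat    = triangular(delta_error, -0.5, 0.0, 0.5)  # error unchanged
    rising  = right_shoulder(delta_error, 0.0, 0.5)    # error increasing

    weights = np.array([falling, flat, rising])
    gains   = np.array([1.1, 1.0, 0.7])  # consequents: multiply lr by gain

    # Weighted-average defuzzification of the rule consequents.
    gain = float(weights @ gains) / (float(weights.sum()) + 1e-12)
    return float(np.clip(lr * gain, lr_min, lr_max))

# Example: after each epoch, feed the change in training error back in.
lr = 0.01
for delta in [-0.8, -0.2, 0.05, 0.4]:
    lr = fuzzy_lr_update(lr, delta)
    print(f"delta_error={delta:+.2f} -> lr={lr:.5f}")
```

Because the rule consequents blend smoothly as the memberships overlap, small changes to the breakpoints or gains shift the resulting learning rate only gradually, which is consistent with the paper's claim that the fuzzy adjustment is robust to parameter changes.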