Abstract:
The biologically discovered intrinsic plasticity (IP) learning rule, which changes the intrinsic excitability of an individual neuron by adaptively tuning its firing threshold, has been shown to be crucial for efficient information processing. However, this learning rule requires extra update operations at every time step, causing additional energy consumption and reducing computational efficiency. Spiking neural networks (SNNs) use an event-driven, spike-based coding strategy: a neuron is active only when driven by incoming spike trains, and information is transmitted by all-or-none pulses (spikes), which makes neuron activations sparse. In this article, we propose two event-driven IP learning rules, namely, input-driven and self-driven IP, based on the basic IP rule. Input-driven means that an IP update occurs only when the neuron receives spiking inputs from its presynaptic neurons, whereas self-driven means that an IP update occurs only when the neuron generates a spike. A spiking convolutional neural network (SCNN) is developed using the ANN2SNN conversion method, i.e., converting a well-trained rate-based artificial neural network to an SNN by directly mapping the connection weights. By comparing the computational performance of SCNNs with different IP rules on the MNIST, FashionMNIST, Cifar10, and SVHN recognition datasets, we demonstrate that the two event-driven IP rules can remarkably reduce the number of IP update operations, contributing to sparse computation and accelerating the recognition process. This work may give insights into the modeling of brain-inspired SNNs for low-power applications.
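The contrast between the three update schedules can be sketched with a minimal leaky integrate-and-fire neuron. This is an illustrative toy, not the authors' implementation: the LIF dynamics, the threshold-nudging IP rule, and all parameter values (`tau_decay`, `w`, `eta`, `r_target`, the Bernoulli input rate) are assumptions chosen only to show when each rule triggers an update.

```python
import random

def simulate(mode, T=1000, p_in=0.1, seed=0):
    """Count IP threshold updates for one LIF neuron under three schedules.

    mode: 'basic' (update every step), 'input' (only on presynaptic spikes),
    or 'self' (only on output spikes). All dynamics/parameters are
    illustrative assumptions, not the paper's exact model.
    """
    rng = random.Random(seed)
    v, theta = 0.0, 0.4          # membrane potential, adaptive firing threshold
    tau_decay, w = 0.9, 0.5      # leak factor and input weight (assumed)
    eta, r_target = 0.01, 0.05   # IP learning rate and target firing rate (assumed)
    updates = spikes = 0
    for _ in range(T):
        s_in = 1 if rng.random() < p_in else 0   # Bernoulli presynaptic spike
        v = tau_decay * v + w * s_in             # leaky integration
        s_out = 1 if v >= theta else 0
        if s_out:
            v = 0.0                              # reset after firing
            spikes += 1
        # Event-driven gating: the IP update runs only when its trigger fires.
        do_update = (mode == 'basic'
                     or (mode == 'input' and s_in == 1)
                     or (mode == 'self' and s_out == 1))
        if do_update:
            theta += eta * (s_out - r_target)    # nudge threshold toward target rate
            updates += 1
    return updates, spikes
```

With the same input stream, the basic rule performs an update on all `T` steps, the input-driven rule only on steps with a presynaptic spike, and the self-driven rule only on steps with an output spike, which is the source of the sparser computation reported in the article.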
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Year: 2021
Issue: 5
Volume: 33
Page: 1986-1995
Impact Factor: 14.255 (JCR@2021)
Impact Factor: 10.200 (JCR@2023)
ESI Discipline: COMPUTER SCIENCE
ESI HC Threshold: 106
JCR Journal Grade:1
CAS Journal Grade:1
Cited Count:
WoS CC Cited Count: 30
SCOPUS Cited Count: 26
ESI Highly Cited Papers on the List: 0