Spiking Neural Computing in Memristive Neuromorphic Platforms
Abstract Neuromorphic computing with Spiking Neural Networks (SNNs) has been
proposed as an alternative paradigm for future computing, aiming to overcome the
memory bottleneck of current computer architectures. Various spike-coding schemes
have been discussed to improve data transfer and data processing in neuro-inspired
computing paradigms. Choosing an appropriate network topology can improve the
performance of computation, recognition, and classification. The neuron model is
another important factor in designing and implementing SNN systems; a model is
selected for its simulation and implementation speed, its ability to be integrated
with the other elements of the network, and its suitability for scalable networks.
Learning algorithms, which train the network by modifying synaptic weights, are
a further significant consideration. Learning in neuromorphic architectures can
be improved both by improving the quality of the artificial synapse and by
refining learning algorithms such as spike-timing-dependent plasticity (STDP).
In this chapter we propose a new synapse box that can remember and forget.
Furthermore, since STDP is the most frequently used unsupervised training method
in SNNs, we review and analyze its variants: the temporal order of pre- and
postsynaptic spikes across a synapse within a time window defines the different
STDP rules, and the choice of rule for weight modification depends on stability
requirements as well as on Hebbian or anti-Hebbian competition. Finally, we
survey the most significant projects that have produced neuromorphic platforms
and discuss the advantages and disadvantages of each.
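To illustrate the pair-based STDP rule mentioned above, where the temporal order of pre- and postsynaptic spikes determines potentiation or depression, here is a minimal sketch. All parameter values (A_PLUS, A_MINUS, TAU) are illustrative assumptions, not values from the chapter:

```python
import math

# Illustrative pair-based STDP parameters (assumed values, not from the chapter)
A_PLUS = 0.01    # potentiation amplitude
A_MINUS = 0.012  # depression amplitude
TAU = 20.0       # time constant of the STDP window, in ms

def stdp_weight_change(t_pre, t_post):
    """Weight change for a single pre/post spike pair.

    A presynaptic spike shortly before a postsynaptic spike
    (t_post > t_pre) potentiates the synapse; the reverse
    order depresses it. The magnitude decays exponentially
    with the time difference.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU)   # pre before post: potentiation
    else:
        return -A_MINUS * math.exp(dt / TAU)  # post before pre: depression

# Example: pre fires at 10 ms, post at 15 ms -> positive weight change
dw = stdp_weight_change(10.0, 15.0)
```

The sign of the weight change depends only on spike order, and its magnitude shrinks as the spikes move further apart in time, which is the behavior the different STDP variants modify.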
Origin: files produced by the author(s)