An ultra-compact leaky-integrate-and-fire model for building spiking neural networks - CentraleSupélec
Journal article in Scientific Reports, 2019

An ultra-compact leaky-integrate-and-fire model for building spiking neural networks

Abstract

We introduce an ultra-compact electronic circuit that realizes the leaky-integrate-and-fire model of artificial neurons. Our circuit has only three active devices: two transistors and a silicon controlled rectifier (SCR). We demonstrate the implementation of biologically realistic features, such as spike-frequency adaptation, a refractory period, and voltage modulation of the spiking rate. All characteristic times can be controlled by the resistive parameters of the circuit. We built the circuit with off-the-shelf components and demonstrate that our ultra-compact neuron is a modular block that can be combined to build multi-layer deep neural networks. We also argue that our circuit has low power requirements, as it is normally off except during spike generation. Finally, we discuss the ultimate ultra-compact limit, which may be achieved by further replacing the SCR with Mott materials.

We are currently witnessing an ongoing technological revolution. The longstanding promise of artificial intelligent systems realized in neural networks is beginning to materialize [1]. Significant milestones have been overcome, such as the deep neural network algorithm AlphaGo beating the world champion of the board game Go. A neural network is a system of interconnected units inspired by the mammalian brain. The units, called neurons, perform a simple basic non-linear process, and their interconnections are called synapses [2]. Neural network systems are implemented either by running software on a conventional (super)computer, as with AlphaGo [3], or directly in hardware by dedicated integrated CMOS (VLSI) circuits [4]. A notable example of the latter is the chip TrueNorth, whose circuits emulate both synaptic and neuronal functionalities [5]. However, both strategies suffer from significant bottlenecks in achieving the massive scale needed to compete with a mammalian brain.
The amazing power efficiency of the human brain is often quoted: it counts 10^11 neurons and 10^15 synapses and requires only about 20 W to function. In contrast, running AlphaGo on a digital supercomputer requires on the order of hundreds of kW. Nevertheless, conventional electronics is not to be blamed for lack of efficiency, as the latest generation of microprocessors in modern digital computers and smartphones can integrate 10^10 transistors and consume less than 10 W. Moreover, the brain-inspired chip TrueNorth counts 5.4 × 10^9 transistors and consumes less than 0.07 W [5]. While this is impressive, the implementation of a circuit that emulates neuronal function currently requires a large number of transistors. In TrueNorth, each of its 4096 cores has 1.2 million transistors that implement 256 neurons; hence, a neuron requires about 10^4 transistors. This indicates that there is a need to explore ways of building efficient neuromorphic circuits with a significant reduction in the number of components. Such compact neuron models have been proposed, which typically require tens of transistors [6,7]. Here, we present a significant improvement along this direction and introduce an ultra-compact neuron model that brings the count of active devices down to three: two transistors and a silicon controlled rectifier (SCR), also called a thyristor. We identify the non-linear I-V characteristics and the gate of the SCR as the key features that enable a simple implementation of an electronic neuron with leaky-integrate-and-fire (LIF) model functionality [8].
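For orientation, the leaky-integrate-and-fire behavior that the circuit emulates can be sketched in software. The snippet below is an illustrative forward-Euler simulation of the textbook LIF equation tau·dV/dt = -V + R·I with a threshold, a reset, and an absolute refractory period; it is not the paper's circuit model, and all parameter values (tau, R, threshold, refractory time) are arbitrary assumptions chosen for the example.

```python
import numpy as np

def simulate_lif(i_input, dt=1e-4, tau=0.02, r=1e7,
                 v_th=1.0, v_reset=0.0, t_ref=0.002):
    """Forward-Euler integration of tau * dV/dt = -V + R*I,
    with firing threshold, reset, and absolute refractory period.
    Parameter values are illustrative only."""
    v = v_reset
    refractory = 0.0
    spike_times = []
    trace = np.empty(len(i_input))
    for k, i in enumerate(i_input):
        if refractory > 0.0:
            refractory -= dt          # hold V at reset during refractory period
            v = v_reset
        else:
            v += dt * (-v + r * i) / tau
            if v >= v_th:             # spike: record time, reset, go refractory
                spike_times.append(k * dt)
                v = v_reset
                refractory = t_ref
        trace[k] = v
    return trace, spike_times

# A constant 200 nA input drives the neuron above threshold (R*I = 2 V > 1 V),
# producing regular, periodic spiking.
i = np.full(5000, 2e-7)               # 0.5 s of input at dt = 0.1 ms
trace, spikes = simulate_lif(i)
```

Increasing the input current shortens the charging time to threshold and hence raises the firing rate, which is the voltage/current modulation of the spiking rate mentioned in the abstract; the refractory parameter caps the maximum rate.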

Dates and versions

hal-02268698, version 1 (11-03-2020)

Cite

M. Rozenberg, O. Schneegans, P. Stoliar. An ultra-compact leaky-integrate-and-fire model for building spiking neural networks. Scientific Reports, 2019, 9, ⟨10.1038/s41598-019-47348-5⟩. ⟨hal-02268698⟩