Advanced Neuromorphic Computing with Single-Transistor-Based Electronic Neurons and Synapses
Hardware implementations of artificial neural networks (ANNs), the most advanced of which comprise millions of electronic neurons interconnected by hundreds of millions of electronic synapses, have achieved higher energy efficiency than classical computers in some small-scale, data-intensive computing tasks. State-of-the-art neuromorphic computers, such as Intel’s Loihi or IBM’s NorthPole, implement ANNs using bio-inspired neuron- and synapse-mimicking circuits built from complementary metal–oxide–semiconductor (CMOS) transistors, requiring at least 18 transistors per neuron and six per synapse. Simplifying the structure and size of these two building blocks would enable the construction of larger, more sophisticated, and more energy-efficient ANNs. In this talk I will explain how a single CMOS transistor can exhibit both neural and synaptic behaviours when biased in a specific, unconventional manner. By connecting one additional CMOS transistor in series, we build a versatile two-transistor cell with an adjustable neuro-synaptic response, which we name the neuro-synaptic random-access memory (NS-RAM) cell. This electronic performance comes with 100% yield and ultra-low device-to-device variability, owing to the maturity of the silicon CMOS platform used; no materials or devices alien to the CMOS process are required. These results represent a near-term solution for implementing efficient ANNs and an opportunity for CMOS circuit design and optimization in artificial intelligence applications.