Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

Title: Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
Publication Type: Journal Article
Year of Publication: 2002
Authors: Maass, W., T. Natschlaeger, and H. Markram
Journal: Neural Computation
Volume: 14
Issue: 11
Pagination: 2531–2560
Keywords: LSM
Abstract

A key challenge for neural modeling is to explain how a continuous stream of multi-modal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new framework for neural computation that provides an alternative to previous approaches based on attractor neural networks. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a neural circuit may serve as a universal source of information about past stimuli, from which readout neurons can extract particular aspects needed for diverse tasks in real time. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between discrete internal states. Like the Turing machine paradigm, it allows for universal computational power under idealized conditions, but for real-time processing of time-varying input. The resulting new framework for neural computation has novel implications for the interpretation of neural coding, for the design of experiments and data analysis in neurophysiology, and for neuromorphic engineering.
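
To make the framework concrete, below is a minimal sketch of the liquid-state-machine idea in Python: a fixed random recurrent "liquid" is driven by a time-varying input stream, and a memoryless linear readout is trained to map the transient liquid state x(t) to a stable target output at every time step. The rate-based tanh units, the network sizes, and the delayed-XOR task are illustrative assumptions made here for brevity, not the paper's integrate-and-fire circuit or experimental setup.

```python
# Minimal liquid-state-machine-style sketch (assumed rate-based reservoir,
# not the paper's spiking circuit). A fixed random recurrent network acts
# as the "liquid"; a linear readout is trained to extract task-relevant
# information from its transient states in real time.
import numpy as np

rng = np.random.default_rng(0)

N = 200      # liquid (reservoir) units
T = 2000     # time steps of the input stream
leak = 0.3   # leak rate of the discrete-time liquid dynamics

# Fixed random input and recurrent weights; the recurrent matrix is
# rescaled so the dynamics stay in a useful transient regime.
W_in = rng.normal(0.0, 1.0, size=N)
W = rng.normal(0.0, 1.0, size=(N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Time-varying binary input and a target that needs fading memory:
# here (an assumed example task) the XOR of the inputs 3 and 6 steps back.
u = rng.integers(0, 2, size=T).astype(float)
y = np.roll(u, 3).astype(int) ^ np.roll(u, 6).astype(int)

# Run the liquid and record its transient states x(t).
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = (1.0 - leak) * x + leak * np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Train a memoryless linear readout by ridge regression on the first half
# of the stream, then evaluate on the second half: the readout turns
# transient liquid states into a stable task-specific output.
split = T // 2
X_tr, X_te = states[:split], states[split:]
y_tr, y_te = y[:split], y[split:]
reg = 1e-2
W_out = np.linalg.solve(X_tr.T @ X_tr + reg * np.eye(N), X_tr.T @ y_tr)
pred = (X_te @ W_out > 0.5).astype(int)
print("readout accuracy on held-out stream:", (pred == y_te).mean())
```

In this sketch the liquid itself is never trained for the task; only the linear readout is fit, which mirrors the paper's point that a generic high-dimensional dynamical system can serve as a shared, task-independent source of information about recent inputs.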