Improving Spiking Dynamical Networks: Accurate Delays, Higher-Order Synapses, and Time Cells

Neural Computation, 2018

Aaron R. Voelker, Chris Eliasmith

Abstract

Researchers building spiking neural networks face the challenge of improving the biological plausibility of their model networks while maintaining the ability to quantitatively characterize network behavior. In this work, we extend the theory behind the neural engineering framework (NEF), a method of building spiking dynamical networks, to permit the use of a broad class of synapse models while maintaining prescribed dynamics up to a given order. This theory improves our understanding of how low-level synaptic properties alter the accuracy of high-level computations in spiking dynamical networks. For completeness, we provide characterizations for both continuous-time (i.e., analog) and discrete-time (i.e., digital) simulations. We demonstrate the utility of these extensions by mapping an optimal delay line onto various spiking dynamical networks using higher-order models of the synapse. We show that these networks nonlinearly encode rolling windows of input history, using a scale-invariant representation, with accuracy depending on the frequency content of the input signal. Finally, we reveal that these methods provide a novel explanation of time cell responses during a delay task, which have been observed throughout the hippocampus, striatum, and cortex.
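As background for the "optimal delay line" mentioned above: a pure delay of theta seconds has the irrational transfer function e^(-theta*s), which must be approximated by a finite-order linear system before it can be mapped onto a network. The sketch below is an illustrative reconstruction, not the paper's exact construction: it builds a [4/4] Padé approximant of the delay, realizes it as a state-space system, and checks that it reproduces a delayed sine wave. (The parameter choices, theta = 0.1 s and a 3 Hz input, are arbitrary; the paper additionally maps such systems onto spiking neurons with higher-order synapses, which is not attempted here.)

```python
import numpy as np
from math import factorial
from scipy.interpolate import pade
from scipy.signal import lsim, tf2ss

# Approximate a pure delay e^(-theta*s) with a [4/4] Pade approximant.
theta = 0.1  # delay length in seconds (illustrative choice)
order = 4

# Taylor coefficients of e^(-theta*s) in ascending powers of s.
an = [(-theta) ** k / factorial(k) for k in range(2 * order + 1)]
p, q = pade(an, order)               # numerator p, denominator q (np.poly1d)
A, B, C, D = tf2ss(p.coeffs, q.coeffs)

# Drive the linear system with a 3 Hz sine and compare to the true delay.
T = np.linspace(0, 1, 2001)
U = np.sin(2 * np.pi * 3 * T)
_, y, _ = lsim((A, B, C, D), U, T)

settled = T > 5 * theta              # skip the initial transient
err = np.max(np.abs(y[settled] - np.sin(2 * np.pi * 3 * (T[settled] - theta))))
print(f"max delay error after transient: {err:.2e}")
```

Accuracy degrades as the input's frequency content grows relative to the approximation order, which is the frequency dependence the abstract refers to.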

Journal Article

Publisher: MIT Press
DOI: 10.1162/neco_a_01046
Journal: Neural Computation
Volume: 30
Number: 3
Month: March
Pages: 569-609
