Visual motion processing and perceptual decision making

35th Annual Conference of the Cognitive Science Society, 2013

Aziz Hurzook, Oliver Trujillo, Chris Eliasmith


We present a novel, biologically plausible model of visual motion processing and perceptual decision making that is independent of the number of choice categories or alternatives. The implementation is a large-scale spiking neural circuit consisting of: 1) a velocity filter using the principle of oscillator interference to determine the direction and speed of pattern motion in V1; 2) a representation of motion evidence in the middle temporal area (MT); and 3) integration of sensory evidence over time by a higher-dimensional attractor network in the lateral intraparietal area (LIP). We demonstrate the model by reproducing behavioral and neural results from classic perceptual decision making experiments that test the perceived direction of motion of variable-coherence random-dot kinematograms. Specifically, these results capture monkey data from two-alternative forced-choice motion decision tests. We note that without any reconfiguration of the circuit, the implementation can be used to make decisions among a continuum of alternatives.
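The decision stage described above, in which LIP accumulates MT motion evidence over time until a choice is made, can be illustrated with a minimal rate-based sketch. This is not the authors' spiking attractor implementation; it is a simplified drift-diffusion stand-in for the LIP integration stage in a two-alternative forced-choice task, and all parameter names and values (gain, noise level, decision bound) are illustrative assumptions, not taken from the paper.

```python
import random

def decide(coherence, gain=0.1, noise=1.0, bound=30.0, max_steps=10000, seed=0):
    """Accumulate noisy momentary evidence until a decision bound is hit.

    coherence : signed motion strength (% coherent dots, positive for the
                preferred direction) -- illustrative, not the paper's units.
    Returns (choice, steps): choice is +1 (preferred direction) or -1.
    """
    rng = random.Random(seed)
    x = 0.0  # accumulated evidence (stand-in for LIP firing-rate difference)
    for t in range(1, max_steps + 1):
        # momentary evidence: drift proportional to coherence, plus noise
        x += gain * coherence + rng.gauss(0.0, noise)
        if abs(x) >= bound:
            return (1 if x > 0 else -1), t
    # bound never reached: report the sign of the accumulated evidence
    return (1 if x > 0 else -1), max_steps

if __name__ == "__main__":
    # Higher coherence yields faster, more accurate choices, the
    # qualitative pattern seen in the monkey 2AFC experiments.
    for c in (51.2, 12.8, 3.2):
        choice, steps = decide(c, seed=1)
        print(f"coherence={c:5.1f}  choice={choice:+d}  steps={steps}")
```

A continuum of alternatives, as noted in the abstract, would correspond to accumulating evidence in a higher-dimensional state space rather than along a single signed axis; this one-dimensional sketch captures only the two-alternative case.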
