Sentence processing in spiking neurons: A biologically plausible left-corner parser

36th Annual Conference of the Cognitive Science Society, 2014

Terrence C. Stewart, Xuan Choo, Chris Eliasmith

Abstract

A long-standing challenge in cognitive science is explaining how neurons could support the flexible, structured processing that is the hallmark of cognition. We present a spiking neural model that takes an input sequence of words (a sentence) and produces a structured, tree-like representation indicating the parts of speech it has identified and their relations to one another. While this system is based on a standard left-corner parser for constituency grammars, the neural nature of the model leads to new capabilities not seen in classical implementations. For example, the model's performance degrades gracefully as the sentence structure grows larger. Unlike previous attempts at building neural parsing systems, this model is highly robust to neural damage, can be applied to any binary constituency grammar, and requires relatively few neurons (~150,000).
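The abstract refers to the standard (classical) left-corner parsing algorithm for binary constituency grammars. As a point of reference, a minimal non-neural left-corner recognizer can be sketched in Python; the toy grammar, lexicon, and function names below are illustrative assumptions, not the grammar or implementation used in the paper:

```python
# Illustrative binary constituency grammar and lexicon (hypothetical example,
# not taken from the paper).
GRAMMAR = {
    "S":  [("NP", "VP")],
    "NP": [("Det", "N")],
    "VP": [("V", "NP")],
}
LEXICON = {"the": "Det", "dog": "N", "cat": "N", "chased": "V"}

def parse(words, goal="S"):
    """Return a parse tree if `words` form a `goal` constituent, else None."""
    tree, rest = _parse_goal(goal, words)
    return tree if tree is not None and not rest else None

def _parse_goal(goal, words):
    # Left-corner step 1: shift the next word bottom-up, then project
    # its category upward toward the top-down `goal`.
    if not words:
        return None, words
    word, rest = words[0], words[1:]
    cat = LEXICON.get(word)
    if cat is None:
        return None, words
    return _project(cat, (cat, word), rest, goal)

def _project(cat, tree, words, goal):
    # Left-corner step 2: if the recognized category matches the goal,
    # we are done (this simple sketch does not handle left recursion).
    if cat == goal:
        return tree, words
    # Otherwise, try each rule whose left corner is `cat`, predict its
    # right sibling top-down, and keep projecting the parent upward.
    for parent, rules in GRAMMAR.items():
        for left, right in rules:
            if left == cat:
                right_tree, rest = _parse_goal(right, words)
                if right_tree is not None:
                    result = _project(parent, (parent, tree, right_tree),
                                      rest, goal)
                    if result[0] is not None:
                        return result
    return None, words
```

For example, `parse(["the", "dog", "chased", "the", "cat"])` yields a nested tuple tree rooted in `S`, while an ungrammatical sequence such as `["dog", "chased"]` yields `None`. The left-corner strategy's mix of bottom-up shifts and top-down predictions is what keeps its memory demands modest, a property the paper's spiking implementation inherits.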


Conference Proceedings

Booktitle: 36th Annual Conference of the Cognitive Science Society
Organization: Cognitive Science Society
Pages: 1533–1538
