A Neurally Plausible Encoding of Word Order Information into a Semantic Vector Space

35th Annual Conference of the Cognitive Science Society, 2013

Peter Blouw, Chris Eliasmith

Abstract

Distributed models of lexical semantics increasingly incorporate information about word order. One influential method for encoding this information into high-dimensional spaces uses convolution to bind together vectors, forming representations of the many n-grams in which a target word appears. The computational complexity of this method has led to the development of an alternative that uses random permutation to perform order-sensitive vector combinations. We describe a simplified form of order encoding with convolution that yields performance comparable to earlier models, and we discuss considerations of neural implementation that favor the use of the proposed encoding. We conclude that this new encoding method is a more neurally plausible alternative than its predecessors.
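To make the contrast concrete, the following is a minimal pure-Python sketch of the two encoding operations the abstract refers to: binding two word vectors with circular convolution, and marking word order with a fixed permutation before superposition. The vectors, names, and the specific permutation are illustrative assumptions, not the paper's implementation.

```python
import random

def circular_convolution(a, b):
    # Bind two vectors: c[k] = sum_j a[j] * b[(k - j) mod n].
    n = len(a)
    return [sum(a[j] * b[(k - j) % n] for j in range(n)) for k in range(n)]

def permute(v, perm):
    # Order-sensitive combination: reorder vector elements by a fixed permutation.
    return [v[i] for i in perm]

random.seed(0)
n = 8  # illustrative; real models use hundreds or thousands of dimensions
dog = [random.gauss(0, 1 / n ** 0.5) for _ in range(n)]
barks = [random.gauss(0, 1 / n ** 0.5) for _ in range(n)]

# Convolution-based binding of a bigram ("dog barks"):
bigram = circular_convolution(dog, barks)

# Permutation-based order encoding: permute "barks" before summing it into
# the context vector, marking it as occurring one position after "dog".
perm = list(range(1, n)) + [0]  # a simple rotation stands in for a random permutation
context = [d + b for d, b in zip(dog, permute(barks, perm))]
```

Circular convolution is commutative, which is why extra machinery (such as binding with position vectors) is needed to distinguish "dog barks" from "barks dog"; a permutation, by contrast, is inherently asymmetric, which is what makes it order-sensitive.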

Conference Proceedings

Booktitle: 35th Annual Conference of the Cognitive Science Society
Pages: 1905–1910
