Assorted Notes on Radial Basis Functions

Tech Report, 2020

Andreas Stöckel

Abstract

We discuss a minimal, unconstrained log-Cholesky parametrisation of radial basis functions (RBFs) and the corresponding partial derivatives. This is useful when using RBFs as part of a neural network that is either trained in a supervised fashion via error backpropagation, or in an unsupervised fashion using a homeostasis mechanism. We perform some experiments and discuss potential caveats when using RBFs in this way. Furthermore, we compare RBFs to the Spatial Semantic Pointer similarity, which can be used to construct networks with sparse hidden representations resembling those found in RBF networks.
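The core idea named in the abstract can be illustrated in a few lines. The sketch below (not taken from the report itself; the function names and parameter layout are illustrative assumptions) parametrises a Gaussian RBF's precision matrix as L Lᵀ, where L is lower triangular with its diagonal stored in log-space, so every real-valued parameter vector maps to a valid positive-definite precision and can be optimised without constraints:

```python
import numpy as np

def unpack_log_cholesky(theta, d):
    """Split an unconstrained vector theta (length d + d*(d+1)//2) into
    a mean mu and a lower-triangular Cholesky factor L of the precision
    matrix. The diagonal of L is stored in log-space, so any real-valued
    theta yields a positive-definite precision L @ L.T."""
    mu = theta[:d]
    L = np.zeros((d, d))
    rows, cols = np.tril_indices(d)
    L[rows, cols] = theta[d:]
    # Exponentiate the diagonal entries to guarantee positivity
    L[np.diag_indices(d)] = np.exp(np.diag(L))
    return mu, L

def rbf(x, theta, d):
    """Gaussian RBF with its precision parametrised via log-Cholesky."""
    mu, L = unpack_log_cholesky(theta, d)
    z = L.T @ (x - mu)  # whitened offset from the centre
    return np.exp(-0.5 * z @ z)
```

With all parameters set to zero, L is the identity, so the RBF reduces to an isotropic Gaussian of unit width centred at the origin; gradients with respect to theta can then be obtained by any autodiff framework or from the partial derivatives the report derives.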


CTN Tech Report

Month: November
Institution: Centre for Theoretical Neuroscience
Address: Waterloo, ON
Report number: CTN-TR-20201129-016
DOI: 10.13140/RG.2.2.27177.62563/1
