My research interests centre on extending the
Neural Engineering Framework (NEF) to work with more detailed neuron models, and on using adaptive
filtering as a model of sensory inference in the brain.
Regarding my work on more detailed neuron models, I have been working on incorporating simple
multi-compartment models with conductance-based synapses into the NEF. The more complex interaction
between excitation and inhibition can be systematically exploited to compute functions such as
multiplication. Most of this work has been implemented in NengoBio,
an extension library for Nengo facilitating the construction of biologically constrained models.
I have adapted some of this work to construct a model of eyeblink conditioning in the cerebellum.
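As a minimal illustration of why conductance-based synapses support multiplication-like computation (this is a generic single-compartment sketch with placeholder parameters, not the NengoBio implementation), consider the steady-state potential of a passive dendritic compartment receiving excitatory and inhibitory conductance input:

```python
# Passive compartment with conductance-based synapses. The equilibrium
# potential follows from the current balance
#   g_L (E_L - v) + g_E (E_E - v) + g_I (E_I - v) = 0.
# All parameter values are illustrative placeholders, not fitted values.
g_L, E_L = 50e-9, -65e-3   # leak conductance (S) and reversal potential (V)
E_E, E_I = 0e-3, -75e-3    # excitatory / inhibitory reversal potentials (V)

def v_eq(g_E, g_I):
    """Equilibrium membrane potential of the passive compartment."""
    return (g_L * E_L + g_E * E_E + g_I * E_I) / (g_L + g_E + g_I)

# Because g_E and g_I appear in the denominator, v_eq is a rational
# (divisive) function of the two inputs: shunting inhibition scales the
# effect of excitation rather than merely subtracting from it.
g_E = 100e-9
depol_alone = v_eq(g_E, 0.0) - v_eq(0.0, 0.0)         # effect of excitation
depol_shunted = v_eq(g_E, 200e-9) - v_eq(0.0, 200e-9)  # same input, shunted
print(depol_alone, depol_shunted)
```

The same excitatory input depolarises the compartment less when inhibition is active; it is this non-additive interaction between the two input channels that can be systematically recruited to approximate products of represented quantities.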
Regarding my work on adaptive filtering, I am focusing on predictive neural networks based on adaptive state observers.
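For readers unfamiliar with adaptive filtering, the idea can be sketched with a textbook least-mean-squares (LMS) filter that learns to predict the next sample of a signal from its recent past. This is a generic illustration, not the observer networks themselves; the window length and learning rate below are arbitrary choices.

```python
import numpy as np

# Noisy sinusoid to be predicted one step ahead.
rng = np.random.default_rng(0)
t = np.arange(2000) * 1e-2
x = np.sin(t) + 0.05 * rng.standard_normal(t.size)

n_taps, eta = 8, 0.1       # window length and LMS learning rate
w = np.zeros(n_taps)       # filter weights, adapted online
errors = []
for i in range(n_taps, x.size - 1):
    window = x[i - n_taps:i]   # past samples (the filter input)
    pred = w @ window          # one-step prediction
    err = x[i] - pred          # prediction error
    w += eta * err * window    # LMS weight update
    errors.append(err * err)

# The squared prediction error shrinks as the filter adapts.
print(np.mean(errors[:100]), np.mean(errors[-100:]))
```

The filter continuously corrects itself using its own prediction error, which is the same error-driven principle underlying the state observers mentioned above.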
Apart from these focus areas, I am interested in a broad range of topics including (but not limited to)
neuromorphic hardware, neurorobotics, and unsupervised learning strategies.
I defended my PhD thesis in December 2021; please find my thesis here.
I taught SYDE 556/750 "Simulating Neurobiological Systems" in Winter 2020.
See here for the lecture notes and slides I prepared for the course.
Prior to joining the CNRG in January 2017, I was a research assistant at the
Cognitronics and Sensor Systems Group in Bielefeld,
Germany, where I worked on neuromorphic hardware benchmarks and associative memory as part of the
HBP neuromorphics subproject.
In my free time, I am either reading, building free software, tinkering with electronics,
writing terrible German fantasy novels, or wandering around aimlessly, getting lost.
Publications
Theses
Journal Articles
-
Nicole Sandra-Yaffa Dumont,
Andreas Stöckel,
P. Michael Furlong,
Madeleine Bartlett,
Chris Eliasmith,
Terrence C. Stewart
(2023)
Biologically-Based Computation: How Neural Details and Dynamics Are Suited for Implementing a Variety of Algorithms.
Brain Sciences, 13(2):245.
Abstract
PDF
DOI
External link
-
Andreas Stöckel,
Chris Eliasmith
(2022)
Computational properties of multi-compartment LIF neurons with passive dendrites.
Neuromorphic Computing and Engineering.
Abstract
DOI
-
Andreas Stöckel,
Terrence C. Stewart,
Chris Eliasmith
(2021)
Connecting Biological Detail with Neural Computation: Application to the Cerebellar Granule-Golgi Microcircuit.
Topics in Cognitive Science, 13(3):515–533.
Abstract
PDF
DOI
-
Andreas Stöckel,
Chris Eliasmith
(2021)
Passive Nonlinear Dendritic Interactions as a Computational Resource in Spiking Neural Networks.
Neural Computation, 33(1):96–128. PMID: 33080158.
Abstract
DOI
-
Andreas Stöckel,
Christoph Jenzen,
Michael Thies,
Ulrich Rückert
(2017)
Binary Associative Memories as a Benchmark for Spiking Neuromorphic Hardware.
Frontiers in Computational Neuroscience, 11:71.
Abstract
DOI
External link
Conference and Workshop Papers
-
Andreas Stöckel,
Terrence C. Stewart,
Chris Eliasmith
(2020)
A Biologically Plausible Spiking Neural Model of Eyeblink Conditioning in the Cerebellum.
In 42nd Annual Meeting of the Cognitive Science Society, 1614–1620. Toronto, ON. Cognitive Science Society.
Abstract
PDF
Poster
-
Andreas Stöckel,
Terrence C. Stewart,
Chris Eliasmith
(2020)
Connecting Biological Detail with Neural Computation: Application to the Cerebellar Granule-Golgi Microcircuit.
In 18th Annual Meeting of the International Conference on Cognitive Modelling, 277–282. Toronto, ON. Society for Mathematical Psychology.
Abstract
PDF
-
Andreas Stöckel,
Terrence C. Stewart,
Chris Eliasmith
(2019)
A Geometric Interpretation of Feedback Alignment.
In 41st Annual Meeting of the Cognitive Science Society, 3366. Montreal, QC. Cognitive Science Society. Abstract.
Abstract
PDF
Poster
-
Nolan P. Shaw,
Andreas Stöckel,
Ryan Orr,
Thomas F. Lidbetter,
Robin Cohen
(2018)
Towards Provably Moral AI Agents in Bottom-up Learning Frameworks.
In AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society. New Orleans, USA.
Abstract
PDF
External link
-
Andreas Stöckel,
Aaron R. Voelker,
Chris Eliasmith
(2018)
Nonlinear synaptic interaction as a computational resource in the Neural Engineering Framework.
In Cosyne Abstracts. Denver, USA.
Abstract
PDF
Poster
External link
Technical Reports and Preprints
-
P. Michael Furlong,
Andreas Stöckel,
Terrence C. Stewart,
Chris Eliasmith
(2022)
Learned Legendre Predictor: Learning with Compressed Representations for Efficient Online Multistep Prediction.
Technical Report, Centre for Theoretical Neuroscience.
Abstract
PDF
External link
-
Andreas Stöckel
(2021)
Discrete Function Bases and Convolutional Neural Networks.
arXiv preprint arXiv:2103.05609.
Abstract
PDF
arXiv
-
Andreas Stöckel
(2021)
Constructing Dampened LTI Systems Generating Polynomial Bases.
arXiv preprint arXiv:2103.00051.
Abstract
PDF
arXiv
-
Andreas Stöckel
(2020)
Assorted Notes on Radial Basis Functions.
Technical Report, Centre for Theoretical Neuroscience, Waterloo, ON.
Abstract
PDF
DOI
External link
-
Andreas Stöckel
(2017)
Finding Tuning Curves for Point Neurons with Conductance-Based Synapses.
Technical Report, Centre for Theoretical Neuroscience, Waterloo, ON.
Abstract
PDF
DOI
External link
-
Andreas Stöckel,
Aaron R. Voelker,
Chris Eliasmith
(2017)
Point Neurons with Conductance-Based Synapses in the Neural Engineering Framework.
arXiv preprint arXiv:1710.07659.
Abstract
PDF
arXiv