Jun 2022: Debugging using Orthogonal Gradient Descent, Workshop on Updatable Machine Learning, ICML 2022
Oct 2021: Submitted work on scaling laws of LTI systems to ICLR; started work on a non-parametric convolution-augmented transformer. (Update: ICLR rejected.)
Aug 2021: MASc completed.
May 2021: ICML accepted! Started work at ABR!
April 2021: Submitted thesis.
Feb 2021: Submitted work to ICML.
Nov 2020: ICLR rejected...rip.
Sept 2020: Submitted work to ICLR.
May 2020: Started using the (modified) LMU for NLP applications.
March 2020: Covid!!
Jan 2020: Started LMU work.
Sept 2019: Started MASc in Systems Design Engineering.
April 2018: Obtained a bachelor's in Mathematical Physics from the University of Waterloo.
Conference and Workshop Papers
Debugging using Orthogonal Gradient Descent.
In Updatable Machine Learning, ICML. arXiv.
Parallelizing Legendre Memory Unit Training.
In Marina Meila and Tong Zhang, editors, Proceedings of the 38th International Conference on Machine Learning, volume 139 of Proceedings of Machine Learning Research, pages 1898–1907. PMLR.
Technical Reports and Preprints
Language Modeling using LMUs: 10x Better Data Efficiency or Improved Scaling Compared to Transformers.