PhD student working on computational aspects of differential, geometric, and algebraic structures (e.g., probability distributions and matrices). My research so far has focused mainly on geometric methods for approximate inference and numerical optimization in machine learning.
For natural-gradient (NG) methods, please see
- Structured NG descent for deep learning (arXiv 2023): Paper
- Structured NG descent (ICML 2021): Long Talk, Short Talk, Paper, Blog
- Riemannian gradient descent (ICML 2020): Talk, Paper
- NG descent for exponential-family mixtures (ICML 2019): Paper
- NG descent for Bayesian deep learning (ICML 2018): Paper
- NG variational inference for non-conjugate models (AI&Stats 2017): Paper
For an introduction to NG methods, see my Blog.
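As a quick taste of the idea (an illustrative sketch only, not code from any of the papers above): natural-gradient descent rescales the ordinary gradient by the inverse Fisher information of the model. For a 1-D Gaussian in the (mean, variance) parameterization, the Fisher matrix is diagonal and the update can be written in closed form; the data, learning rate, and iteration count below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.5, size=1000)  # synthetic data from N(2, 1.5^2)

mu, var = 0.0, 1.0  # initial parameters of the model N(mu, var)
lr = 0.1
for _ in range(200):
    # Euclidean gradients of the average log-likelihood w.r.t. (mu, var)
    g_mu = np.mean(data - mu) / var
    g_var = np.mean((data - mu) ** 2 - var) / (2 * var**2)
    # The Fisher information of a Gaussian in (mu, var) is diag(1/var, 1/(2 var^2)),
    # so the natural gradient is the Euclidean gradient rescaled by its inverse.
    mu += lr * var * g_mu
    var += lr * 2 * var**2 * g_var

print(mu, var)  # approaches the sample mean and variance of the data
```

Note that the natural-gradient update for the mean reduces to `mu += lr * (mean(data) - mu)`, i.e., it is invariant to the local curvature in a way the plain gradient step is not; this parameterization-aware behavior is what the methods above exploit at scale.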
For more publications, see my Google Scholar page.
I review papers for conferences (ICML, NeurIPS, ICLR, AI&Stats), journals (JMLR, Information Geometry (Springer)), and workshops (Optimization for Machine Learning, Advances in Approximate Bayesian Inference).
I am interested in exploiting (hidden) structures and symmetries in machine learning with a focus on practical and numerical methods for optimization and statistical inference.
- Aug 31, 2021: I held a reading group on geometric structures in machine learning at the UBC Machine Learning Reading Group.
- Jul 2, 2021: New workshop paper, Structured second-order methods via natural gradient descent, is out. It will be presented at the Beyond First-Order Methods in ML Systems workshop at ICML 2021; see the spotlight talk.
- Jun 30, 2021: Tractable structured natural gradient descent using local parameterizations accepted at ICML 2021!
- Jun 30, 2020: Handling the positive-definite constraint in the Bayesian learning rule accepted at ICML 2020; see the ICML talk.
- Jun 30, 2019: Fast and simple natural-gradient variational inference with mixture of exponential-family approximations accepted at ICML 2019!