About Me
PhD student working on computational aspects of differential, geometric, and algebraic structures (e.g., probability distributions and matrices). My research so far has mostly focused on geometric methods for approximate inference and numerical optimization in machine learning.
For my work on natural-gradient (NG) methods, please see:
- Structured NG descent for deep learning (ICML 2023): Paper, Code
- Structured NG descent (ICML 2021): Long Talk, Short Talk, Paper, Blog
- Riemannian gradient descent (ICML 2020): Talk, Paper, Code
- NG descent for exponential-family mixtures (ICML 2019): Paper, Code
- NG descent for Bayesian deep learning (ICML 2018): Paper, Code
- NG variational inference for non-conjugate models (AI&Stats 2017): Paper, Code
For an introduction to NG methods, see my Blog.
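In a nutshell (this is the standard textbook form, not a formulation specific to any one paper above; the symbols $\beta$, $F$, and $\mathcal{L}$ are notation chosen here for illustration), NG descent preconditions the ordinary gradient with the inverse Fisher information matrix, so each step is steepest descent measured by KL divergence rather than Euclidean distance:

$$
\theta_{t+1} \;=\; \theta_t \;-\; \beta\, F(\theta_t)^{-1}\, \nabla_\theta \mathcal{L}(\theta_t),
\qquad
F(\theta) \;=\; \mathbb{E}_{x \sim p_\theta}\!\left[ \nabla_\theta \log p_\theta(x)\, \nabla_\theta \log p_\theta(x)^\top \right],
$$

where $\beta$ is the step size and $p_\theta$ is the model distribution. The papers above are largely about making the $F^{-1}$ computation tractable via structure.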
For more publications, see my Google Scholar page.
I review papers for conferences (ICML, NeurIPS, ICLR, AI&Stats), journals (JMLR, Information Geometry (Springer)), and workshops (Optimization for Machine Learning, Advances in Approximate Bayesian Inference).
Research Interests
I am interested in exploiting (hidden) structures and symmetries in machine learning, with a focus on practical and numerical methods for optimization and statistical inference.
News
Aug 31, 2021 | I led a session on geometric structures in machine learning at the UBC Machine Learning Reading Group.
Jul 2, 2021 | New workshop paper, Structured second-order methods via natural gradient descent, is out. It will be presented at the Beyond First-Order Methods in ML Systems workshop at ICML 2021; see the spotlight talk.
Jun 30, 2021 | Tractable structured natural gradient descent using local parameterizations was accepted at ICML 2021!
Jun 30, 2020 | Handling the positive-definite constraint in the Bayesian learning rule was accepted at ICML 2020; see the ICML talk.
Jun 30, 2019 | Fast and simple natural-gradient variational inference with mixture of exponential-family approximations was accepted at ICML 2019!