About Me

Postdoctoral fellow working on computational aspects of differential, geometric, and algebraic structures (e.g., probability distributions and matrices). My research has mostly focused on geometric methods for numerical optimization and approximate inference in machine learning.

For natural-gradient (NG) methods, please see

  • Adaptive gradient methods as NGD (arXiv 2024): Paper
  • Structured NG descent for deep learning (ICML 2023): Paper, Code
  • Structured NG descent (ICML 2021): Paper, Blog
  • Riemannian gradient descent (ICML 2020): Paper, Code
  • NG descent for exponential-family mixtures (ICML 2019): Paper, Code
  • NG descent for Bayesian deep learning (ICML 2018): Paper, Code
  • NG variational inference for non-conjugate models (AISTATS 2017): Paper, Code

For an introduction to NG methods, see my Blog Posts.
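As a toy illustration of the natural-gradient idea (a minimal sketch, not taken from any of the papers above): preconditioning the Euclidean gradient of a Gaussian log-likelihood with the inverse Fisher information yields simple closed-form updates that, with step size 1, land exactly on the maximum-likelihood estimates.

```python
# Toy sketch: natural-gradient ascent on the average log-likelihood of a
# 1-D Gaussian N(mu, s2), updating one coordinate at a time.
# The Fisher information for (mu, s2) is diag(1/s2, 1/(2*s2**2)), so the
# natural gradient multiplies each Euclidean gradient by its inverse.
import random

random.seed(0)
data = [random.gauss(3.0, 2.0) for _ in range(10_000)]
n = len(data)

mu, s2 = 0.0, 1.0  # initial mean and variance
lr = 1.0           # with lr = 1 each coordinate jumps straight to its MLE

# mu step: natural gradient = s2 * (Euclidean gradient w.r.t. mu)
g_mu = sum(x - mu for x in data) / (n * s2)
mu = mu + lr * s2 * g_mu  # lands on the sample mean

# s2 step at the updated mu: natural gradient = 2*s2**2 * (gradient w.r.t. s2)
g_s2 = sum(-0.5 / s2 + (x - mu) ** 2 / (2 * s2 ** 2) for x in data) / n
s2 = s2 + lr * (2 * s2 ** 2) * g_s2  # lands on the (biased) MLE variance

print(mu, s2)
```

Plain gradient ascent on the same objective would need a carefully tuned step size and many iterations; the Fisher preconditioning makes the update invariant to how the Gaussian is parameterized, which is the point the blog posts develop in more depth.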

For more publications, see my Google Scholar page.

Research Interests

I am interested in exploiting (hidden) structures and symmetries in machine learning with a focus on practical and numerical methods for optimization and statistical inference.