BIDSA Hybrid Seminar: "Two applications of the variational form of Bayes theorem"
Haavard Rue (King Abdullah University of Science and Technology - KAUST)
December 15, 2021 | 12pm CET | Hybrid mode: 3-E4-SR03 (up to 20 participants) + Zoom
Abstract:
In this talk I will discuss the variational form of Bayes theorem by Zellner (1988). This result is the rationale behind the variational (approximate) inference scheme, although this is not always made clear in modern presentations. I will discuss two applications of this result.
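As a reminder, Zellner's result states that Bayes theorem can itself be written as the solution of an optimization problem over densities:
\[
p(\theta \mid y) \;=\; \arg\min_{q} \left\{ \mathbb{E}_{q}\!\left[-\log p(y \mid \theta)\right] \;+\; \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta)\right) \right\},
\]
where the minimum is over all probability densities q. Restricting q to a tractable family instead of all densities is what turns this exact identity into variational (approximate) inference.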
First, I will show how to do a low-rank mean correction within the INLA framework (with amazing results), which is essential for the next generation of the R-INLA software currently in development. Second, I will introduce the Bayesian learning rule, which unifies many machine-learning algorithms from fields such as optimization, deep learning, and graphical models. This includes classical algorithms such as ridge regression, Newton's method, and the Kalman filter, as well as modern deep-learning algorithms such as stochastic-gradient descent, RMSprop, and Dropout.
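Roughly speaking (a sketch only; see the arXiv reference below for the precise statement), the Bayesian learning rule is a natural-gradient update of the natural parameter lambda of an exponential-family candidate posterior q_lambda, with learning rate rho and loss ell:
\[
\lambda \;\leftarrow\; \lambda \;-\; \rho\,\widetilde{\nabla}_{\lambda}\left\{ \mathbb{E}_{q_\lambda}\!\left[\ell(\theta)\right] \;-\; \mathcal{H}(q_\lambda) \right\},
\]
where \(\mathcal{H}\) denotes entropy and \(\widetilde{\nabla}_{\lambda}\) the natural gradient. Different choices of the family \(q_\lambda\) and of the approximations used for the expectation recover the algorithms listed above.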
Reference:
The first part of the talk is based on our recent research at KAUST, while the second part is based on arxiv.org/abs/2107.04562 with Dr. Mohammad Emtiyaz Khan, RIKEN Center for AI Project, Tokyo.
Speaker:
https://www.kaust.edu.sa/en/study/faculty/haavard-rue