A Stein variational Newton method

Bibliographic Details
Main Authors: Marzouk, Youssef M, Spantini, Alessio
Other Authors: Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Format: Article
Language: English
Published: Curran Associates, Inc. 2020
Online Access: https://hdl.handle.net/1721.1/126535
Description
Summary: Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback-Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent on a reproducing kernel Hilbert space. In this paper, we accelerate and generalize the SVGD algorithm by including second-order information, thereby approximating a Newton-like iteration in function space. We also show how second-order information can lead to more effective choices of kernel. We observe significant computational gains over the original SVGD algorithm in multiple test cases.
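
Since the summary describes the underlying SVGD update only informally, the following is a minimal sketch of one first-order SVGD step with an RBF kernel and a fixed bandwidth. The function and parameter names here (svgd_step, grad_log_p, h, eps) are illustrative assumptions, not the paper's own code, and the Newton-like second-order variant proposed in the paper is not shown.

```python
import numpy as np

def svgd_step(X, grad_log_p, h=1.0, eps=0.1):
    """Move n particles X (shape (n, d)) one step along the empirical
    RKHS functional-gradient direction that decreases KL(q || p)."""
    n = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / h)            # K[j, i] = k(x_j, x_i); symmetric
    scores = grad_log_p(X)               # (n, d): grad log p at each particle
    # Attractive term: (1/n) sum_j k(x_j, x_i) * grad log p(x_j)
    drift = K @ scores
    # Repulsive term: (1/n) sum_j grad_{x_j} k(x_j, x_i)
    #               = (2/h) sum_j (x_i - x_j) k(x_j, x_i) for the RBF kernel
    repulse = (2.0 / h) * (X * K.sum(axis=0)[:, None] - K @ X)
    return X + eps * (drift + repulse) / n

# Usage: drive particles toward a standard Gaussian target N(0, I),
# for which grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(100, 2))   # particles start far from the target
for _ in range(500):
    X = svgd_step(X, lambda X: -X)
print(X.mean(axis=0), X.std(axis=0))     # should approach roughly [0, 0] and [1, 1]
```

In practice the bandwidth h is often set adaptively (e.g., by a median heuristic over pairwise distances) rather than fixed; the paper's contribution is to replace this gradient-only update with a Newton-like step that also uses second-order (Hessian) information about log p, and to use that information in choosing the kernel.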