Spotlight Poster
Forward $\chi^2$ Divergence Based Variational Importance Sampling
Chengrui Li · Yule Wang · Weihan Li · Anqi Wu
Halle B
Spotlight presentation: Tue 7 May, 7:30–9:30 a.m. PDT
Abstract:
Maximizing the marginal log-likelihood is a crucial aspect of learning latent variable models, and variational inference (VI) is the commonly adopted method. However, VI can struggle to achieve a high marginal log-likelihood when the posterior distribution is complicated. In response to this limitation, we introduce a novel variational importance sampling (VIS) approach that directly estimates and maximizes the marginal log-likelihood. VIS leverages the optimal proposal distribution, obtained by minimizing the forward $\chi^2$ divergence, to enhance marginal log-likelihood estimation. We apply VIS to various popular latent variable models, including mixture models, variational auto-encoders, and partially observable generalized linear models. Results demonstrate that our approach consistently outperforms state-of-the-art baselines in terms of both log-likelihood and model parameter estimation. Code: https://github.com/JerrySoybean/vis
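The abstract describes two alternating steps: fit a proposal by minimizing the forward $\chi^2$ divergence to the posterior (equivalently, the second moment of the importance weights), then maximize an importance-sampling estimate of $\log p_\theta(x)$. The sketch below illustrates this loop on a hypothetical 1-D toy model ($z \sim \mathcal{N}(0,1)$, $x \mid z \sim \mathcal{N}(\theta z, 1)$) with a single Gaussian proposal rather than an amortized $q_\phi(z \mid x)$; it is a simplified illustration under these assumptions, not the authors' implementation (see the linked repository for that).

```python
# Minimal VIS-style sketch on a toy latent variable model (illustrative only).
import math
import torch
import torch.distributions as D

torch.manual_seed(0)

# Toy generative model: z ~ N(0, 1), x | z ~ N(theta * z, 1).
true_theta = 1.5
x = true_theta * torch.randn(512) + torch.randn(512)   # observed data

theta = torch.tensor(0.5, requires_grad=True)          # generative parameter
phi = torch.tensor([0.0, 0.0], requires_grad=True)     # proposal mean, log-std
opt_theta = torch.optim.Adam([theta], lr=1e-2)
opt_phi = torch.optim.Adam([phi], lr=1e-2)
K = 64                                                 # importance samples

def log_joint(x, z):
    # log p_theta(x, z) = log N(z; 0, 1) + log N(x; theta * z, 1)
    return D.Normal(0.0, 1.0).log_prob(z) + D.Normal(theta * z, 1.0).log_prob(x)

for step in range(2000):
    # (1) Proposal step: minimize a forward-chi^2 surrogate over phi,
    #     log E_q[w^2] ~= logsumexp(2 * log w) - log K, with w = p(x,z)/q(z).
    q = D.Normal(phi[0], phi[1].exp())
    z = q.rsample((K, x.shape[0]))                     # reparameterized samples
    log_w = log_joint(x, z) - q.log_prob(z)
    chi2 = (torch.logsumexp(2 * log_w, dim=0) - math.log(K)).mean()
    opt_phi.zero_grad(); chi2.backward(); opt_phi.step()

    # (2) Model step: maximize the importance-sampling estimate of the
    #     marginal log-likelihood, log p(x) ~= logsumexp(log w) - log K,
    #     using the freshly updated (detached) proposal.
    q = D.Normal(phi[0].detach(), phi[1].detach().exp())
    z = q.sample((K, x.shape[0]))
    log_w = log_joint(x, z) - q.log_prob(z)
    log_px = (torch.logsumexp(log_w, dim=0) - math.log(K)).mean()
    opt_theta.zero_grad(); (-log_px).backward(); opt_theta.step()

print(f"estimated theta: {theta.item():.3f} (true: {true_theta})")
```

Minimizing the second moment of the importance weights is the standard route to the forward $\chi^2$-optimal proposal, since $\chi^2(p \,\|\, q) = \mathbb{E}_q[(p/q)^2] - 1$ is minimized when $q$ matches the posterior; the log-second-moment surrogate used above is one common, numerically stable choice.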