STA 290 Seminar: Yian Ma

Event Date
Thursday April 30th, 4:10pm
Location
Remotely Presented via Zoom

Speaker: Yian Ma, Google Research / UC San Diego

Title: Bridging MCMC and Optimization

Abstract: In this talk, I will discuss three ingredients of optimization theory in the context of MCMC: Non-convexity, Acceleration, and Stochasticity.

I will focus on a class of non-convex objective functions arising from mixture models. For that class of objective functions, I will demonstrate that the computational complexity of a simple MCMC algorithm scales linearly with the model dimension, while the corresponding optimization problems are NP-hard.
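As an illustration of the kind of "simple MCMC algorithm" that can handle a non-convex, multi-modal target, here is a minimal random-walk Metropolis-Hastings sketch on a two-component Gaussian mixture. The target, proposal scale, and step counts are illustrative assumptions, not the specific construction from the talk:

```python
import numpy as np

def rw_metropolis(log_p, x0, n_steps=20000, prop_std=2.0, seed=0):
    """Random-walk Metropolis-Hastings: a simple MCMC algorithm that
    needs only pointwise log-density evaluations of the target."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    lp = log_p(x)
    samples = np.empty(n_steps)
    for t in range(n_steps):
        prop = x + prop_std * rng.standard_normal()  # symmetric proposal
        lp_prop = log_p(prop)
        if np.log(rng.uniform()) < lp_prop - lp:     # accept/reject step
            x, lp = prop, lp_prop
        samples[t] = x
    return samples

# Hypothetical non-convex target: equal mixture of N(-2, 1) and N(2, 1).
def log_mix(x):
    return np.logaddexp(-0.5 * (x + 2) ** 2, -0.5 * (x - 2) ** 2)

samples = rw_metropolis(log_mix, x0=-2.0)
```

Although the log-density is non-convex, the chain started in one mode still spends substantial time in both modes, which is the qualitative behavior behind the sampling-versus-optimization contrast.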

I will then study MCMC algorithms as optimization over the KL-divergence in the space of measures. By incorporating a momentum variable, I will discuss an algorithm that performs "accelerated gradient descent" over the KL-divergence. Using optimization-like ideas, I will construct a suitable Lyapunov function to prove that the algorithm achieves an accelerated convergence rate.
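One standard way to add a momentum variable to a sampler is underdamped Langevin dynamics. The sketch below is a crude Euler discretization for illustration only, with assumed step size and friction parameters; it is not the specific accelerated algorithm or discretization analyzed in the talk:

```python
import numpy as np

def underdamped_langevin(grad_log_p, x0, n_steps=20000, step=0.01,
                         gamma=2.0, seed=0):
    """Momentum-augmented Langevin sampler (Euler-Maruyama scheme).

    The position x is coupled to a momentum v; friction gamma and
    matched Gaussian noise keep the target distribution invariant
    in the continuous-time limit.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        v += step * (grad_log_p(x) - gamma * v)      # drift on momentum
        v += np.sqrt(2.0 * gamma * step) * rng.standard_normal(x.shape)
        x += step * v                                # position update
        samples[t] = x
    return samples

# Sanity check on a standard Gaussian target, where grad log p(x) = -x.
samples = underdamped_langevin(lambda x: -x, x0=[3.0])
```

After a burn-in period, the empirical mean and variance of the chain should be close to those of the standard Gaussian target.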

Finally, I will present a general recipe for constructing stochastic gradient MCMC algorithms that translates the task of finding a valid sampler into one of choosing two matrices. I will then describe how stochastic gradient MCMC algorithms can be applied to problems involving sequential decision making, where the challenge arises from the ever-increasing size of datasets and accuracy requirements under a constant computation budget.
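The simplest member of the stochastic gradient MCMC family is stochastic gradient Langevin dynamics (SGLD), sketched below. The model, step size, and batch size are illustrative assumptions; the general two-matrix recipe from the talk covers a much broader class of samplers:

```python
import numpy as np

def sgld(grad_log_prior, grad_log_lik, data, x0, n_steps=5000,
         step=1e-3, batch_size=10, seed=0):
    """Stochastic gradient Langevin dynamics: each step uses a minibatch
    gradient estimate of the log-posterior plus Gaussian noise scaled to
    the step size, so per-iteration cost stays constant as the dataset
    grows."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    N = len(data)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        idx = rng.choice(N, size=batch_size, replace=False)
        # Rescale the minibatch likelihood gradient by N / batch_size.
        grad = grad_log_prior(x) + (N / batch_size) * sum(
            grad_log_lik(x, data[i]) for i in idx)
        x += 0.5 * step * grad + np.sqrt(step) * rng.standard_normal(x.shape)
        samples[t] = x
    return samples

# Toy example: posterior mean of a Gaussian with known unit variance
# under a broad N(0, 10^2) prior (all choices hypothetical).
rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=100)
samples = sgld(lambda x: -x / 100.0,   # gradient of the N(0, 10^2) prior
               lambda x, d: d - x,     # per-datum Gaussian likelihood gradient
               data, x0=[0.0])
```

In this conjugate toy model, the posterior mean is essentially the sample mean of the data, so the chain's long-run average should land near it.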


Seminar Date/Time: Thursday April 30th, 4:10pm

This seminar will be delivered remotely via Zoom. To access the Zoom meeting for this seminar, please contact the instructor Shizhe Chen (szdchen@ucdavis.edu) or Pete Scully (pscully@ucdavis.edu) for the meeting ID and password, stating your affiliation.