Statistics Seminar: Spencer Frei



SPEAKER: Spencer Frei, Post-Doctoral Fellow, UC Berkeley

TITLE: "Statistical and computational phenomena in deep learning"

ABSTRACT: Deep learning's success has revealed a number of phenomena that appear to conflict with classical intuitions in the fields of optimization and statistics. First, the objective functions formulated in deep learning are highly nonconvex but are typically amenable to minimization with first-order optimization methods like gradient descent. Second, neural networks trained by gradient descent are capable of 'benign overfitting': they can achieve zero training error on noisy training data and simultaneously generalize well to unseen data. In this talk we describe our recent work towards understanding these phenomena. We show how the framework of proxy convexity allows for tractable optimization analysis despite nonconvexity, while the implicit regularization of gradient descent plays a key role in benign overfitting. In closing, we discuss some of the questions that motivate our current work on understanding deep learning, and how we may use our insights to make deep learning more trustworthy, efficient, and powerful.
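The benign overfitting phenomenon mentioned in the abstract can be seen even outside neural networks. Below is a minimal NumPy sketch (illustrative only; the setup and numbers are not from the talk) of a classical overparameterized linear regression example: the signal lives in a few strong covariance directions, a long tail of weak directions absorbs the label noise, and the minimum-norm interpolator fits the noisy training labels exactly while still predicting well on fresh data.

```python
# Illustrative sketch of benign overfitting in overparameterized linear
# regression (assumed example, not from the speaker's work as presented here).
import numpy as np

rng = np.random.default_rng(0)

n, k, d_tail = 100, 5, 2000          # samples, signal dims, weak "tail" dims
d = k + d_tail
eigs = np.concatenate([np.ones(k), 0.01 * np.ones(d_tail)])  # covariance spectrum

def sample(m):
    # Gaussian features with the spectrum above; labels depend only on the
    # first k (strong) coordinates, plus label noise.
    X = rng.standard_normal((m, d)) * np.sqrt(eigs)
    y = X[:, :k].sum(axis=1) + 0.5 * rng.standard_normal(m)
    return X, y

X, y = sample(n)          # noisy training set
Xt, yt = sample(2000)     # fresh test set

# Minimum-norm interpolator: with d >> n it fits the training labels exactly.
w = np.linalg.pinv(X) @ y

train_mse = np.mean((X @ w - y) ** 2)
test_mse = np.mean((Xt @ w - yt) ** 2)
null_mse = np.mean(yt ** 2)          # baseline: always predict 0

print(f"train MSE {train_mse:.2e}, test MSE {test_mse:.2f}, null MSE {null_mse:.2f}")
```

The training error is zero up to numerical precision even though the labels are noisy, yet the test error stays far below the trivial baseline: the noise is absorbed by the many weak directions, whose contribution to test risk is small.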

BIO: Spencer Frei is a postdoctoral fellow at UC Berkeley, working with Peter Bartlett and Bin Yu as part of the NSF/Simons Collaboration on the Theoretical Foundations of Deep Learning. He is interested in understanding statistical and computational phenomena observed in deep learning. He was named a Rising Star in Machine Learning by the University of Maryland in 2022, and was a co-organizer of the 2022 Deep Learning Theory Workshop and Summer School at the Simons Institute for the Theory of Computing. He received his Ph.D. in Statistics from UCLA in 2021 under the co-supervision of Quanquan Gu and Ying Nian Wu.

SEMINAR TIME/DATE: Monday, January 9, 11:00am

LOCATION: MSB 1147
