Speaker: Mikhail Belkin, Professor, Halicioğlu Data Science Institute, UC San Diego
Title: The puzzle of dimensionality and feature learning in neural networks and kernel machines
Abstract: Remarkable progress in AI has far surpassed the expectations of just a few years ago. At their core, modern models, such as transformers, implement traditional statistical models -- high-order Markov chains. Nevertheless, it is not generally possible to estimate Markov models of that order given any feasible amount of data. Therefore, these methods must implicitly exploit low-dimensional structures present in the data. Furthermore, these structures must be reflected in the high-dimensional internal parameter spaces of the models. Thus, to build a fundamental understanding of modern AI, it is necessary to identify and analyze these latent low-dimensional structures. In this talk, Prof. Belkin will discuss how deep neural networks of various architectures learn low-dimensional features, and how the lessons of deep learning can be incorporated in non-backpropagation-based algorithms called Recursive Feature Machines. Prof. Belkin will present a number of experimental results on different types of data, as well as some connections to classical sparse learning methods, such as Iteratively Reweighted Least Squares.
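For context on the classical method the abstract mentions: Iteratively Reweighted Least Squares (IRLS) finds sparse solutions to underdetermined linear systems by repeatedly solving a weighted least-squares problem, reweighting by the current coefficient magnitudes. The sketch below is a minimal illustration of this standard technique, not of the speaker's Recursive Feature Machines; the function name and parameters are illustrative.

```python
import numpy as np

def irls_sparse(X, y, n_iter=50, eps=1e-6):
    """Approximate the minimum-L1-norm solution of X w = y via IRLS.

    Each iteration solves a weighted minimum-norm least-squares problem,
    with weights proportional to the current coefficient magnitudes, which
    progressively concentrates mass on a sparse support.
    """
    n, d = X.shape
    w = np.ones(d)  # initial coefficient estimate
    for _ in range(n_iter):
        # Reweighting matrix; eps keeps small coefficients from freezing at zero
        W = np.diag(np.abs(w) + eps)
        # Weighted minimum-norm solution: w = W X^T (X W X^T)^{-1} y
        w = W @ X.T @ np.linalg.solve(X @ W @ X.T, y)
    return w
```

With, say, 30 random Gaussian measurements of a 100-dimensional signal that has only a few nonzero entries, iterations of this form typically recover the sparse signal.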
Speaker's site (links to UC San Diego): https://datascience.ucsd.edu/people/mikhail-belkin/
Date: Thursday, May 2, 2024, at 4:10pm
Location: REMOTE via Zoom (please contact pscully@ucdavis.edu for link)
Math Event Link: https://www.math.ucdavis.edu/research/seminars?talk_id=7142