The mathematics of neural networks: recent advances, thoughts, and the path forward (Prof. Mikhail Belkin, UCSD)
February 8 @ 4:15 pm - 5:30 pm
Title: The mathematics of neural networks: recent advances, thoughts, and the path forward
Speaker: Prof. Mikhail Belkin, Department of Mathematics, University of California San Diego
Abstract: The recent remarkable practical achievements of neural networks have far outpaced our theoretical understanding of their properties. Yet it is hard to imagine that progress can continue indefinitely without a deeper understanding of their fundamental principles and limitations. In this talk I will discuss some recent advances in the mathematics of neural networks and outline what, in my opinion, are some promising directions for future research.
Mikhail Belkin received his Ph.D. in 2003 from the Department of Mathematics at the University of Chicago. His research interests are in the theory and applications of machine learning and data analysis. Some of his best-known work includes the widely used Laplacian Eigenmaps, Graph Regularization, and Manifold Regularization algorithms, which brought ideas from classical differential geometry and spectral analysis to data science. His recent work has been concerned with understanding the remarkable mathematical and statistical phenomena observed in deep learning; this empirical evidence has necessitated revisiting some of the basic concepts of statistics and optimization. One of his key recent findings is the “double descent” risk curve, which extends the textbook U-shaped bias-variance trade-off curve beyond the interpolation point. Mikhail Belkin has served on the editorial boards of the Journal of Machine Learning Research, IEEE Transactions on Pattern Analysis and Machine Intelligence, and the SIAM Journal on Mathematics of Data Science.