[Jul 23, 2018] On stochastic gradient descent, flatness and generalization – Seminar by Prof. Yoshua Bengio

Prof. Yoshua Bengio (University of Montreal, Department of Computer Science and Operations Research (DIRO))

Jul 23, 2018 – 11:00 AM
DIISM, Artificial Intelligence laboratory (room 201), Siena SI
Description

The traditional machine learning picture is that optimization and generalization are neatly separated aspects. That separation makes the theory easier to handle, but unfortunately it does not hold in practice: Stochastic Gradient Descent (SGD) variants influence both optimization AND generalization. We analyze the effects on memorization in deep neural networks, the relevance of loss-function geometry for generalization, and the factors influencing which minima SGD finds.
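For readers unfamiliar with the basic mechanism the talk builds on, the following is a minimal sketch of mini-batch SGD on a toy least-squares problem. The model, data, batch size and learning rate are illustrative assumptions for the sketch, not anything specific to the talk; the point is only that gradients are estimated on small random batches, which injects the noise that interacts with the loss landscape discussed in the seminar.

```python
# Illustrative only: a minimal mini-batch SGD loop on a toy least-squares problem.
# All quantities (toy data, learning rate, batch size) are assumptions for this sketch.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = X @ w_true + noise
n, d = 1000, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)      # parameters
lr = 0.05            # learning rate (step size)
batch_size = 32      # smaller batches give noisier gradient estimates

for epoch in range(20):
    perm = rng.permutation(n)
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Gradient of the mean squared error on the current mini-batch
        grad = 2.0 / len(idx) * Xb.T @ (Xb @ w - yb)
        w -= lr * grad   # stochastic gradient step

print("parameter error:", np.linalg.norm(w - w_true))
```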

Bio

Yoshua Bengio received a PhD in Computer Science from McGill University, Canada, in 1991. After two post-doctoral years, one at M.I.T. with Michael Jordan and one at AT&T Bell Laboratories with Yann LeCun and Vladimir Vapnik, he became professor at the Department of Computer Science and Operations Research at Université de Montréal. He is the author of two books and more than 200 publications, the most cited being in the areas of deep learning, recurrent neural networks, probabilistic learning algorithms, natural language processing and manifold learning. He is among the most cited Canadian computer scientists and is or has been associate editor of the top journals in machine learning and neural networks. Since 2000 he has held a Canada Research Chair in Statistical Learning Algorithms, since 2006 an NSERC Industrial Chair; since 2005 he has been a Senior Fellow of the Canadian Institute for Advanced Research, and since 2014 he has co-directed its program focused on deep learning. He is on the board of the NIPS foundation and has been program chair and general chair for NIPS. He has co-organized the Learning Workshop for 14 years and co-created the International Conference on Learning Representations. His current interests are centered around a quest for AI through machine learning, and include fundamental questions on deep learning and representation learning, the geometry of generalization in high-dimensional spaces, manifold learning, biologically inspired learning algorithms, and challenging applications of statistical machine learning.
