[SFdS] Information from the BFA group
WG Risk - Monday, June 17, 2019 - Dr. Francis BACH

Dear All,

We have the pleasure, thanks to the support of the ESSEC IDS Department, Institut des Actuaires, LabEx MME-DII, and the BFA group (SFdS), to invite:

Dr. Francis BACH
INRIA

who will speak on

On the global convergence of gradient descent
for non-convex machine learning problems


Date and place: Monday, June 17, 2019, at 12:30 pm, EEE - ESSEC La Défense, room 202, and at 6:30 pm, ESSEC Asia Pacific, Level 3, classroom 7

Abstract: Many tasks in machine learning and signal processing can be solved by minimizing a convex function of a measure. This includes sparse spikes deconvolution or training a neural network with a single hidden layer. For these problems, we study a simple minimization method: the unknown measure is discretized into a mixture of particles and a continuous-time gradient descent is performed on their weights and positions. This is an idealization of the usual way to train neural networks with a large hidden layer. We show that, when initialized correctly and in the many-particle limit, this gradient flow, although non-convex, converges to global minimizers. The proof involves Wasserstein gradient flows, a by-product of optimal transport theory. Numerical experiments show that this asymptotic behavior is already at play for a reasonable number of particles, even in high dimension. (Joint work with Lénaïc Chizat)
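
For those who would like to experiment with the setting before the talk, the following is a minimal sketch of the kind of particle scheme the abstract describes. It trains a single-hidden-layer ReLU network by gradient descent on both the weights and the positions of the particles, replacing the continuous-time flow with discrete Euler steps. The toy teacher data, the sizes, and the step size are illustrative choices of ours, not the speaker's implementation.

import numpy as np

rng = np.random.default_rng(0)

# Toy data from a small "teacher" network (an illustrative stand-in).
n, d, m_teacher = 200, 5, 3
X = rng.standard_normal((n, d))
theta_star = rng.standard_normal((m_teacher, d))
w_star = rng.standard_normal(m_teacher)
y = np.maximum(X @ theta_star.T, 0) @ w_star / m_teacher

# Particle discretization: the unknown measure is a mixture of m particles
# (w_j, theta_j); the network output is their mean-field average.
m = 500
theta = rng.standard_normal((m, d))
w = rng.standard_normal(m)

# Discrete Euler steps in place of the continuous-time gradient flow,
# on the squared loss L = (1/2n) sum_i (f(x_i) - y_i)^2 with
# f(x) = (1/m) sum_j w_j relu(theta_j . x).
lr = 0.1
for step in range(5001):
    h = np.maximum(X @ theta.T, 0)            # hidden activations, (n, m)
    r = h @ w / m - y                         # residuals, (n,)
    grad_w = h.T @ r / (n * m)                # dL/dw_j
    mask = (X @ theta.T > 0).astype(float)    # ReLU subgradient
    grad_theta = ((mask * r[:, None]).T @ X) * (w[:, None] / (n * m))
    # The factor m gives the mean-field scaling: each particle moves at a
    # speed that does not vanish as the number of particles grows.
    w -= lr * m * grad_w
    theta -= lr * m * grad_theta
    if step % 1000 == 0:
        print(step, 0.5 * np.mean(r ** 2))

In this scaling, taking m large approximates the many-particle limit discussed in the abstract, where the (non-convex) flow on particles becomes a Wasserstein gradient flow on measures.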

Kind regards,
Jeremy Heng, Olga Klopp, Marie Kratz, Isabelle Wattiau
http://crear.essec.edu/working-group-on-risk
SFdS - Société Française de Statistique