I am a first-year PhD student at Mila and Université de Montréal (DIRO), under the supervision of Gauthier Gidel. Before my PhD, I studied Mathematics and Physics at the École Normale Supérieure, Paris, and graduated from the Master Mathématiques de l'aléatoire at Orsay.

Research


My main interest is high-dimensional optimization, with a focus on stochastic gradient descent on simplified problems. The long-term goal is to better understand the training dynamics of neural networks. My previous work has mainly focused on loss landscape analysis, the lottery ticket hypothesis, and linear mode connectivity, as well as generative models and alignment with human preferences.

Papers


Self-Consuming Generative Models with Curated Data Provably Optimize Human Preferences

Damien Ferbach, Quentin Bertrand, Joey Bose, Gauthier Gidel

NeurIPS 2024 (spotlight).

A theoretical study of generative models trained on synthetic data that is curated according to human preferences.

Proving Linear Mode Connectivity of Neural Networks via Optimal Transport

Damien Ferbach, Baptiste Goujaud, Gauthier Gidel, Aymeric Dieuleveut

AISTATS 2024.

We show that wide two-layer networks trained with SGD, as well as multi-layer networks with i.i.d. weights, can be linked in parameter space by low-loss paths, modulo a permutation of the neurons.

A General Framework for Proving the Equivariant Strong Lottery Ticket Hypothesis

Damien Ferbach*, Christos Tsirigotis*, Gauthier Gidel, Joey Bose

ICLR 2023.

We study the existence of sparse subnetworks within overparametrized equivariant networks that can approximate any smaller equivariant network.