The Dynamics of Learning: A Random Matrix Approach

Zhenyu Liao 1, Romain Couillet 1,2
2 GIPSA-CICS - GIPSA-DIS (Images and Signal Department)
Abstract: Understanding the learning dynamics of neural networks is one of the key issues for improving optimization algorithms as well as for the theoretical understanding of why deep neural nets work so well today. In this paper, we introduce a random matrix-based framework to analyze the learning dynamics of a single-layer linear network on a binary classification problem, for data of simultaneously large dimension and size, trained by gradient descent. Our results provide rich insights into common questions in neural nets, such as overfitting, early stopping and the initialization of training, thereby opening the door for future studies of more elaborate structures and models appearing in today's neural networks.
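The setup described in the abstract (a single-layer linear network trained by gradient descent on a binary classification task, with dimension and sample size both large) can be illustrated with a short simulation. The sketch below is a minimal, hypothetical instance under assumed choices (a two-class Gaussian mixture, squared loss, and the stated dimensions, learning rate and step count); it is not the paper's exact model or experiment, but it exhibits the kind of training/test dynamics, and hence the overfitting and early-stopping questions, that the analysis addresses.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, n_test = 256, 512, 2048      # assumed: dimension and sample size both large

mu = np.zeros(p)
mu[0] = 2.0                        # assumed class-mean separation

def sample(m):
    """Two-class Gaussian mixture: x = y * mu + standard Gaussian noise."""
    y = rng.choice([-1.0, 1.0], size=m)
    X = rng.standard_normal((m, p)) + np.outer(y, mu)
    return X, y

X, y = sample(n)
X_te, y_te = sample(n_test)

w = 0.01 * rng.standard_normal(p)  # small random initialization (assumed scale)
lr, steps = 0.1, 200               # assumed learning rate and training horizon

for t in range(steps + 1):
    if t % 20 == 0:
        err_tr = np.mean(np.sign(X @ w) != y)
        err_te = np.mean(np.sign(X_te @ w) != y_te)
        print(f"step {t:3d}  train error {err_tr:.3f}  test error {err_te:.3f}")
    grad = X.T @ (X @ w - y) / n   # gradient of the squared loss (1/2n)||Xw - y||^2
    w -= lr * grad
```

Tracking the train and test errors along the trajectory shows the gap that motivates early stopping in this high-dimensional regime.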
Document type: Conference paper

Cited literature: 27 references

https://hal.archives-ouvertes.fr/hal-01957704
Contributor: Romain Couillet
Submitted on: Monday, December 17, 2018 - 3:06:37 PM
Last modification on: Wednesday, March 13, 2019 - 1:54:20 PM
Long-term archiving on: Monday, March 18, 2019 - 3:14:55 PM

File: liao18b.pdf (publisher files allowed on an open archive)

Identifiers

  • HAL Id: hal-01957704, version 1

Citation

Zhenyu Liao, Romain Couillet. The Dynamics of Learning: A Random Matrix Approach. International Conference on Machine Learning (ICML 2018), Jul 2018, Stockholm, Sweden. ⟨hal-01957704⟩
