---
title: On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting
abstract: Deep learning empirically achieves high performance in many applications, but its training dynamics have not been fully understood theoretically. In this paper, we present a theoretical analysis of training two-layer ReLU neural networks in a teacher-student regression model, in which a student network learns an unknown teacher network through its outputs. We show that, with a specific regularization and sufficient over-parameterization, the student network can identify the parameters of the teacher network with high probability via gradient descent with a norm-dependent stepsize, even though the objective function is highly non-convex. The key theoretical tools are the measure representation of the neural networks and a novel application of a dual certificate argument for sparse estimation on a measure space. We analyze the global minima and the global convergence property in the measure space.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: akiyama21a
month: 0
tex_title: On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting
firstpage: 152
lastpage: 162
page: 152-162
order: 152
cycles: false
bibtex_author: Akiyama, Shunta and Suzuki, Taiji
author:
- given: Shunta
  family: Akiyama
- given: Taiji
  family: Suzuki
date: 2021-07-01
address:
container-title: Proceedings of the 38th International Conference on Machine Learning
volume: 139
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 1
extras: []
---
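
The abstract above describes learning a fixed teacher two-layer ReLU network from its outputs alone. Below is a minimal, illustrative NumPy sketch of that teacher-student regression setup. It is not the paper's method: it uses plain gradient descent with a fixed stepsize and no regularization (rather than the norm-dependent stepsize and sparse regularization the paper analyzes), and all dimensions, widths, and hyperparameters are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): input dim, teacher width, student width, samples.
d, m_teacher, m_student, n = 5, 3, 50, 1000

def relu(z):
    return np.maximum(z, 0.0)

def forward(W, a, X):
    """Two-layer ReLU network: f(x) = sum_j a_j * relu(<w_j, x>)."""
    return relu(X @ W.T) @ a

# Fixed, unknown teacher network; the student only observes its outputs.
W_star = rng.normal(size=(m_teacher, d))
a_star = rng.normal(size=m_teacher)

X = rng.normal(size=(n, d))
y = forward(W_star, a_star, X)  # noiseless teacher labels

# Over-parameterized student (m_student >> m_teacher) trained by plain
# gradient descent on the empirical squared loss.
W = 0.1 * rng.normal(size=(m_student, d))
a = 0.1 * rng.normal(size=m_student)
lr = 0.05

for step in range(5000):
    H = X @ W.T                  # pre-activations, shape (n, m_student)
    r = relu(H) @ a - y          # residuals, shape (n,)
    grad_a = relu(H).T @ r / n
    grad_W = ((H > 0) * r[:, None] * a[None, :]).T @ X / n
    a -= lr * grad_a
    W -= lr * grad_W

print("final training MSE:", np.mean((forward(W, a, X) - y) ** 2))
```

This sketch only demonstrates the problem setting; the paper's analysis works in the space of measures over neurons and establishes parameter identification of the teacher under its specific stepsize and regularization, which the code above does not implement.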