2021-07-01-akiyama21a.md

---
title: On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting
abstract: Deep learning empirically achieves high performance in many applications,
  but its training dynamics have not been fully understood theoretically. In this
  paper, we give a theoretical analysis of training two-layer ReLU neural networks
  in a teacher-student regression model, in which a student network learns an
  unknown teacher network through its outputs. We show that, with a specific
  regularization and sufficient over-parameterization, the student network can
  identify the parameters of the teacher network with high probability via
  gradient descent with a norm-dependent stepsize, even though the objective
  function is highly non-convex. The key theoretical tools are the measure
  representation of the neural networks and a novel application of a dual
  certificate argument for sparse estimation on a measure space. We analyze the
  global minima and the global convergence property in the measure space.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: akiyama21a
month: 0
tex_title: On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting
firstpage: 152
lastpage: 162
page: 152-162
order: 152
cycles: false
bibtex_author: Akiyama, Shunta and Suzuki, Taiji
author:
- given: Shunta
  family: Akiyama
- given: Taiji
  family: Suzuki
date: 2021-07-01
address:
container-title: Proceedings of the 38th International Conference on Machine Learning
volume: 139
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 1
pdf:
extras:
---
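
For readers skimming this entry, a minimal sketch of the training setup the abstract describes may help: an over-parameterized two-layer ReLU student fit to a fixed teacher network by regularized gradient descent with a norm-dependent stepsize. Every specific below (dimensions, the regularizer, the exact stepsize rule) is an illustrative assumption for the sketch, not the paper's construction.

```python
# Illustrative teacher-student setup: an over-parameterized two-layer ReLU
# student trained by gradient descent to fit a fixed teacher's outputs.
# Dimensions, the L2 penalty, and the stepsize rule below are assumptions
# for illustration; the paper's regularizer and stepsize differ.
import numpy as np

rng = np.random.default_rng(0)

d, m_teacher, m_student = 5, 3, 50    # input dim, teacher width, student width
n, lr, lam, steps = 512, 0.05, 1e-3, 2000

def relu(z):
    return np.maximum(z, 0.0)

# Fixed (unknown to the student) teacher: f*(x) = sum_j b*_j * relu(w*_j . x)
W_t = rng.normal(size=(m_teacher, d))
b_t = rng.normal(size=m_teacher)

X = rng.normal(size=(n, d))
y = relu(X @ W_t.T) @ b_t             # teacher outputs (noiseless regression)

# Over-parameterized student, randomly initialized
W = rng.normal(size=(m_student, d)) / np.sqrt(d)
b = rng.normal(size=m_student) / np.sqrt(m_student)

for t in range(steps):
    H = relu(X @ W.T)                 # hidden activations, shape (n, m_student)
    resid = H @ b - y                 # residuals, shape (n,)
    # Gradients of the L2-regularized squared loss (lam stands in for the
    # paper's "specific regularization", which is more delicate).
    grad_b = H.T @ resid / n + lam * b
    grad_W = ((resid[:, None] * (H > 0)) * b[None, :]).T @ X / n + lam * W
    # Toy norm-dependent stepsize: shrink the step as the parameters grow.
    step = lr / (1.0 + np.linalg.norm(W) + np.linalg.norm(b))
    W -= step * grad_W
    b -= step * grad_b

print("final training MSE:", np.mean((relu(X @ W.T) @ b - y) ** 2))
```

The non-convexity the abstract refers to is visible here: the loss is non-convex in `W`, yet the paper's result says that, under its regularization and over-parameterization conditions, such gradient dynamics recover the teacher's parameters with high probability.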