DNN Simple Experiments

This repository contains a series of simple Deep Neural Network (DNN) experiments designed to help you understand and master the basic concepts of deep learning. It also verifies two important phenomena: the Frequency Principle and the Condensation Phenomenon.

Frequency Principle

DNNs often fit target functions from low to high frequencies.

The first figure shows the evolution of the fit in the spatial domain: the red line is the target function and the blue line is the DNN output (ordinate: y, abscissa: x).

The second figure shows the evolution in the Fourier domain: the red line is the FFT of the target function and the blue line is the FFT of the DNN output (ordinate: amplitude, abscissa: frequency).

(Figures: value, the spatial-domain evolution; freq, the Fourier-domain evolution.)

  1. DNN 1D

  2. DNN nD
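As a rough illustration of the 1D experiment described above, the sketch below trains a small network on a 1D target and compares the FFT amplitude of the DNN output against that of the target at a few checkpoints. This is hypothetical code, not taken from this repository; the framework (PyTorch/NumPy), network size, target function, and hyperparameters are all illustrative.

```python
# Hypothetical minimal sketch of a 1D Frequency Principle experiment.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)

# 1D target with a low-frequency and a high-frequency component.
x = torch.linspace(-1.0, 1.0, 201).unsqueeze(1)
y = torch.sin(np.pi * x) + 0.5 * torch.sin(5.0 * np.pi * x)

model = nn.Sequential(nn.Linear(1, 200), nn.Tanh(), nn.Linear(200, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def amplitude_spectrum(v):
    """Amplitude of the real FFT of a 1D signal, as a NumPy array."""
    return np.abs(np.fft.rfft(v.detach().numpy().ravel()))

target_spec = amplitude_spectrum(y)
for step in range(1, 5001):
    optimizer.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    optimizer.step()
    if step % 1000 == 0:
        output_spec = amplitude_spectrum(model(x))
        # Relative error per frequency bin: under the Frequency Principle,
        # low-frequency bins typically converge before high-frequency ones.
        rel_err = np.abs(output_spec - target_spec) / (target_spec + 1e-8)
        print(f"step {step}  loss {loss.item():.4f}  "
              f"rel. spectral error (first 8 bins): {np.round(rel_err[:8], 3)}")
```

Plotting the output spectrum against the target spectrum at each checkpoint reproduces the second figure: the low-frequency peaks are captured early, while the high-frequency peak is fitted only later in training.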

Related papers

[1] Zhi-Qin John Xu*, Yaoyu Zhang, and Yanyang Xiao, Training behavior of deep neural network in frequency domain, arXiv:1807.01251 (2018); 26th International Conference on Neural Information Processing (ICONIP 2019).

[2] Zhi-Qin John Xu*, Yaoyu Zhang, Tao Luo, Yanyang Xiao, and Zheng Ma, Frequency Principle: Fourier Analysis Sheds Light on Deep Neural Networks, arXiv:1901.06523; Communications in Computational Physics (CiCP). Some related code is available on GitHub.

[3] Zhi-Qin John Xu*, Yaoyu Zhang, and Tao Luo, Overview frequency principle/spectral bias in deep learning, arXiv:2201.07395 (2022).

For more details, refer to Zhi-Qin John Xu's homepage.

Condensation

The picture above (condense) illustrates the condensation phenomenon: during the initial stage of training with small initialization, the input weights of different hidden neurons condense toward a few isolated directions.
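Below is a minimal sketch of how such a condensation experiment might look: a two-layer ReLU network with a small initialization scale is trained on a 1D regression task, and the pairwise cosine similarity between the hidden neurons' input weight vectors (including the bias) is monitored; values near +1 or -1 indicate neurons sharing a direction. This code is hypothetical and not from this repository; it assumes PyTorch, and the width, initialization scale, and learning rate are illustrative.

```python
# Hypothetical minimal sketch of a condensation experiment.
import torch
import torch.nn as nn

torch.manual_seed(0)

x = torch.linspace(-1.0, 1.0, 101).unsqueeze(1)
y = torch.sin(3.0 * x)

width = 100
model = nn.Sequential(nn.Linear(1, width), nn.ReLU(), nn.Linear(width, 1))
with torch.no_grad():
    for p in model.parameters():
        p.mul_(0.01)  # small initialization scale, a regime where condensation is expected

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def orientation_similarity(layer):
    """Pairwise cosine similarity between hidden neurons' (weight, bias) vectors."""
    w = torch.cat([layer.weight, layer.bias.unsqueeze(1)], dim=1)
    w = w / (w.norm(dim=1, keepdim=True) + 1e-12)
    return w @ w.t()

for step in range(1, 5001):
    optimizer.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    optimizer.step()
    if step % 1000 == 0:
        sim = orientation_similarity(model[0])
        frac_aligned = (sim.abs() > 0.99).float().mean().item()
        print(f"step {step}  loss {loss.item():.5f}  "
              f"fraction of nearly aligned neuron pairs: {frac_aligned:.3f}")
```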

Related Papers

[1] Tao Luo#, Zhi-Qin John Xu#, Zheng Ma, and Yaoyu Zhang*, Phase diagram for two-layer ReLU neural networks at infinite-width limit, arXiv:2007.07497 (2020); Journal of Machine Learning Research (2021).

[2] Hanxu Zhou, Qixuan Zhou, Tao Luo, Yaoyu Zhang*, and Zhi-Qin John Xu*, Towards Understanding the Condensation of Neural Networks at Initial Training, arXiv:2105.11686 (2021); NeurIPS 2022. Slides and a video talk (in Chinese) are available.

For more details, see Zhi-Qin John Xu's page on condensation.

Contact Us

If you have any questions or suggestions, you can reach us via email at:

Zhi-Qin John Xu : [email protected]

Zhongwang Zhang : [email protected]

Zhangchen Zhou : [email protected]
