A Fine-Grained Spectral Perspective on Neural Networks
G. Yang, and H. Salman. (2019). arXiv:1907.10599. Comment: 12 pages of main text, 14 figures, 39 pages including appendix.
Abstract
Are neural networks biased toward simple functions? Does depth always help
learn more complex features? Is training the last layer of a network as good as
training all layers? These questions seem unrelated at face value, but in this
work we give all of them a common treatment from the spectral perspective. We
will study the spectra of the *Conjugate Kernel*, CK, (also called the *Neural
Network-Gaussian Process Kernel*), and the *Neural Tangent Kernel*, NTK.
Roughly, the CK and the NTK tell us respectively "what a network looks like at
initialization" and "what a network looks like during and after training." Their
spectra then encode valuable information about the initial distribution and the
training and generalization properties of neural networks. By analyzing the
eigenvalues, we lend novel insights into the questions put forth at the
beginning, and we verify these insights by extensive experiments of neural
networks. We believe the computational tools we develop here for analyzing the
spectra of CK and NTK serve as a solid foundation for future studies of deep
neural networks. We have open-sourced the code for it and for generating the
plots in this paper at github.com/thegregyang/NNspectra.
Description
[1907.10599] A Fine-Grained Spectral Perspective on Neural Networks
%0 Journal Article
%1 yang2019finegrained
%A Yang, Greg
%A Salman, Hadi
%D 2019
%K deep-learning gaussian-processes kernels
%T A Fine-Grained Spectral Perspective on Neural Networks
%U http://arxiv.org/abs/1907.10599
%X Are neural networks biased toward simple functions? Does depth always help
learn more complex features? Is training the last layer of a network as good as
training all layers? These questions seem unrelated at face value, but in this
work we give all of them a common treatment from the spectral perspective. We
will study the spectra of the *Conjugate Kernel*, CK, (also called the *Neural
Network-Gaussian Process Kernel*), and the *Neural Tangent Kernel*, NTK.
Roughly, the CK and the NTK tell us respectively "what a network looks like at
initialization" and "what a network looks like during and after training." Their
spectra then encode valuable information about the initial distribution and the
training and generalization properties of neural networks. By analyzing the
eigenvalues, we lend novel insights into the questions put forth at the
beginning, and we verify these insights by extensive experiments of neural
networks. We believe the computational tools we develop here for analyzing the
spectra of CK and NTK serve as a solid foundation for future studies of deep
neural networks. We have open-sourced the code for it and for generating the
plots in this paper at github.com/thegregyang/NNspectra.
@article{yang2019finegrained,
abstract = {Are neural networks biased toward simple functions? Does depth always help
learn more complex features? Is training the last layer of a network as good as
training all layers? These questions seem unrelated at face value, but in this
work we give all of them a common treatment from the spectral perspective. We
will study the spectra of the *Conjugate Kernel*, CK, (also called the *Neural
Network-Gaussian Process Kernel*), and the *Neural Tangent Kernel*, NTK.
Roughly, the CK and the NTK tell us respectively "what a network looks like at
initialization"and "what a network looks like during and after training." Their
spectra then encode valuable information about the initial distribution and the
training and generalization properties of neural networks. By analyzing the
eigenvalues, we lend novel insights into the questions put forth at the
beginning, and we verify these insights by extensive experiments of neural
networks. We believe the computational tools we develop here for analyzing the
spectra of CK and NTK serve as a solid foundation for future studies of deep
neural networks. We have open-sourced the code for it and for generating the
plots in this paper at github.com/thegregyang/NNspectra.},
added-at = {2020-01-13T17:45:59.000+0100},
author = {Yang, Greg and Salman, Hadi},
biburl = {https://www.bibsonomy.org/bibtex/2d19b4b9b8b278ce8c78cb27c5660f617/kirk86},
description = {[1907.10599] A Fine-Grained Spectral Perspective on Neural Networks},
interhash = {874650c08b9e0c044ccdc0ab01320b28},
intrahash = {d19b4b9b8b278ce8c78cb27c5660f617},
keywords = {deep-learning gaussian-processes kernels},
note = {cite arxiv:1907.10599. Comment: 12 pages of main text, 14 figures, 39 pages including appendix},
timestamp = {2020-01-13T17:45:59.000+0100},
title = {A Fine-Grained Spectral Perspective on Neural Networks},
url = {http://arxiv.org/abs/1907.10599},
year = 2019
}