Article

Deep Learning with Eigenvalue Decay Regularizer

Oswaldo Ludwig.
arXiv, April 2016. arXiv:1604.06985.

Abstract

This paper extends our previous work on regularization of neural networks using Eigenvalue Decay by employing a soft approximation of the dominant eigenvalue, which enables the calculation of its derivatives with respect to the synaptic weights and therefore the application of back-propagation, a primary requirement for deep learning. Moreover, we extend our previous theoretical analysis to deep neural networks and multiclass classification problems. Our method is implemented as an additional regularizer in Keras, a modular neural networks library written in Python, and evaluated on the benchmark data sets Reuters Newswire Topics Classification, the IMDB database for binary sentiment classification, and the MNIST database of handwritten digits.
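
The regularizer described in the abstract can be sketched as a custom Keras weight regularizer. The snippet below is a minimal illustration, not the authors' implementation: it approximates the dominant eigenvalue of W^T W with a few power-iteration steps as a differentiable stand-in for the paper's soft approximation, and the class name EigenvalueDecay, the penalty coefficient k, and the iteration count are illustrative assumptions.

    import tensorflow as tf

    class EigenvalueDecay(tf.keras.regularizers.Regularizer):
        """Penalizes an approximation of the dominant singular value of a weight matrix."""

        def __init__(self, k=0.1, iterations=9):
            self.k = k                    # penalty weight (assumed value, not from the paper)
            self.iterations = iterations  # number of power-method steps (assumed value)

        def __call__(self, w):
            wtw = tf.matmul(w, w, transpose_a=True)           # W^T W, symmetric PSD
            v = tf.ones((tf.shape(w)[-1], 1), dtype=w.dtype)  # initial vector for power iteration
            for _ in range(self.iterations):                  # power iteration toward the dominant eigenvector
                v = tf.matmul(wtw, v)
                v = v / (tf.norm(v) + 1e-12)
            # Rayleigh quotient: differentiable estimate of the dominant eigenvalue of W^T W
            lam = tf.squeeze(tf.matmul(tf.matmul(v, wtw, transpose_a=True), v))
            return self.k * tf.sqrt(lam)                      # decay the dominant singular value

        def get_config(self):
            return {"k": self.k, "iterations": self.iterations}

    # Example use on one layer of a classifier:
    # tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=EigenvalueDecay(k=0.1))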
