Abstract
In recent years, gradient-based LSTM recurrent neural networks (RNNs) have
solved many previously RNN-unlearnable tasks. Sometimes, however, gradient
information is of little use for training RNNs, due to numerous local
minima. For such cases we present a novel method: EVOlution of systems
with LINear Outputs (Evolino). Evolino evolves the weights to the
nonlinear, hidden nodes of RNNs while computing optimal linear mappings
from hidden state to output, using methods such as pseudo-inverse-based
linear regression. If we instead use quadratic programming to maximize
the margin, we obtain the first evolutionary recurrent Support Vector
Machines.
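The optimal linear readout can be made concrete with a minimal sketch: collect the hidden activations of the (evolved) RNN into a matrix and solve for the output weights via the Moore-Penrose pseudo-inverse. The matrix names, sizes, and random data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a pseudo-inverse linear readout (assumed setup):
# hidden activations phi(t) are stacked row-wise into Phi, and the
# output weights W minimize the squared error ||Phi @ W - Y||^2.
rng = np.random.default_rng(0)

T, n_hidden, n_out = 50, 8, 1              # time steps, hidden units, outputs
Phi = rng.standard_normal((T, n_hidden))   # stand-in for evolved RNN hidden states
true_W = rng.standard_normal((n_hidden, n_out))
Y = Phi @ true_W                           # synthetic targets a linear map fits exactly

W = np.linalg.pinv(Phi) @ Y                # pseudo-inverse-based linear regression
print(np.allclose(Phi @ W, Y))             # → True: readout reproduces the targets
```

Because the linear solve is exact given the hidden states, evolution only needs to search over the nonlinear hidden weights, which is the division of labor the abstract describes.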
We show that Evolino-based LSTM can solve tasks that Echo State
nets cannot, and achieves higher accuracy in certain continuous function
generation tasks than conventional gradient descent RNNs, including
gradient-based LSTM.