Abstract
One approach to invariant object recognition employs a recurrent
neural network as an associative memory. In the standard depiction
of
the network's state space, memories of objects are stored as
attractive fixed points of the dynamics. I argue for a modification
of this picture: if an object has a continuous family of
instantiations, it should be represented by a continuous attractor.
This idea is illustrated with a network that learns to complete
patterns. To perform the task of filling in missing information, the
network develops a continuous attractor that models the manifold from
which the patterns are drawn. From a statistical viewpoint, the
pattern completion task allows a formulation of unsupervised learning
in terms of regression rather than density estimation.
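The core idea of the abstract can be sketched in a toy setting (this example is illustrative, not the paper's network: patterns here are Gaussian bumps on a ring, a one-parameter continuous family, and "completion" regresses the observed entries onto that manifold to fill in the missing ones):

```python
import numpy as np

def bump(center, n=64, width=5.0):
    """A Gaussian bump on a ring of n units: a continuous family of patterns
    parameterized by the bump's center."""
    x = np.arange(n)
    d = np.minimum(np.abs(x - center), n - np.abs(x - center))  # circular distance
    return np.exp(-d**2 / (2 * width**2))

def complete(partial, mask, n=64, candidates=None):
    """Fill in the masked entries by regressing onto the bump manifold:
    choose the bump whose visible part best matches the observed entries,
    then read the missing entries off that bump."""
    if candidates is None:
        candidates = np.linspace(0, n, 256, endpoint=False)
    errs = [np.sum((bump(c, n)[mask] - partial[mask])**2) for c in candidates]
    best = candidates[int(np.argmin(errs))]
    filled = partial.copy()
    filled[~mask] = bump(best, n)[~mask]
    return filled, best

# Hide a stretch of a bump pattern and fill it back in.
n = 64
true_center = 20.3
pattern = bump(true_center, n)
mask = np.ones(n, dtype=bool)
mask[10:30] = False                      # missing region
partial = np.where(mask, pattern, 0.0)   # observed entries only
filled, est = complete(partial, mask, n)
```

Because every pattern lies on a one-dimensional manifold, the visible entries determine the missing ones almost exactly; the completion step is a regression onto the manifold rather than a density estimate over all patterns, mirroring the statistical viewpoint of the abstract.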