Abstract
A computational model of hippocampal activity during spatial cognition
and navigation tasks is presented. The spatial representation in
our model of the rat hippocampus is built on-line during exploration
via two processing streams. An allothetic, vision-based representation
is built by unsupervised Hebbian learning that extracts spatio-temporal
properties of the environment from visual input. An idiothetic representation
is learned from internal movement-related information provided
by path...
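The unsupervised Hebbian learning mentioned above can be sketched as a toy example. This is only an illustration of the general Hebbian update rule (weight change proportional to pre- and postsynaptic activity, with normalization to bound growth); the feature count, learning rate, and normalization scheme are assumptions, not the paper's actual architecture.

```python
import numpy as np

# Illustrative Hebbian sketch (not the paper's model): one linear
# "place unit" strengthens its weights to co-active visual features.
rng = np.random.default_rng(0)

eta = 0.1                          # learning rate (assumed)
w = rng.normal(0.0, 0.1, size=5)   # weights from 5 hypothetical visual features

for _ in range(50):
    x = rng.random(5)              # hypothetical visual input pattern
    y = w @ x                      # postsynaptic activity (linear unit)
    w += eta * x * y               # Hebbian update: dw = eta * pre * post
    w /= max(np.linalg.norm(w), 1e-9)  # normalize to keep weights bounded

print(w)
```

With the normalization step, repeated updates keep the weight vector on the unit sphere while rotating it toward directions of high input correlation, which is the basic mechanism by which a Hebbian unit comes to encode statistical structure of its input.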