Spatial Representation and Navigation in a Bio-inspired Robot

D. Sheynikhovich, R. Chavarriaga, T. Strösslin and W. Gerstner (2005)


Lecture Notes in Computer Science, Volume 3575 (2005)
Publisher: Springer-Verlag GmbH
ISSN: 0302-9743
Book title: Biomimetic Neural Learning for Intelligent Robots: Intelligent Systems, Cognitive Robotics, and Neuroscience
Editors: Stefan Wermter, Günther Palm, Mark Elshaw
ISBN: 3-540-27440-5
Book DOI: 10.1007/b139051
Chapter: p. 245
Chapter DOI: 10.1007/11521082_15
Online date: August 2005


A biologically inspired computational model of rodent representation-based (locale) navigation is presented. The model combines visual input in the form of realistic two-dimensional grey-scale images with odometer signals to drive the firing of simulated place and head direction cells via Hebbian synapses. The space representation is built incrementally and on-line, without any prior information about the environment, and consists of a large population of location-sensitive units (place cells) with overlapping receptive fields. Goal navigation is performed using reinforcement learning in continuous state and action spaces, where the state space is represented by the population activity of the place cells. The model is able to reproduce a number of behavioral and neurophysiological data on rodents. Performance of the model was tested on both simulated and real Khepera mobile robots in a set of behavioral tasks and is comparable to the performance of animals in similar tasks.
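The on-line Hebbian formation of place cells described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the visual dimensionality, cell count, learning rate, and the Oja-style normalised Hebbian rule are all assumptions chosen for the sketch, and the rectified-linear drive stands in for whatever activation the full model uses.

```python
import numpy as np

rng = np.random.default_rng(0)

N_VISUAL = 64   # dimensionality of the visual feature vector (assumption)
N_PLACE = 100   # number of simulated place cells (assumption)
ETA = 0.01      # Hebbian learning rate (assumption)

# Synaptic weights from visual input to place cells, learned on-line
# without prior knowledge of the environment.
W = rng.normal(scale=0.1, size=(N_PLACE, N_VISUAL))

def place_activity(visual):
    """Population activity of the place cells for the current view
    (rectified linear drive through the Hebbian synapses)."""
    return np.maximum(W @ visual, 0.0)

def hebbian_update(visual):
    """One on-line Hebbian step: strengthen synapses between co-active
    visual inputs and place cells. An Oja-style decay term keeps the
    weights bounded (a common choice; the paper's exact rule may differ)."""
    global W
    r = place_activity(visual)
    W += ETA * (np.outer(r, visual) - r[:, None] ** 2 * W)

# Demo: repeated exposure to the same view strengthens the response of
# the place cells that this view drives.
view = rng.random(N_VISUAL)
before = place_activity(view)
for _ in range(50):
    hebbian_update(view)
after = place_activity(view)
```

In a navigating agent, each stored view would be tied to a location, so cells tuned to similar views fire over overlapping regions of space; the resulting population activity is what the reinforcement-learning stage would use as its continuous state representation.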