Title: Visually stimulated motor control for a robot with a pair of LGMD visual neural networks
Authors: Shigang Yue; F. Claire Rind
Addresses: School of Computer Science, University of Lincoln, Brayford Pool, Lincoln, LN6 7TS, UK; Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, NE1 7RU, UK
Abstract: In this paper, we propose a visually stimulated motor control (VSMC) system for the autonomous navigation of mobile robots. Inspired by the locust's motion-sensitive interneuron, the lobula giant movement detector (LGMD), the proposed VSMC system enables a robot to explore local paths or interact with dynamic objects effectively using visual input only. The VSMC consists of a pair of LGMD visual neural networks and a simple motor command generator. Each LGMD processes images covering part of the wide field of view and extracts the relevant visual cues. The outputs of the two LGMDs are compared and translated directly into executable motor commands. These commands are then executed by the robot's wheel control system in real time to generate the corresponding motion adjustments. Our experiments show that this bio-inspired VSMC system works well in different scenarios.
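As a rough illustration of the comparison-and-steering idea described in the abstract, the sketch below maps a pair of LGMD excitation values to differential wheel speeds. All names, gains and thresholds are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch, assuming two LGMD-like outputs (one per half of the visual
# field) are compared and mapped directly to differential wheel speeds.
# Gains and thresholds below are placeholders, not values from the paper.

def vsmc_wheel_speeds(lgmd_left: float, lgmd_right: float,
                      base_speed: float = 0.3, turn_gain: float = 0.5,
                      threshold: float = 0.2) -> tuple[float, float]:
    """Map a pair of LGMD excitations (0..1) to (left, right) wheel speeds."""
    # If neither side signals an imminent collision, drive straight ahead.
    if max(lgmd_left, lgmd_right) < threshold:
        return base_speed, base_speed

    # Steer away from the side with the stronger looming response:
    # when the right LGMD is more excited (diff > 0), slow the left wheel
    # and speed up the right wheel so the robot turns toward the clearer left side.
    diff = lgmd_right - lgmd_left
    left_speed = base_speed - turn_gain * diff
    right_speed = base_speed + turn_gain * diff
    return left_speed, right_speed


if __name__ == "__main__":
    # Example: an object looms in the right half of the view;
    # the robot turns left, away from it.
    print(vsmc_wheel_speeds(lgmd_left=0.1, lgmd_right=0.8))
```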
Keywords: visual neural networks; lobula giant movement detector; LGMD; locusts; mobile robots; visually stimulated motor control; visual stimulation; autonomous navigation; robot navigation; robot vision; robot control; wheel control; robot wheels; robot motion; bio-inspired computation.
DOI: 10.1504/IJAMECHS.2012.052219
International Journal of Advanced Mechatronic Systems, 2012 Vol.4 No.5/6, pp.237 - 247
Published online: 30 Aug 2014