
SpiNNaker, Nengo & Robot

Jörg Conradt, Chris Eliasmith, Francesco Galluppi, Terry Stewart

Goal of the Project: to explore the possibility of building networks in Nengo and running them on SpiNNaker on a robot, so as to have an embodied autonomous robotic agent, controlled by a spiking neural network (SNN), performing a task in its environment.

SpiNNaker System

Neural processing is performed by a 4-chip SpiNNaker board, a digital event-driven platform that can interpret incoming events and translate them into neural spike trains. Each SpiNNaker chip is equipped with 1 Gbit of SDRAM and 18 programmable ARM968 cores embedded in a configurable packet-switched asynchronous network-on-chip, based on an on-chip Multicast (MC) Router capable of handling one-to-many communication of spikes (packets) very efficiently, and is linked to 6 neighbouring chips through asynchronous links.

The system is designed to scale up to 65,536 chips (each consuming 1 W maximum) and over a million cores, offering a flexible, power-efficient platform for large-scale real-time modelling. Each SpiNNaker chip natively responds to events occurring in the network, and is therefore able to process information arriving from event-based sensors (or send it to actuators) attached to its asynchronous links.

In this work we use a 4-chip SpiNNaker board, which has a peak consumption of 5 W: 4 W for the chips plus 1 W for the supporting infrastructure.

2D representation

We have expanded the representational power of the NEF on SpiNNaker to 2D inputs/outputs. This makes a large variety of models available for exploration.

The most basic 2D dynamical model to implement is the oscillator (cyclic attractor). The following video shows how to build an oscillator in Nengo, compile and run it on SpiNNaker, and visualize the results back in Nengo.

The video above shows the level of integration of the Nengo/SpiNNaker toolchain by simulating a single oscillator (200 neurons) fed with a pulse at 5 s through a 2D communication channel.

The network is composed of 100 encoding neurons (Population A), 200 neurons in the oscillator stage (Population B), 100 neurons in the communication channel output population and 100 decoding neurons, with firing rates distributed in the range 100-200 Hz. The model uses 5 cores, each modelling 100 neurons.
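As a rough illustration, a 2D oscillator of this kind can be sketched with the Nengo Python API; the time constant, oscillation frequency and wiring below are illustrative assumptions, not the original model's parameters:

    import numpy as np
    import nengo

    tau = 0.1          # synaptic time constant of the recurrent connection
    omega = 2 * np.pi  # target oscillation frequency (1 Hz), an assumption

    with nengo.Network() as model:
        # brief 2D pulse at t = 5 s, as in the video
        stim = nengo.Node(lambda t: [1, 0] if 5.0 < t < 5.1 else [0, 0])
        a = nengo.Ensemble(100, dimensions=2,
                           max_rates=nengo.dists.Uniform(100, 200))  # encoding
        b = nengo.Ensemble(200, dimensions=2,
                           max_rates=nengo.dists.Uniform(100, 200))  # oscillator
        nengo.Connection(stim, a)
        nengo.Connection(a, b)
        # NEF recurrent transform I + tau*A, with A = [[0, -w], [w, 0]]
        nengo.Connection(b, b, synapse=tau,
                         transform=[[1, -omega * tau], [omega * tau, 1]])

    with nengo.Simulator(model) as sim:
        sim.run(10.0)

The recurrent transform implements the linear dynamics dx/dt = Ax under the NEF's standard mapping of dynamics onto a synaptic filter with time constant tau.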


Return Home Behaviour in a robot

The network shown in the figure below has been used to control the robot while switching between two states: exploration and returning home.

network structure for the robot return home behaviour

The current state (behaviour) of the robot is represented in the state population, which acts as an integrator (working memory) able to maintain the current state over time with no additional input. The position integrator population maintains the position of the agent by integrating, in 2D, the neural driving commands sent to the motors.
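A minimal sketch of such an NEF integrator in the Nengo Python API (ensemble sizes and the time constant here are assumptions for illustration):

    import nengo

    tau = 0.1  # recurrent synaptic time constant

    with nengo.Network() as model:
        velocity = nengo.Node([0.0, 0.0])        # 2D driving commands
        position = nengo.Ensemble(500, dimensions=2)
        # dx/dt = u: feed x back unchanged and scale the input by tau
        nengo.Connection(position, position, synapse=tau)
        nengo.Connection(velocity, position, transform=tau, synapse=tau)

With the input scaled by tau and an identity feedback, the represented value accumulates the input over time, which is exactly the working-memory behaviour described above.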

If the robot is in the explore state, it can be driven using commands issued in Nengo; these are sent as 2D values to the explore population on the SpiNNaker board, whose output is relayed to the motors via the laptop's WiFi.

If the robot is in the return home state, the driving commands are produced by the return population, which compensates for the value held in the position integrator by sending opposing drive commands, steering the robot back toward its starting point.
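One way to sketch this state-dependent switching in Nengo is with gated decoded functions; the exact structure of the original model may differ, and the gating functions and ensemble sizes below are assumptions:

    import nengo

    tau = 0.1

    with nengo.Network() as model:
        state = nengo.Ensemble(100, dimensions=1)    # +1 explore, -1 return home
        nengo.Connection(state, state, synapse=tau)  # integrator holds the state

        command = nengo.Node([0.0, 0.0])             # manual commands from Nengo
        motor = nengo.Ensemble(200, dimensions=2)    # decoded driving commands

        # position integrator accumulating the motor commands (cf. sketch above)
        position = nengo.Ensemble(500, dimensions=2)
        nengo.Connection(position, position, synapse=tau)
        nengo.Connection(motor, position, transform=tau, synapse=tau)

        # explore: pass the manual command through only while state > 0
        explore = nengo.Ensemble(300, dimensions=3)
        nengo.Connection(command, explore[:2])
        nengo.Connection(state, explore[2])
        nengo.Connection(explore, motor,
                         function=lambda x: x[:2] if x[2] > 0 else [0, 0])

        # return home: drive with the negated position only while state < 0
        home = nengo.Ensemble(300, dimensions=3)
        nengo.Connection(position, home[:2])
        nengo.Connection(state, home[2])
        nengo.Connection(home, motor,
                         function=lambda x: -x[:2] if x[2] < 0 else [0, 0])

Both gated populations project to the same motor ensemble; since only one gate is open at a time, their summed output is the currently selected behaviour's drive command.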

The robot is controlled by a SpiNNaker board connected to a laptop, which relays messages between the SpiNNaker board and the robot over Ethernet (cable) and WiFi.

The movie shows the robot performing the task as the state is switched from controlled exploration to autonomous return home behaviour. All computation is done in neurons.

The model uses 3000 neurons over 27 cores, firing 11.75 Mspikes/s (~65 Hz mean firing rate).


Return Home Behaviour mediated by a model of Basal Ganglia

The model above is coupled with a Basal Ganglia model (described here) for selecting the state. In this demonstration the selection is controlled manually, but the idea was to have some external trigger (e.g. a bumper sensor on the robot going off) drive the activity of the basal ganglia and hence the selection of the behaviour.
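Nengo ships a reusable basal ganglia action-selection network; a sketch of how such a trigger could select between the two behaviours follows, where the utility values and wiring are assumptions rather than the original model:

    import nengo

    with nengo.Network() as model:
        # utilities of the two actions: [explore, return home];
        # a bumper event would raise the second value above the first
        trigger = nengo.Node([0.8, 0.2])
        bg = nengo.networks.BasalGanglia(dimensions=2)
        thal = nengo.networks.Thalamus(dimensions=2)
        nengo.Connection(trigger, bg.input)
        nengo.Connection(bg.output, thal.input)
        # thal.output would then drive the state population of the model above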

network description for the robot with the BG module

The video shows the model running on SpiNNaker and communicating with the Nengo interface. Note that the position integrator is not entirely stable, and the resulting drift leads the robot to return to the wrong starting position when the model is embodied.

[See the model performing on the robot here]

The model is composed of 5000 neurons and 2.95M synapses on 49 cores. In this model the SpiNNaker board is, for the first time, mounted directly on top of the robot and controls its motors directly. Communication from Nengo to SpiNNaker (driving commands, visualizing neural responses) is done through WiFi. SpiNNaker is therefore able to send commands to and receive inputs from the robot directly.


Integration with the Omnibot

robot integration

The mobile robot used in this project is a custom-developed omni-directional mobile platform of 26 cm diameter, with embedded low-level motor control and elementary sensory systems. An on-board ARM7 microcontroller for robot control receives desired motion commands (translation in x and y, and rotation) through a UART communication interface, and continuously adapts three motor control signals (PD control) to achieve the desired velocities. The robot's integrated sensors include wheel encoders for position estimates, a 9-DOF inertial measurement unit (3-axis accelerometer, 3-axis gyroscope and 3-axis compass) and a simple bump-sensor ring that triggers binary contact switches upon contact with objects in the environment. The integrated battery pack allows up to 8 h of autonomous robot operation; powering the robot and SpiNNaker simultaneously, we estimate an autonomous run time of about 4 h.
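For illustration only, the command side of such a UART interface might look like the following Python sketch using pyserial; the packet layout, port name and units are hypothetical stand-ins, not the Omnibot firmware's actual protocol:

    import struct
    import serial  # pyserial

    # Hypothetical framing: one header byte followed by three signed 16-bit
    # values (vx, vy in mm/s; omega in mrad/s), little-endian. The real ARM7
    # firmware protocol is not documented here.
    def send_motion_command(port, vx, vy, omega):
        port.write(struct.pack('<Bhhh', 0xAA, vx, vy, omega))

    port = serial.Serial('/dev/ttyUSB0', baudrate=115200, timeout=0.1)
    send_motion_command(port, 100, 0, 0)  # drive forward at 100 mm/s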


SpiNNaker communicates with the robot through a small customized interface board with an ARM Cortex microcontroller that translates SpiNNaker packets into robot commands and vice versa. The interface board is currently being improved to allow higher data rates, so that event-based sensory systems (such as silicon retinas or cochleae) can be interfaced directly to SpiNNaker. The overall system is stand-alone and autonomous (no PC in the loop).


Conclusions

The work in this project has built the basis for integrating complex Nengo models onto robots controlled by SpiNNaker by testing various models. It also forms the basis of the SpiNNaker/Robot/Nengo integration shown in the Place Cells on Nengo on SpiNNaker on Omnibot and the Sensing and Escape projects.

