
Universal Neuromorphic Devices and Sensors for Real-Time Mobile Robotics

Members: Anahita Mehta, Alejandro Pasciaroni, Jesus Armando Garcia Franco, Bilel Belhadj, Byron Galbraith, Luis Camunas, Cristian Axenie, Christian Denk, Jorg Conradt, Daniel Neil, Dimitra Emmanouilidou, Daniel Mendat, Daniel Rasmussen, Bert Shi, Francisco Barranco, Cornelia Fermuller, Greg Cohen, Garrick Orchard, Hector Jesus Cabrera Villaseñor, Jonathan Tapson, Amir Khosrowshahi, Laxmi Iyer, Luis Plana, Michael Mathieu, Nicolai Waniek, Omid Kavehei, Michael Pfeiffer, Qian Liu, Ryad Benjamin Benosman, Ralph Etienne-Cummings, Shih-Chii Liu, Evangelos Stromatias, Steve Temple, Timmer Horiuchi, Tobi Delbruck, Will Constable, Xavier Lagorce, Zafeirios Fountas

Organizers: Jorg Conradt (TUM), Francesco Galluppi (University of Manchester), Shih-Chii Liu (University of Zurich and ETH Zurich), Ralph Etienne-Cummings (Johns Hopkins University)

Invitees: Tobi Delbruck (University of Zurich and ETH Zurich), Ryad Benjamin Benosman (UPMC), Ashwin Bellur (Univ. of Texas), Garrick Orchard (SINAPSE Institute, Singapore), Luis Camunas (Institute of Microelectronics of Seville, Spain)

Aim and methods

To narrow the large gap between computational neuroscience modelers and system builders, we will provide integrated neuromorphic sensor and computing systems consisting of SpiNNaker boards directly interfaced to neuromorphic sensors on board mobile robots. These systems are remotely accessible at several abstraction levels, ranging from remote control, through programming in C, to neuronal modeling (in PyNN, NEST, NEF/Nengo, or similar neuronal description languages). We thus provide a framework in which anyone can implement and test large neuronal models in real time on mobile robots, without being impeded by a lack of detailed hardware or programming knowledge.
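
As an illustration of the network-level access path, the sketch below sets up a small spiking network in PyNN. It assumes the PyNN 0.8-style API and the pyNN.spiNNaker backend module; all population sizes, neuron models, and parameters are illustrative choices, not values from this project.

    # Minimal PyNN sketch: a Poisson spike source driving integrate-and-fire
    # neurons. All sizes and parameters are illustrative assumptions.
    import pyNN.spiNNaker as sim   # assumed backend; any PyNN backend would do

    sim.setup(timestep=1.0)        # 1 ms simulation timestep

    stim = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0), label="input")
    neurons = sim.Population(100, sim.IF_curr_exp(), label="output")

    sim.Projection(stim, neurons, sim.OneToOneConnector(),
                   synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0))

    neurons.record("spikes")
    sim.run(1000.0)                # one second of simulated time
    spikes = neurons.get_data("spikes")
    sim.end()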

Projects

  • Event based computation and perception (sound localization, object recognition, sensory fusion)
  • SLAM
  • 3D tracking/synchronization using multiple DVS sensors
  • Motor control and planning
  • Neuromorphic sensors/robot/neural hardware integration

Technologies

  • Robotic Platforms: ground/flying robots
  • SpiNNaker 4-chip boards
  • Neuromorphic sensors (DVS, silicon cochlea, convolutional chips, possibly ATIS, ...)

Resources

  • Software (PyNN, Nengo, SpiNNaker, etc.)
  • Tutorials

Application Scenario and Projects

Overview of robots for UNS group

One robotic application that requires complex low-latency perception, cognition, and action is an indoor flying robot that explores the available space and locates (and possibly retrieves) a priori known objects.

We will provide customized quadrotor platforms (based on the Parrot AR.Drone) that feature various neuromorphic sensors and carry a low-latency wireless communication module. The remotely acquired sensor data is made available to stationary computing systems, such as aVLSI chips, SpiNNaker boards, or standard computers, for real-time processing. Abstract motor commands (changes in roll, pitch, yaw, and altitude) can be sent in return to control the robots.
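
To make the return channel concrete, here is a minimal sketch of a command sender in Python. The transport (UDP), the message layout (four little-endian floats), and the address and port are all hypothetical, chosen only for illustration; the actual protocol of the on-robot communication module is not specified here.

    # Hypothetical motor-command sender; all protocol details are assumptions.
    import socket
    import struct

    ROBOT_ADDR = ("192.168.1.10", 5000)   # assumed address of the comm module

    def send_command(sock, roll, pitch, yaw, altitude):
        """Pack the four abstract commands as floats into one UDP datagram."""
        payload = struct.pack("<4f", roll, pitch, yaw, altitude)
        sock.sendto(payload, ROBOT_ADDR)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_command(sock, 0.0, 0.1, 0.0, 0.5)  # pitch forward slightly and climb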

Results

Check the results page.

SpiNNaker

SpiNNaker is a novel computer architecture inspired by the workings of the human brain. It is a digital, configurable, multi-chip, multi-core platform that can be programmed at different levels:

  • Unit Level: using C with the SpiNNaker API to design custom neural and learning algorithms, or more generally to program single units of a parallel, event-driven, real-time system.
  • Network Level: using PyNN to configure network topologies and neural parameters (see  https://capocaccia.ethz.ch/capo/wiki/2010/spinn10).
  • Functional Level: encoding functions in neurons using the Neural Engineering Framework (NEF), a formal method for mapping control-theoretic algorithms onto the connections between populations of spiking neurons (see  http://neuromorphs.net/nm/wiki/ng11/results/Spinnaker); a sketch follows this list.
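
As an illustration of the functional level, the sketch below builds an NEF-style neural integrator in the Python Nengo API. The neuron count, synaptic time constant, and input signal are illustrative assumptions, and the SpiNNaker-specific backend is omitted.

    # Minimal NEF sketch: a neural integrator (dx/dt = u) built from one
    # recurrently connected ensemble. All parameters are illustrative.
    import nengo

    tau = 0.1  # synaptic time constant used to map the dynamics onto neurons

    model = nengo.Network(label="integrator")
    with model:
        stim = nengo.Node(lambda t: 1.0 if t < 0.5 else 0.0)  # step input
        ens = nengo.Ensemble(n_neurons=100, dimensions=1)
        # Standard NEF mapping of an integrator: scale the input by tau and
        # feed the state estimate back through the same synapse.
        nengo.Connection(stim, ens, transform=tau, synapse=tau)
        nengo.Connection(ens, ens, synapse=tau)
        probe = nengo.Probe(ens, synapse=0.01)

    with nengo.Simulator(model) as sim:
        sim.run(1.0)
    # sim.data[probe] now holds the decoded integrator state over time.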

Left: A 48-node SpiNNaker board, equipped with 864 ARM cores (running at 200 MHz) and 6 GByte RAM (128 MB x 48 chips)
Right: Autonomous omni-directional (holonomic) mobile robot with a 48-node SpiNNaker board, equipped with various sensors (e.g., a stereo pair of eDVS silicon retinas)
