jAER: software spike-event processing

Members: Aleksandrs Ecins, Ashley Kleinhans, Adam McLeod, Ching Teo, Francisco Barranco, Cornelia Fermuller, Jonathan Tapson, Michael Pfeiffer, Ryad Benjamin Benosman, Sergio Davies, Shih-Chii Liu, Tobi Delbruck, Terry Stewart

Organizer: Tobi Delbruck

In this workgroup you can learn about real-time digital signal processing of address-event representation (AER) spike data using the jAER open source software and AER hardware, including a silicon retina and a silicon cochlea.

jAER (Java AER, pronounced jay-er) allows processing the outputs of neuromorphic AER sensors and actuators, combining them with the convenience and low cost of PC computation.

This tutorial introduction will cover the basic software architecture of jAER and show by example how to use jAER with a silicon retina and cochlea and how to develop an event processing method. We plan to present jAER in three sessions and then leave the rest of the learning to the projects where it might be used.

Tutorial Material

  1. See attachment:"ABCs of jAER.pdf" for the basic introduction to jAER processing.
  2. See attachment:jAERSensorsTelluride2011.pdf for an introduction to event-based sensors.
  3. See attachment:JAER-762011.pdf for more notes on building an event filter, in this case the class MedianTrackerTell2011; a minimal filter skeleton is sketched just after this list.
  4. See the javadoc zip files attached below for JOGL (Java OpenGL) and for Java 6 itself. To add this javadoc in netbeans, choose Tools/Libraries, make a new library, name it e.g. JOGL, add the jogl.jar class archive for the classpath, and add the zip archive for the javadoc. For Java 6 documentation, select Tools/Java Platforms and add the java 6 javadoc archive from the attachments.
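
As a companion to item 3, here is a minimal sketch of an event filter, assuming the jAER EventFilter2D API as of 2011; the imports, class names, and method signatures should be checked against an existing filter such as MedianTracker in your checkout before building on it. This hypothetical filter just maintains a running mean of event locations and passes the packet through unchanged.

    import net.sf.jaer.chip.AEChip;
    import net.sf.jaer.event.BasicEvent;
    import net.sf.jaer.event.EventPacket;
    import net.sf.jaer.eventprocessing.EventFilter2D;

    // Minimal event-filter skeleton: tracks the running mean location of events.
    public class MeanLocationTracker extends EventFilter2D {

        private float meanX = 0, meanY = 0;
        private float mixingFactor = 0.01f; // how quickly the mean follows new events

        public MeanLocationTracker(AEChip chip) {
            super(chip);
            setPropertyTooltip("mixingFactor", "update rate of the running mean location");
        }

        @Override
        public EventPacket<?> filterPacket(EventPacket<?> in) {
            for (BasicEvent e : in) { // iterate over all events in the packet
                meanX = (1 - mixingFactor) * meanX + mixingFactor * e.x;
                meanY = (1 - mixingFactor) * meanY + mixingFactor * e.y;
            }
            return in; // pass the events through unchanged; only the tracker state updates
        }

        @Override
        public void resetFilter() {
            meanX = meanY = 0;
        }

        @Override
        public void initFilter() {
            resetFilter();
        }

        // Getter/setter pair exposes mixingFactor as a control in the filter GUI.
        public float getMixingFactor() { return mixingFactor; }

        public void setMixingFactor(float mixingFactor) { this.mixingFactor = mixingFactor; }
    }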

Developer rights to jAER project

If you will develop code in one of the projects or work with others on code:

  1. Go to https://sourceforge.net/user/registration
  2. Please register at SourceForge with your real name
  3. Click on the link in the confirmation email that you will receive to complete registration
  4. Send me your unix username
  5. I will then add you as a developer with commit rights
  6. Sign up for the subversion tutorial if you are not experienced using subversion.

The URL you need to use in subversion to browse or check out the jAER repository is  https://jaer.svn.sourceforge.net/svnroot/jaer/trunk .
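
For example, the corresponding checkout command (the standard subversion checkout; the target folder name jaer here is just a suggestion) is:

    svn checkout https://jaer.svn.sourceforge.net/svnroot/jaer/trunk jaer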

If you will be doing a project using jAER,

  1. A full checkout of jAER from SourceForge takes a long time at the workshop: the repository is about 90 MB and the checkout takes about 30 minutes even at off-peak times. To speed up your install, download the zip file of the working copy checkout of jAER from the attachments below.
  2. Unzip the zip somewhere. You now have a subversion "working copy", but it may not be up to date.
  3. Install a subversion client if you are running windows (TortoiseSVN - see attachments below) or mac OS (Versions). See wiki:2011/svn11.
  4. You should now be able to quickly bring your working copy up to date: do a subversion update on your working copy of jAER.
  5. You should be able to run jAER from the root of the extracted folder. Try running jAERViewer.exe (under Windows) or jAERViewer.sh (under Linux or Mac OS X). If you get an error like "Class not found", you may also need to install or update your Java runtime; jAER needs at least Java 1.6.
  6. Under Windows, you can get the JDK (Java development kit) and the netbeans IDE (integrated development environment) by installing the jdk-netbeans bundle attached below. For a different platform, go to http://netbeans.org for netbeans installers. Choose the Java SE (standard edition) download.

If you have trouble at some point, please ask Tobi Delbruck for help.

Drivers and Setup

Driver locations for the retina and cochlea: ...\jaer\drivers\windows\driverDVS_USBAERmini2\...

For the cochlea, build and run the project. Select: USB>Set default firmware for blank device

Click "Choose," navigate to ...\jaer\DeviceFirmwarePCBLayout\CypressFX2\Firmware_FX2LP_Cochleaams1b\firmwareFX2_Cochleaams1b.bix

To get started, once you build and run the project, select View>Biases

From there select File>Load Settings

The file select dialog box should be in ...\jaer\biasgenSettings

For the retina, select DVS128Fast.xml or DVS128Slow.xml. For the cochlea, select CochleaAMS1bBoard1.xml.


  • Under mac OS, the default choice for the JRE (Java runtime environment) is Java 1.5; you need to change the platform used in netbeans. Ask Tobi Delbruck or 'vaibhav' for help here.
  • JOGL: The Java Open GL libraries used in jAER are part of the jAER checkout. But some of the native code JOGL modules for other platforms may not be up to date. We may need to refresh these to the latest JOGL.
  • See  http://jaer.wiki.sourceforge.net/jAER+installation for more tips about installing jAER. Do not download the jAER package from sourceforge - we are using subversion working copies for development.
  • The most frequent problem is that native libraries such as those associated with JOGL are not found. The folders containing the .dll's (Windows), .so's (Linux), or .jnilib's (Mac OS X) should be listed in order in the JVM property java.library.path. There should be only one definition of java.library.path, with all folders in sequence separated by the system-dependent path separator: ";" (semicolon) under Windows, ":" (colon) under Linux. An example for Windows is sketched just after this list.
  • The example definition below assumes the startup folder is java, not the root of jaer; the paths to the jars are then relative to the java startup folder.
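
A minimal sketch of such a definition for Windows follows; the folder names are placeholders, so substitute the actual native-library folders from your working copy (for example the JOGL natives folder under jars):

    -Djava.library.path=jars\<jogl-natives-windows>;jars\<other-natives-windows>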

Path and file separators

OS          Path separator   File separator
Windows     ; (semicolon)    \ (backslash)
Mac OS X    : (colon)        / (forward slash)
Linux       : (colon)        / (forward slash)

For background information:

  1. Read the paper Frame-free dynamic digital vision to see how spikes from the DVS silicon retina are processed in jAER.
  2. Have a look at the jAER wiki on SourceForge, which has lots of information about jAER.

Using matlab

As described on the jAER wiki, you can read logged event data into matlab or generate events in matlab to process with jAER.

You need to add the folder host/matlab from the jAER root to your matlab search path to use the loadaerdat.m and saveaerdat.m functions in matlab. These functions load and save raw address/timestamp data. To extract the actual pixel addresses you can use the functions in the various subfolders.

jAER developer setup for environments other than netbeans

(For experts). Your jAER launcher needs to include something like the following set of switches to set up the JNI path and other options. These are normally part of your IDE's project options for launching classes.

-Djava.util.logging.config.file=conf/Logging.properties -Dsun.java2d.opengl=false  
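
A complete launch command then also needs the java.library.path discussed above, the classpath, and the main class. The following is only a sketch with placeholders, assuming a Windows working copy; the actual classpath, native-library folders, and main class name (net.sf.jaer.JAERViewer is assumed here) should be taken from the jAERViewer.exe/jAERViewer.sh launchers in your checkout:

    java -cp <jAER-classpath> -Djava.library.path=<native-library-folders> -Djava.util.logging.config.file=conf/Logging.properties -Dsun.java2d.opengl=false net.sf.jaer.JAERViewer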


See attachment:JAER-762011.pdf for notes on the initial project discussions. The planned projects are:

  1. Linux USB driver supporting bias control and silicon cochlea: Tobi Delbruck, 'roi' will work on improving the Linux native USB driver and will test it on a Linux VM running on Mac OS X.
  2. Labyrinth game: Tobi Delbruck will continue work on the labyrinth robot (see below).
  3. Ego-motion estimation: 'cornelia', Sam Fok, Tobi Delbruck, Francesco Galluppi will work on implementing some form of ego-motion (camera motion) estimation using local optical flow and Cornelia's ideas as published in the papers attached below.
  4. Nengo interface to jAER: 'siddarth', Sam Fok, 'roi' will work on interfacing jAER to Nengo to use the cochlea and retina in two projects. This interface will be based on UDP between jAER and Nengo in wiki:2011/nengo11, so that we can send live or recorded retina events directly to a running simulation of a network designed with the NEF. jAER already implements UDP output and input, so this project will mainly lie on the Nengo side; a minimal sketch of the receiving side is shown after this list. Existing python code in jaer in the host/python folder reads jaer output.
  5. Predicting sound from vision: Shih-Chii Liu, 'jon tapson' will work on predicting sound from optical flow in the case of a clapping person with visual distractors. The sample data distributed on the USB stick and attached below contains a recording of this scenario (JAERViewer-2011-05-06T16-11-00+0200 jon all scenarios.aeidx is the "index" file that launches synchronized playback of the DVS and AER-EAR recordings).
  6. Event-based processing implemented on an FPGA platform: 'Ravi Shekar' will work on implementing some kind of event-based processing on an FPGA platform using retina input. Papers from the groups of Anton Civit and Alejandro Linares-Barranco on this approach will be attached below.
  7. Cochlea auditory feature filters: Shih-Chii Liu, 'roi', 'trevor' will work on implementing some auditory feature filters such as up/down swing from  AER-EAR silicon cochlea output.
  8. Arm tracking: Troy Lau, Michael Pfeiffer will work on arm tracking using DVS silicon retina.
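
Since project 4 builds on jAER's UDP output, the following Java sketch shows the general shape of a receiver on the Nengo (or any other) side. The port number and packet layout are assumptions, here an int32 sequence number followed by int32 raw address/timestamp pairs in network byte order; check the settings of jAER's unicast UDP output for the actual format before relying on this.

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.nio.ByteBuffer;

    // Minimal sketch of a receiver for jAER's UDP AER output.
    // The port and packet layout below are assumptions; match them to the
    // jAER unicast output configuration.
    public class AEUdpReceiverSketch {
        public static void main(String[] args) throws Exception {
            int port = 8991; // hypothetical port; set it to whatever jAER is configured to send to
            DatagramSocket socket = new DatagramSocket(port);
            byte[] buf = new byte[65536];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            while (true) {
                socket.receive(packet);
                ByteBuffer b = ByteBuffer.wrap(packet.getData(), 0, packet.getLength()); // network byte order by default
                int sequenceNumber = b.getInt(); // assumed leading packet sequence number
                while (b.remaining() >= 8) {
                    int address = b.getInt();    // raw AER address; decode x/y/polarity per chip class
                    int timestamp = b.getInt();  // event timestamp (microseconds)
                    // hand (address, timestamp) to the Nengo model or other processing here
                }
            }
        }
    }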

Other potential projects for this practical tutorial are listed under the topic areas, e.g. wiki:2011/cogrob11, wiki:2011/learn11, wiki:2011/att11.

More about Labyrinth


The Labyrinth project started at the 2011 CapoCaccia workshop. The labyrinth game takes hundreds of hours of human effort to master, yet is controlled using just two parameters (the tilt of the table). It requires fast and precise visual reactions and learned control to navigate the steel ball bearing through the treacherous maze. The plan is to build a robotic labyrinth player based around the Brio labyrinth, which has 3 different mazes of increasing difficulty. We'll use the DVS silicon retina or the ATIS, combined with the highly developed trackers in jAER, to track the ball, and we already have a handy servo controller board interfaced to jAER that we will use to control the servos that tilt the table. The first video below shows the setup. Some Danish students already did a lovely job on a robotic labyrinth and we will try to outdo them. The labyrinth game played at CapoCaccia using the PID controller and cluster tracker is shown in the second video.