In biology, the Vestibulo-Ocular Reflex produces eye movements that compensate for head movements, keeping the image stable on the center of the visual field. This project attempts to perform image stabilization using data obtained from an Inertial Measurement Unit (IMU) attached to an eDVS silicon retina. We then run optical flow estimation on the stabilized image to investigate the improvement obtained.


An eDVS silicon retina attached to a Pushbot was used for the experiment. A 9-DOF IMU (3-axis accelerometer, 3-axis gyroscope, and 3-axis magnetometer; models unspecified) is present on the Pushbot. Because the prototype hardware was not calibrated by the manufacturer, only the gyroscope was used in this experiment.

Spinning dots were presented to the eDVS as visual stimuli while the Pushbot was rotated about its various axes. The gyroscope was sampled at 100 Hz, and the eDVS camera was programmed to output visual events with 32-bit timestamps. All data was recorded for offline analysis.

Rotational movement was obtained by integrating the gyroscope readings. Through linear interpolation, the relative rotation between the initial orientation and the orientation at the time each visual event was received can be computed. Each event is then mapped to its location in the visual field with respect to the initial orientation of the camera, yielding a stabilized image. Note that the gyroscope output has no specified units; the scaling constants were instead obtained by trial and error.
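A minimal sketch of this compensation step in Python with NumPy (this is not the actual gyro_compensate.m; the array layout, the axis ordering, the 128x128 optical center, and the small-angle shift-plus-rotate model are all assumptions):

```python
import numpy as np

def stabilize_events(ev_t, ev_x, ev_y, gyro_t, gyro_w,
                     k_yaw=1.0, k_pitch=1.0, center=(64, 64)):
    """Remap DVS events back to the camera's initial orientation.

    ev_t, ev_x, ev_y : event timestamps and pixel coordinates
    gyro_t           : gyroscope sample times (100 Hz in the experiment)
    gyro_w           : (N, 3) raw angular-rate samples [yaw, pitch, roll]
    k_yaw, k_pitch   : trial-and-error scale constants (sensor is uncalibrated)
    """
    # Integrate angular rate -> cumulative rotation since the first sample.
    dt = np.diff(gyro_t, prepend=gyro_t[0])
    angle = np.cumsum(gyro_w * dt[:, None], axis=0)      # (N, 3)

    # Linearly interpolate the rotation at each event's timestamp.
    yaw   = np.interp(ev_t, gyro_t, angle[:, 0])
    pitch = np.interp(ev_t, gyro_t, angle[:, 1])
    roll  = np.interp(ev_t, gyro_t, angle[:, 2])

    # Small-angle compensation: yaw/pitch shift the image, roll rotates
    # it about the optical center.
    x = ev_x - k_yaw * yaw
    y = ev_y - k_pitch * pitch
    cx, cy = center
    c, s = np.cos(-roll), np.sin(-roll)
    xs = cx + c * (x - cx) - s * (y - cy)
    ys = cy + s * (x - cx) + c * (y - cy)
    return xs, ys
```

Interpolating per event, rather than snapping each event to the nearest 100 Hz gyro sample, matters because the event timestamps are much finer-grained than the gyroscope sampling.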

An H-first based optical flow algorithm was run on the stabilized image to extract the direction and speed of the visual events. The algorithm has a resolution of 8 directions and 8 speeds.
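To illustrate what that output resolution means, the following sketch quantizes a flow vector into one of 8 directions and 8 speed bins. This is only an illustration of the output format, not the H-first algorithm itself, and the speed thresholds are assumed:

```python
import numpy as np

# Assumed speed-bin edges (px/s); the real algorithm's bins are not specified.
SPEED_BINS = np.array([1, 2, 4, 8, 16, 32, 64, 128], dtype=float)

def quantize_flow(vx, vy, speed_bins=SPEED_BINS):
    """Map a flow vector (vx, vy) to a direction index 0..7 (45 deg apart)
    and a speed index 0..7, mirroring the 8x8 output resolution."""
    speed = np.hypot(vx, vy)
    angle = np.arctan2(vy, vx) % (2 * np.pi)
    direction = int(np.round(angle / (np.pi / 4))) % 8
    speed_idx = min(int(np.searchsorted(speed_bins, speed, side='right')), 7)
    return direction, speed_idx
```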


Case 1: Horizontal compensation

Example animation of left-right movement

Case 2: Vertical compensation

Example animation

Case 3: Rotational compensation

Example animation of rotational compensation

Case 4: All-axis rotation

Example animation of all-axis rotation

In the examples above, the left-right, up-down, and rotational movements were compensated. In the composite images, the optical flow output was mapped back onto the un-stabilized raw input. Apparent motion due to egomotion was significantly reduced when the optical flow algorithm was applied to the stabilized image rather than the raw data.

Future work

It is apparent that the gyroscope alone cannot compensate for translational movement. In addition, although the field of view was compensated, events caused by egomotion were still generated, since the sensor itself still moved relative to the scene and the remapping only relocates events after the fact. A mechanical implementation that physically counter-rotates the camera may therefore be more effective than a purely software solution.



1. Using the IMU data, re-map spikes to achieve image stabilization (gyro_compensate.m).
2. Run Garrick's optical flow algorithm (request from Garrick if interested).
3. (Optional) Remove artifacts by cancelling out very slow-moving points (filter_speed.m).
4. Re-map spikes back to their original positions, mainly for visualization (reconstruct.m).
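The contents of filter_speed.m are not shown here; as a guess at what step 3 might look like (the 'speed' field name and the index threshold are assumptions), in Python:

```python
import numpy as np

def filter_speed(flow_events, min_speed_idx=1):
    """Drop flow events whose quantized speed index (0..7 from the
    optical-flow stage) falls below a threshold, removing the very
    slow-moving residual points mentioned in step 3."""
    return flow_events[flow_events['speed'] >= min_speed_idx]
```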

Other utility files

1. Tool for capturing and controlling the Pushbot (Pushbot_1.zip).
2. MEX tool for quickly parsing the offline data (parse_raw.m + parse_pushbot.c).
3. Visualization (dvs_display_all.m).