
Motion Saliency And Selection

This project aims to create a saliency map that responds selectively to the motion of objects with a particular speed, direction and size. Cells selective for direction, speed and size are found in visual area MT (Simoncelli and Heeger, 1998), and their selectivity can be described by their spatial and temporal frequency responses. We approximate these cells with spatiotemporal filters implemented in software with a similar sensitivity, constructed using the formulation of Etienne-Cummings, Van der Spiegel and Mueller (1999).
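
As a rough illustration of the idea only (not the exact formulation of Etienne-Cummings et al. or the code used in this project), the sketch below builds a drifting-Gabor spatiotemporal filter whose orientation, carrier frequency and drift rate set the preferred direction, size and speed; the function name, parameter names and default values are all assumptions.

# Minimal sketch of a direction-, speed- and size-tuned spatiotemporal filter.
# This is an illustrative drifting-Gabor energy model, not the project's
# implementation; all parameters here are assumed values.
import numpy as np

def motion_energy(video, theta, speed, f_s=0.1, sigma_s=4.0, sigma_t=2.0,
                  ksize=15, tsize=7):
    """video: (T, H, W) float array. Returns a (T, H, W) motion-energy map."""
    # Spatial grid, with x' rotated to point along the preferred direction.
    r = np.arange(ksize) - ksize // 2
    y, x = np.meshgrid(r, r, indexing="ij")
    xr = x * np.cos(theta) + y * np.sin(theta)
    t = (np.arange(tsize) - tsize // 2)[:, None, None]

    # Gaussian spatiotemporal envelope; sigma_s sets the preferred size/scale.
    envelope = (np.exp(-(x**2 + y**2) / (2 * sigma_s**2))
                * np.exp(-t**2 / (2 * sigma_t**2)))
    # Carrier drifting at 'speed' pixels/frame along the preferred direction.
    phase = 2 * np.pi * f_s * (xr - speed * t)
    h_even = envelope * np.cos(phase)   # quadrature pair
    h_odd = envelope * np.sin(phase)

    # FFT-based 3-D convolution (circular, kernel anchored at the origin;
    # no boundary handling or re-centering, adequate for a sketch).
    shape = video.shape
    V = np.fft.fftn(video, s=shape)

    def conv(h):
        return np.real(np.fft.ifftn(V * np.fft.fftn(h, s=shape)))

    # Oriented motion energy: sum of squared quadrature responses.
    return conv(h_even)**2 + conv(h_odd)**2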

The project can be extended to compute multiple saliency maps in parallel, each sensitive to objects of a different speed, size and/or direction. Competition between the maps can then be used to estimate the size and image-plane velocity of an object; alternatively, if the size and speed of an object are already known, the system can be tuned to those characteristics so that such an object is quickly detected and located (see the sketch below). The attached presentation gives a brief description of the theory and equations, along with two example videos showing results.
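
Building on the sketch above, the following illustrates one plausible way a bank of tuned maps could compete; the particular choice of directions and speeds and the per-pixel argmax competition are assumptions for illustration, not the project's actual scheme.

# Sketch of parallel speed/direction-tuned maps with per-pixel competition.
# Uses motion_energy() from the previous sketch; the filter bank below is an
# assumed example, not the project's configuration.
import numpy as np

directions = np.deg2rad([0, 45, 90, 135, 180, 225, 270, 315])  # assumed bank
speeds = [0.5, 1.0, 2.0, 4.0]                                   # pixels/frame

def velocity_saliency(video):
    maps, labels = [], []
    for th in directions:
        for sp in speeds:
            maps.append(motion_energy(video, theta=th, speed=sp))
            labels.append((th, sp))
    stack = np.stack(maps)             # (channels, T, H, W)
    winner = np.argmax(stack, axis=0)  # per-pixel winning channel index
    saliency = np.max(stack, axis=0)   # strength of the winning channel
    return saliency, winner, labels

# If the target's speed and size are known in advance, a single tuned map
# (one motion_energy call) already serves as the saliency map: its maxima
# locate objects matching those characteristics.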

References

Eero P. Simoncelli and David J. Heeger, "A model of neuronal responses in visual area MT," Vision Research, vol. 38, no. 5, pp. 743-761, March 1998. DOI: 10.1016/S0042-6989(97)00183-1. http://www.sciencedirect.com/science/article/B6T0W-3WTP136-1Y/2/a99e2193022d112efa24e5808c0ecfa8

Ralph Etienne-Cummings, Jan Van der Spiegel, and Paul Mueller, "Hardware Implementation of a Visual-Motion Pixel Using Oriented Spatiotemporal Neural Filters," Departmental Papers (ESE), 1999. Available at: http://works.bepress.com/jan_vanderspiegel/1
