The Spike-Based Cognitive Computing: Seeing, Hearing, and Thinking with Spikes workgroup (aka Spiking Hardware) worked on hardware projects incorporating spiking neurons. Two hardware platforms were brought to the workgroup: the ELM chip and the TrueNorth chip. In addition, several attendees brought their own spiking sensors and other devices, ranging from FPGAs to robots. The group pursued an eclectic list of hardware projects, as well as projects to record data from spiking sensors in an effort to create datasets suited to neuromorphic hardware. Finally, the workgroup interacted closely with the NLP workgroup, implementing related projects and providing supporting hardware.


Neural Networks for Natural Language Processing: Using Word2Vec on TrueNorth
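
The Word2Vec side of this project rests on comparing word vectors by cosine similarity. The sketch below illustrates only that measure, in pure Python with made-up toy vectors; the actual mapping of embeddings onto TrueNorth is not shown.

```python
import math

# Toy vectors standing in for trained Word2Vec embeddings.
# (Hypothetical values, for illustration only.)
vectors = {
    "king":  [0.8, 0.3, 0.1],
    "queen": [0.7, 0.4, 0.2],
    "apple": [0.1, 0.9, 0.8],
}

def cosine(u, v):
    """Cosine similarity, the standard Word2Vec relatedness measure."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically close words score higher than unrelated ones.
assert cosine(vectors["king"], vectors["queen"]) > cosine(vectors["king"], vectors["apple"])
```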

MNIST ATIS: Classify MNIST digits captured live with an ATIS retinal camera connected to TrueNorth for processing, with the network trained in Caffe
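
Feeding a frame-based dataset like MNIST into spiking hardware typically requires converting pixel intensities into spike trains. The sketch below shows a deterministic rate-coding scheme as one illustrative option; a real pipeline would emit individually timed events, and the ATIS camera produces change events natively rather than frames.

```python
def rate_code(pixels, max_rate=100, window_ms=100):
    """Convert pixel intensities (0-255) to spike counts over a time window.

    Brighter pixels map to higher firing rates, so they produce more
    spikes within the window. A simplified stand-in for the stochastic
    rate coding often used with spiking hardware.
    """
    counts = []
    for p in pixels:
        rate_hz = (p / 255.0) * max_rate
        counts.append(round(rate_hz * window_ms / 1000.0))
    return counts

# A bright pixel yields more spikes in the window than a dim one.
print(rate_code([0, 128, 255]))  # -> [0, 5, 10]
```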

Speech Recognition using Cochlea data

Speech Recognition using Cochlea+ELM IC

Speech Recognition using Cochlea+TrueNorth: Sparse Representations on Spikes

Pedestrian Dataset: Acquire and label a spiking dataset with the DAVIS retinal camera

Spiking Sensor Interface to TrueNorth via UDP: ATIS, DAVIS, Cochlea, Sonar, Radar
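
Streaming spike events over UDP comes down to packing address-event records into datagrams. The loopback sketch below illustrates the idea with a hypothetical event format (32-bit timestamp, 16-bit x/y addresses, 8-bit polarity); the actual framing used by these sensors and by TrueNorth may differ.

```python
import socket
import struct

# Hypothetical packed event layout: timestamp (us), x, y, polarity.
EVENT_FMT = "!IHHB"

def pack_events(events):
    """Serialize (t, x, y, polarity) tuples into a UDP payload."""
    return b"".join(struct.pack(EVENT_FMT, t, x, y, p) for t, x, y, p in events)

def unpack_events(payload):
    """Recover the event tuples from a received datagram."""
    size = struct.calcsize(EVENT_FMT)
    return [struct.unpack(EVENT_FMT, payload[i:i + size])
            for i in range(0, len(payload), size)]

# Loopback demonstration: one socket plays the sensor, the other the chip side.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))                 # OS picks a free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

events = [(1000, 12, 34, 1), (1500, 13, 34, 0)]
tx.sendto(pack_events(events), rx.getsockname())
payload, _ = rx.recvfrom(4096)
assert unpack_events(payload) == events
tx.close(); rx.close()
```

UDP is a natural fit here because spike streams tolerate occasional loss better than the head-of-line blocking a TCP connection would introduce.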

Lip Reading via Sensory Fusion: Acquire a multi-modal dataset to improve speech recognition from a silicon cochlea using data from a silicon retina (DVS or DAVIS sensor)

Localization Using Remote Sensing: Localization of a mobile robot by using a spiking neural network to classify signals from sonar and radar sensors

Finite State Machines (FSM) on TrueNorth: Implement finite state machines with spiking neurons
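
A common scheme for spiking FSMs is to represent each state as a "state neuron", with exactly one active per step, and to let input spikes gate which neuron fires next. The sketch below simulates that transition logic in plain Python with a hypothetical state/symbol set; the actual TrueNorth core wiring is not shown.

```python
# Transition "synapses": (current state, input symbol) -> next state.
# States and symbols here are a made-up example.
transitions = {
    ("idle", "start"): "running",
    ("running", "stop"): "idle",
    ("running", "pause"): "paused",
    ("paused", "start"): "running",
}

def step(state, symbol):
    """Advance the FSM; if no transition is wired, the state neuron
    simply keeps firing (self-loop)."""
    return transitions.get((state, symbol), state)

state = "idle"
for sym in ["start", "pause", "start", "stop"]:
    state = step(state, sym)
print(state)  # -> idle
```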

Mapping Neural Populations to Cores