Brain-Machine Interfacing

Members: Jorg Conradt, Daniel Lofaro, Michele Rucci, Ryad Benjamin Benosman, Steve Kalik, Timmer Horiuchi, Tobi Delbruck

Organized by Chuck Higgins & Justin Sanchez

See wiki:2010/results/bmi for results of this workgroup.


Invited guests

  1. Jerry Loeb, USC (sensorimotor control)
  2. Frances Richmond, USC
  3. Steve Temple, Georgia Tech
  4. Peter Brunner, Wadsworth Center (tutorials and experiments on EEG)
  5. Anirban "Nir" Dutta, Northwestern (tutorials and experiments on EMG)
  6. Gert Cauwenberghs, UCSD (non-contact EEG brain sensors)

Focus and Goals

This topic area will focus on the interface between living brains and artificial systems, both from a prosthetics perspective and from one of creating hybrid living/nonliving computing systems and robots. This research area is relevant to the workshop both because neuromorphic designs are ideally suited to provide electronic interfaces to bioelectric signals, and because of the key role that a detailed understanding of neural systems plays in the design of neuromorphic systems. Brain-machine interfacing offers tremendous promise as a new paradigm for human-machine interaction and as a vehicle for the discovery and promotion of new computational principles for autonomous and intelligent systems. By seamlessly coupling neural and artificial systems in closed-loop mode, one can study and test the computational principles of intelligent motor control and, more importantly, goal-directed behavior in a hybrid perception-action-reward cycle. Furthermore, discussion and possible demonstration of noninvasive human-wearable brain interfaces will also address the “cognitive” aspect of the workshop. The goals of this topic include:

  1. To inform participants of the state of the art in brain-machine interfacing.
  2. To give participants a “hands-on” experience with live electrophysiology, and an appreciation for the complexity of bioelectric signals.
  3. To provide participants with an introduction to multielectrode array techniques.
  4. To provide tutorials on how to build closed-loop, real-time interfaces that utilize systems neurophysiology from large populations of neurons.
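The closed-loop, real-time interfaces mentioned in goal 4 all share the same basic cycle: sample neural activity, decode it into a command, actuate, and let the subject's (or simulation's) response feed back into the next sample. The sketch below illustrates that cycle with an entirely made-up toy model: the "brain" is two simulated firing-rate channels that respond to cursor error, and the decoder is an arbitrary linear readout. Nothing here corresponds to any real recording rig or decoding method used in the workgroup; it is only a minimal illustration of the loop structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear decoder: maps a 2-channel firing-rate vector to a
# 1-D velocity command. The weights are made up for illustration.
weights = np.array([0.5, -0.5])

def decode(rates):
    """Map firing rates (Hz) to a velocity command (arbitrary units)."""
    return float(weights @ rates)

def run_closed_loop(target, steps=200, dt=0.01):
    """Drive a simulated 1-D cursor toward `target`.

    Toy 'brain': channel 0 fires more when the cursor is left of the
    target, channel 1 when it is right, plus Gaussian rate noise.
    """
    pos = 0.0
    for _ in range(steps):
        err = target - pos
        # Simulated population activity (rates in Hz, clipped at zero)
        rates = np.array([max(0.0, 20 + 40 * err),
                          max(0.0, 20 - 40 * err)])
        rates += rng.normal(0.0, 1.0, size=2)
        v = decode(rates)   # decode a velocity command
        pos += v * dt       # actuate: integrate velocity into position
    return pos

final = run_closed_loop(target=1.0)
print(f"final cursor position: {final:.2f}")
```

Because the decoded command depends on the current cursor position, the loop is self-correcting: the cursor settles near the target despite the rate noise. Real systems replace the toy rate model with multielectrode recordings and the hand-set weights with a calibrated decoder, but the loop skeleton is the same.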


Projects

  1. Tactile sensor project - balancing on a unicycle with tactile biofeedback - Bruce Mortimer
  2. Human-machine interface: probing coupled behavior during a standing balance task - set a bicore oscillator to perturb the ankle joint during bipedal standing, study human-in-the-loop system behavior, and explore powered prosthesis/orthosis applications - Anirban Dutta, Bruce Mortimer, Mark Tilden
  3. Speech decoding from facial EMG - Anirban Dutta, Spencer Kellis
  4. Mind-controlled RC cars - Brandon Carroll, Mohsen Mollazadeh
  5. EEG-controlled omnidirectional robot with tactile feedback - Georgios Petrou, Spencer Kellis, Trushal Chokshi, Jonathan Dyhr, Peter Brunner, Bruce Mortimer

Additional information

Insect-machine interface: Drosophila-on-a-ball (design documents available at http://openwiki.janelia.org/wiki/display/flyfizz/Drosophila-on-a-ball)