Human Cognition: Decoding Perceived, Attended, Imagined Acoustic Events and Human-Robot Interfaces

Members: Alain de Cheveigné, Erik Billing, Chetan Singh Thakur, Daniel Neil, Dorothee Arzounian, David Karpul, Edmund Lalor, Estela Bicho, Greg Cohen, Giovanni Di Liberto, Guillaume Garreau, James O'Sullivan, John Foxe, Jonathan Tapson, Jessica Thompson, Lakshmi Krishnan, Malcolm Slaney, Marcela Mendoza, Manu Rastogi, Mark Wang, Ernst Niebur, Paul Verschure, Psyche Loui, Daniel Whittet, Ryad Benjamin Benosman, Sadique Sheik, Stephen Deiss, Shih-Chii Liu, Sio-Hoi Ieng, Simon Kelly, Tony Lewis, Thomas Murray, Tobi Delbruck, Victor Benichoux, Victor Minces, Vikram Ramanarayanan, Andre van Schaik, Xavier Lagorce, Yves Boubenec, Yulia Sandamirskaya

Organizers: Shihab Shamma (Univ. of Maryland), Malcolm Slaney (Microsoft), Barbara Shinn-Cunningham (Boston University), Edmund Lalor (Trinity College, Dublin)


This workgroup will look at priming: the subtle effect by which our expectations shape how we perceive the world around us. These expectations might be built up from experience over long periods of time, or they might be set by visual or auditory stimuli arriving immediately before the current event. In either case, our expectations drive our perception, and we would like to know more about how. Our primary tool again this year is EEG. BrainVision has agreed to loan us the necessary EEG equipment again this year, and we are discussing with them which version of their equipment will best serve our purposes. This new task will allow us to investigate auditory cognition using both EEG signals and simpler experiments based on psychoacoustic responses. We would like to know how visual and auditory stimuli affect our perception of sounds. Priming operates at many different time scales, from long-term expectations about speech, language, and even music, to short-term contextual information that shapes our current environment and perception. We want to know how these expectations are formed, stored, and used to modify our perceptions.

This workgroup aims to measure neuronal signals that reflect the top-down state of an individual brain. Specifically, we seek to develop reliable on-line decoding algorithms that extract from the EEG signal the sensory-cortical responses corresponding to a single auditory source among many in a complex scene, or to an expected music or speech signal. The goal is to understand how the perception and coding of such complex signals are represented and shaped by top-down cognitive functions such as attention and recall. The scientific approaches needed are highly interdisciplinary, spanning the development of signal-analysis algorithms and models of cortical function as well as experimental EEG recordings during performance of challenging psychoacoustic tasks. While the center of our Telluride work will revolve around EEG measurements, we also hope in this project to investigate simple psychoacoustic experiments that will allow more people to participate in the overall project.
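One common way to frame this kind of attention decoding is linear stimulus reconstruction: a regression decoder is trained to map multi-channel EEG back to the envelope of the attended speech, and at test time the reconstruction is correlated with each candidate source; the source with the higher correlation is taken to be the attended one. The sketch below illustrates the idea on purely simulated data; the sample rate, channel count, mixing weights, and noise level are illustrative assumptions, not details of our actual setup or analysis pipeline.

```python
# Illustrative sketch of envelope-based attention decoding from EEG.
# All signals are simulated; parameters below are assumptions for the demo.
import numpy as np

rng = np.random.default_rng(0)
fs = 64          # assumed envelope/EEG sample rate (Hz)
n = fs * 60      # one minute of data
n_ch = 8         # simplified channel count (a real cap might have 64)

def envelope(n_samples):
    """Smoothed rectified noise as a stand-in for a speech envelope."""
    e = np.abs(rng.standard_normal(n_samples))
    kernel = np.hanning(fs // 4)
    return np.convolve(e, kernel / kernel.sum(), mode="same")

env_a, env_b = envelope(n), envelope(n)   # attended (A) and ignored (B)

# Simulated EEG: each channel mixes the attended envelope more strongly
# than the ignored one, plus heavy sensor noise.
mix_a = rng.standard_normal(n_ch)
mix_b = 0.3 * rng.standard_normal(n_ch)
eeg = (np.outer(env_a, mix_a) + np.outer(env_b, mix_b)
       + 2.0 * rng.standard_normal((n, n_ch)))

# Train a ridge-regression decoder (EEG -> attended envelope)
# on the first half of the data.
half = n // 2
X_tr, y_tr = eeg[:half], env_a[:half]
lam = 1e2     # ridge penalty, chosen arbitrarily for the demo
w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n_ch), X_tr.T @ y_tr)

# Reconstruct the envelope on held-out data and correlate it with
# both candidate sources; the larger correlation marks the attended stream.
recon = eeg[half:] @ w
r_a = np.corrcoef(recon, env_a[half:])[0, 1]
r_b = np.corrcoef(recon, env_b[half:])[0, 1]
decoded = "A" if r_a > r_b else "B"
print(f"r_attended={r_a:.3f}  r_ignored={r_b:.3f}  decoded stream: {decoded}")
```

A real pipeline would additionally include time-lagged decoder weights (the cortical response lags the stimulus by tens to hundreds of milliseconds) and cross-validated regularization, but the correlate-and-compare structure is the same.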

We conducted pilot studies with a group of invited researchers gathered at the Telluride Neuromorphic Cognition Engineering Workshop in 2012 and 2013. This earlier work focused on demonstrating the feasibility of extracting the signals to which listeners attended in a complex mixture of sounds, and also studied imagined sounds. The 2012 work is described informally at http://www.signalprocessingsociety.org/technical-committees/list/sl-tc/spl-nl/2012-11/TellurideNeuromorphicCognitionWorkshop/ and in a full paper published in 2014: James A. O'Sullivan, Alan J. Power, Nima Mesgarani, Siddharth Rajaram, John J. Foxe, Barbara G. Shinn-Cunningham, Malcolm Slaney, Shihab A. Shamma, and Edmund C. Lalor. Attentional Selection in a Cocktail Party Environment Can Be Decoded from Single-Trial EEG. Cerebral Cortex, 2014.

Potential Projects

Projects on Hold

EEG Hardware

We are grateful to BrainVision for loaning us a 64-channel EEG recorder to perform our experiments. Our setup is documented here: Our EEG Setup

Final Presentation

Our final presentation is online now at http://neuromorphs.net/nm/attachment/wiki/2014/hac14/HAC14%20Final%20Presentation%20Version%202.pptx