Invited talk at the PASCAL Workshop on Methods of Data Analysis in Computational Neuroscience and Brain Computer Interfaces, June 2007 (talk)
When considering Brain-Computer Interface (BCI) development for patients in the most severely paralysed states, there is considerable motivation to move away from BCI systems based on either motor cortex activity or visual stimuli, which together account for most current BCI research. I present the results of our recent exploration of new auditory- and tactile-stimulus-driven BCIs.
The talk includes a tutorial on the construction and interpretation of classifiers that extract spatio-temporal features from event-related potential (ERP) data. The effects and implications of whitening are discussed, and preliminary results on the effectiveness of a low-rank constraint (Tomioka and Aihara 2007) are shown.
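As a rough illustration of the whitening step mentioned above: a generic ZCA-whitening sketch in NumPy, applied to trial-by-feature data. This is my own minimal example (the function name `whitening_transform` and the regularisation term `eps` are assumptions), not the specific pipeline used in the talk.

```python
import numpy as np

def whitening_transform(X, eps=1e-10):
    """Estimate a ZCA whitening matrix from data X (trials x features).

    After applying the returned matrix W to mean-centred data, the
    features are decorrelated and have (approximately) unit variance.
    The small eps regularises near-zero eigenvalues.
    """
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance
    evals, evecs = np.linalg.eigh(cov)       # eigendecomposition (symmetric)
    # ZCA: rotate to eigenbasis, rescale, rotate back
    W = evecs @ np.diag(1.0 / np.sqrt(evals + eps)) @ evecs.T
    return W

# Toy demonstration on artificially correlated features
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8)) @ rng.normal(size=(8, 8))  # correlated columns
W = whitening_transform(X)
Xw = (X - X.mean(axis=0)) @ W
print(np.allclose(np.cov(Xw, rowvar=False), np.eye(8), atol=1e-6))
```

In an ERP-classification setting, whitening of this kind is typically fitted on training trials and applied to both training and test data before a linear classifier is trained.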