Paper Title

Introducing Block-Toeplitz Covariance Matrices to Remaster Linear Discriminant Analysis for Event-related Potential Brain-computer Interfaces

Paper Authors

Jan Sosulski, Michael Tangermann

Paper Abstract

Covariance matrices of noisy multichannel electroencephalogram time series data are hard to estimate due to high dimensionality. In brain-computer interfaces (BCI) based on event-related potentials and a linear discriminant analysis (LDA) for classification, the state of the art to address this problem is by shrinkage regularization. We propose a novel idea to tackle this problem by enforcing a block-Toeplitz structure for the covariance matrix of the LDA, which implements an assumption of signal stationarity in short time windows for each channel. On data of 213 subjects collected under 13 event-related potential BCI protocols, the resulting 'ToeplitzLDA' significantly increases the binary classification performance compared to shrinkage regularized LDA (up to 6 AUC points) and Riemannian classification approaches (up to 2 AUC points). This translates to greatly improved application level performances, as exemplified on data recorded during an unsupervised visual speller application, where spelling errors could be reduced by 81% on average for 25 subjects. Aside from lower memory and time complexity for LDA training, ToeplitzLDA proved to be almost invariant even to a twenty-fold time dimensionality enlargement, which reduces the need of expert knowledge regarding feature extraction.
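The core idea of the abstract, enforcing a block-Toeplitz structure on the covariance matrix so that channel-by-channel blocks depend only on the time lag between them, can be illustrated with a small sketch. The snippet below is an illustrative reconstruction, not the authors' ToeplitzLDA implementation: it estimates a sample covariance over flattened (channel × time) features and then averages all same-lag channel blocks, which is one simple way to impose the stationarity assumption described above. The feature ordering (channel-fastest) and function name are assumptions for the example.

```python
import numpy as np

def block_toeplitz_cov(X, n_channels, n_times):
    """Covariance estimate with block-Toeplitz structure enforced.

    X: (n_samples, n_channels * n_times) feature matrix, assumed to be
    ordered channel-fastest (all channels at time 0, then time 1, ...).
    Under short-time stationarity, the (C x C) cross-channel block between
    time points t1 and t2 depends only on the lag t2 - t1, so we average
    all blocks sharing the same lag. Illustrative sketch only.
    """
    C, T = n_channels, n_times
    S = np.cov(X, rowvar=False)          # plain sample covariance (CT x CT)
    S_bt = np.zeros_like(S)
    for lag in range(T):
        # average every C x C block lying on this block-diagonal
        blocks = [S[t * C:(t + 1) * C, (t + lag) * C:(t + lag + 1) * C]
                  for t in range(T - lag)]
        B = np.mean(blocks, axis=0)
        for t in range(T - lag):
            S_bt[t * C:(t + 1) * C, (t + lag) * C:(t + lag + 1) * C] = B
            S_bt[(t + lag) * C:(t + lag + 1) * C, t * C:(t + 1) * C] = B.T
    return S_bt
```

Because each block-diagonal is estimated from up to T times as many block samples as in the unstructured case, this structured estimate needs far fewer parameters, which is consistent with the abstract's point that ToeplitzLDA stays robust even when the time dimensionality is enlarged twenty-fold.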
